CN112486344A - Judgment and display method for pen point estimated falling point - Google Patents
Judgment and display method for pen point estimated falling point
- Publication number
- CN112486344A CN112486344A CN202011636045.XA CN202011636045A CN112486344A CN 112486344 A CN112486344 A CN 112486344A CN 202011636045 A CN202011636045 A CN 202011636045A CN 112486344 A CN112486344 A CN 112486344A
- Authority
- CN
- China
- Prior art keywords
- pressure sensing
- sensing data
- data
- determining
- point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0414—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using force sensing means to determine a position
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention relates to a method for determining and displaying an estimated pen-tip drop point, comprising the following steps: acquiring pressure sensing data; identifying and determining hand pressure sensing data within the pressure sensing data; identifying and determining nib sensing-point data within the pressure sensing data; when nib sensing-point data and hand pressure sensing data are present simultaneously, determining and storing hand-to-nib relative position information; acquiring the current user's hand pressure sensing data; and, if no nib sensing-point data is currently detected, determining estimated drop-point data of the current nib drop point from the hand-to-nib relative position information. The advantage of the invention is that when a user wearing a VR headset watches a panoramic teaching video and records physical notes through the pressure sensing device, the user can still see the estimated nib drop point even though the headset fully occupies the field of view, thereby avoiding stray ink dots and keeping the paper clean.
Description
Technical Field
The invention relates to a method for determining and displaying an estimated pen-tip drop point.
Background
Patent application No. 2020107270473, titled "A note recording system linked with a panoramic video", allows a user wearing a VR headset to record both electronic and physical notes while a panoramic teaching video fully occupies the field of view, thereby deepening learning memory and improving learning efficiency.
In practice, however, that system has a drawback: before writing, the user does not know where the pen tip is. Only after the tip presses the pressure sensing pad can its position be shown in the panoramic video, but by then an ink dot has already been left on the paper, and the user may need several such dots to locate the intended writing position. Over a full viewing session with frequent note-taking, many ink dots inevitably accumulate, and too many of them noticeably spoil the cleanliness of the paper.
Disclosure of Invention
To overcome the drawback described in the background art, the invention provides a method for determining and displaying an estimated pen-tip drop point, which can display the estimated drop point in the display area of a display screen at any time.
The invention provides a method for determining and displaying an estimated pen-tip drop point, comprising the following steps: acquiring pressure sensing data, wherein the pressure sensing data is generated by a pressure sensing device sensing pressure on its surface; identifying and determining hand pressure sensing data within the pressure sensing data, wherein the hand pressure sensing data is generated by pressure applied to the pressure sensing device by the ulnar side of the palm of the hand holding the pen; identifying and determining nib sensing-point data within the pressure sensing data, wherein the nib sensing-point data is generated by pressure applied by the nib to the pressure sensing device; when the nib sensing-point data and the hand pressure sensing data are present simultaneously, determining and storing hand-to-nib relative position information; acquiring the current user's hand pressure sensing data; and, if no nib sensing-point data is currently detected, determining estimated drop-point data of the current nib drop point from the hand-to-nib relative position information.
Further, the step of determining estimated drop-point data of the current nib drop point from the relative position information is followed by: marking or flashing the image pixel at the image coordinate corresponding to the estimated drop-point data in the display area.
Further, after the step of identifying and determining hand pressure sensing data within the pressure sensing data, the method further comprises:
identifying and determining little-finger outer-knuckle pressure sensing data within the hand pressure sensing data, wherein the little-finger outer-knuckle pressure sensing data is generated by pressure applied to the pressure sensing device by the outer knuckle of the little finger;
the hand-to-nib relative position information is the relative position between the little-finger outer-knuckle pressure sensing data and the nib sensing-point data.
Preferably, the step of identifying and determining hand pressure sensing data within the pressure sensing data comprises:
determining, within the pressure sensing data, each sensing block exceeding a preset area;
determining knuckle gap lines that satisfy preset conditions within the pressure sensing data;
and determining the hand pressure sensing data as the overall sensing region formed by all sensing blocks joined together through knuckle gap lines.
Preferably, the step of identifying and determining little-finger outer-knuckle pressure sensing data within the hand pressure sensing data comprises:
determining the areas of the sensing blocks at the left and right ends of the hand pressure sensing data;
comparing the sizes of the left-end and right-end sensing blocks;
and determining the smaller of the two blocks as the little-finger outer-knuckle pressure sensing data.
The invention also proposes a processing terminal comprising a memory for storing a program and a processor for executing the program; when the program is executed by the processor, the steps of any of the methods defined above are implemented.
The advantage of the invention is that when a user wearing a VR headset watches a panoramic teaching video and records physical notes through the pressure sensing device, the user can still see the estimated nib drop point even though the headset fully occupies the field of view, thereby avoiding stray ink dots and keeping the paper clean.
Drawings
FIG. 1 illustrates a standard hand posture while writing.
FIG. 2 is a flowchart of a method for determining and displaying an estimated pen-tip drop point according to an embodiment.
Detailed Description
The invention is further described below with reference to the accompanying drawings and specific examples.
In an embodiment, referring to FIGS. 1-2, a method for determining and displaying an estimated pen-tip drop point may be applied to a data processing terminal and comprises:
S101, acquiring pressure sensing data, the pressure sensing data being generated by a pressure sensing device sensing pressure on its surface. In this step, the pressure sensing device consists of many sensing points, each capable of detecting the presence of pressure. Its parameters include physical size, number of sensing points, and sensing resolution; for ease of understanding, these can be compared with a display's screen size, pixel count, and screen resolution, respectively. Each sensing point is no larger than the contact area between the pen tip and the paper. The pressure sensing device reports the coordinates of every pressed sensing point, and these coordinates together constitute the pressure sensing data.
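As an illustration of step S101, the pressure sensing data can be modeled as a 2D grid in which each cell is one sensing point. The function below is a minimal sketch with invented names (the patent specifies no implementation); it extracts the coordinates of every pressed point from one frame of pad readings:

```python
def acquire_pressure_data(frame, threshold=0.0):
    """Return the (row, col) coordinates of every pressed sensing point.

    `frame` is a row-major 2D list of pressure readings; a value above
    `threshold` means that sensing point detects pressure.
    """
    return [(r, c)
            for r, row in enumerate(frame)
            for c, value in enumerate(row)
            if value > threshold]

# A small 4x5 pad with two pressed sensing points:
frame = [[0.0] * 5 for _ in range(4)]
frame[1][2] = 0.8   # e.g. a nib contact
frame[3][0] = 0.3   # e.g. part of a hand contact
print(acquire_pressure_data(frame))  # [(1, 2), (3, 0)]
```

All later steps (region segmentation, relative position, estimation) operate on this coordinate list.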
S102, identifying and determining hand pressure sensing data within the pressure sensing data, the hand pressure sensing data being generated by pressure applied to the pressure sensing device by the ulnar side of the palm of the hand holding the pen. The acquired pressure sensing data may come from different sources: in normal writing, the ulnar side of the pen-holding hand and the pen tip both press on the device, and some users also hold the paper down with the other hand or another object to keep it from sliding. Of these possible sources, only the hand pressure sensing data and the nib sensing-point data are relevant to estimating the drop point. The purpose of this step is therefore to identify, within the pressure sensing data, the data generated by the ulnar side of the user's pen-holding hand.
S103, identifying and determining little-finger outer-knuckle pressure sensing data within the hand pressure sensing data, this data being generated by pressure applied to the device by the outer knuckle of the little finger. During writing, the hand's contact with the paper can be further divided into the little-finger outer-knuckle contact area, the little-finger middle-knuckle contact area, the little-finger inner-knuckle contact area, and the ulnar palm contact area. While a single character is being written normally, the pen-holding posture keeps the relative position between the little-finger outer-knuckle contact area and the pen tip fixed. After the hand pressure sensing data is determined, this step therefore further isolates the little-finger outer-knuckle pressure sensing data within it.
S104, identifying and determining nib sensing-point data within the pressure sensing data, this data being generated by pressure applied by the nib to the pressure sensing device. Generally, one or more isolated sensing points lying within a preset direction and range relative to the hand pressure sensing data constitute the nib sensing-point data.
S105, when the nib sensing-point data and the hand pressure sensing data are present simultaneously, determining and storing hand-to-nib relative position information. This information is the relative position between the little-finger outer-knuckle pressure sensing data and the nib sensing-point data. Because palm sizes differ between users, and because the fingertips grip the pen barrel at a slightly different position each time the pen is picked up, the hand-to-nib relative position changes over time; it is therefore dynamic data that is updated every time nib sensing-point data is detected.
S106, acquiring the hand pressure sensing data of the current user.
S107, if no nib sensing-point data is currently detected, determining estimated drop-point data of the current nib drop point from the hand-to-nib relative position information. When hand pressure sensing data is detected but no nib sensing-point data is, the user is preparing to write; since a user wearing a VR headset cannot see the actual pen tip, the data processing terminal uses the previously stored hand-to-nib relative position information to determine the estimated drop point of the current pen tip.
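Steps S105 through S107 can be sketched as follows. The class and method names are illustrative assumptions (the patent does not specify an implementation); the stored offset is the vector from the centroid of the little-finger outer-knuckle block to the nib point, refreshed on every nib detection and re-applied when only the hand is detected:

```python
def centroid(points):
    """Mean (row, col) of a set of sensing-point coordinates."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

class DropPointEstimator:
    """Stores the knuckle-to-nib offset when both are seen (S105) and
    re-applies it to predict the drop point when only the hand is
    detected (S107)."""

    def __init__(self):
        self.offset = None  # (d_row, d_col); refreshed on every nib detection

    def update(self, knuckle_points, nib_point):
        """S105: record the current hand-to-nib relative position."""
        kr, kc = centroid(knuckle_points)
        self.offset = (nib_point[0] - kr, nib_point[1] - kc)

    def estimate(self, knuckle_points):
        """S107: predict the drop point from the hand position alone."""
        if self.offset is None:
            return None  # nib never seen yet, so no estimate is possible
        kr, kc = centroid(knuckle_points)
        return (kr + self.offset[0], kc + self.offset[1])

est = DropPointEstimator()
est.update([(10, 10), (10, 12)], nib_point=(4, 20))  # pen touched down once
print(est.estimate([(12, 14), (12, 16)]))            # (6.0, 24.0)
```

Using the centroid makes the offset robust to small changes in which knuckle cells are pressed, which fits the patent's note that the relative position must be updated dynamically.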
S108, marking or flashing the image pixel at the image coordinate corresponding to the estimated drop-point data in the display area. Once determined, the estimated drop point must be shown on the display screen of the VR headset. The screen generally contains a display area showing a simulated paper image whose resolution equals the sensing resolution of the pressure sensing device; the image pixel at the coordinate corresponding to the estimated drop-point data is marked or flashed on this image, so the user knows where the pen tip is expected to land.
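A minimal sketch of step S108, assuming (as the text states) that the simulated paper image has the same resolution as the sensing pad, so sensor coordinates map one-to-one to pixels. The function name, RGB representation, and blink mechanism are illustrative, not taken from the patent:

```python
def mark_estimated_point(image, point, blink_phase, mark=(255, 0, 0)):
    """Mark or blink the pixel at the estimated drop point.

    `image` is a row-major list of RGB tuples; `point` is the (possibly
    fractional) estimated drop-point coordinate; the caller alternates
    `blink_phase` each frame to produce a flicker.
    """
    r, c = int(round(point[0])), int(round(point[1]))
    if blink_phase:  # draw the mark only on alternate frames
        image[r][c] = mark
    return image
```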
In this embodiment, step S102 specifically comprises the following steps:
Determining, within the pressure sensing data, each sensing block that matches a preset shape and exceeds a preset area. Hand pressure sensing data has characteristic features within the pressure sensing data: it is necessarily a continuous region of substantial area with an arc-like shape.
Determining knuckle gap lines that satisfy preset conditions within the pressure sensing data. Knuckle gap lines are line-shaped areas with no detected pressure inside an otherwise pressed region; they arise because the creases at the little-finger joints and the ulnar side of the palm become concave when the hand is curled, so those creases may exert no pressure on the device.
Determining the hand pressure sensing data as the overall sensing region formed by all sensing blocks joined together through knuckle gap lines. Normal hand pressure sensing data contains several knuckle gap lines, in particular those produced when the little finger is bent; if no knuckle gap lines are present, the sensing block can be assumed not to have been produced by the user's writing hand pressing the device.
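The merging rule above can be illustrated as a flood fill in which knuckle-gap cells act as bridges between pressed blocks. All names, and the 4-neighbour adjacency, are assumptions of this sketch rather than details from the patent:

```python
from collections import deque

def merge_hand_blocks(pressed, gaps):
    """Group pressed sensing points into regions, letting knuckle-gap
    cells act as bridges so that blocks joined through a gap line form
    one overall hand region."""
    pressed = set(pressed)
    passable = pressed | set(gaps)
    seen, regions = set(), []
    for start in sorted(pressed):
        if start in seen:
            continue
        region, queue = [], deque([start])
        seen.add(start)
        while queue:
            r, c = queue.popleft()
            if (r, c) in pressed:
                region.append((r, c))  # gap cells bridge but are not kept
            # 4-neighbour adjacency (an assumption of this sketch)
            for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if nxt in passable and nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        regions.append(sorted(region))
    return regions

# Two pressed blocks separated by a one-cell knuckle gap become one region:
blocks = [(0, 0), (0, 1), (0, 3), (0, 4)]
gap = [(0, 2)]
print(merge_hand_blocks(blocks, gap))  # [[(0, 0), (0, 1), (0, 3), (0, 4)]]
```

A region with no bridging gap cells stays a single isolated block, matching the text's observation that a block without knuckle gap lines is probably not the writing hand.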
In this embodiment, step S103 specifically comprises the following steps:
Determining the areas of the sensing blocks at the left and right ends of the hand pressure sensing data.
Comparing the sizes of the left-end and right-end sensing blocks.
Determining the smaller of the two blocks as the little-finger outer-knuckle pressure sensing data.
Generally, the blocks at the two ends of the hand pressure sensing data are the little-finger outer-knuckle contact area and the ulnar palm contact area, and the latter is always larger. Comparing their areas therefore identifies the little-finger outer-knuckle pressure sensing data, and also reveals whether the user is writing with the left hand or the right hand.
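The area comparison can be sketched as below, measuring each block's area as its number of pressed sensing points. Which side implies which handedness is not stated in the patent, so the mapping in the code is an explicit assumption:

```python
def find_knuckle_block(left_block, right_block):
    """Return the smaller of the two end blocks (taken as the little-finger
    outer knuckle, since the ulnar palm contact is always larger) together
    with an inferred handedness. The side-to-handedness mapping here is an
    illustrative assumption, not stated in the patent."""
    if len(left_block) <= len(right_block):
        return left_block, "right-handed (assumed)"
    return right_block, "left-handed (assumed)"

small = [(5, 1), (5, 2)]                   # 2 sensing points
large = [(6, 8), (6, 9), (7, 8), (7, 9)]   # 4 sensing points
print(find_knuckle_block(small, large)[0])  # [(5, 1), (5, 2)]
```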
Although the present invention has been described with reference to preferred embodiments, it will be understood by those skilled in the art that the present invention is not limited to the embodiments described above, and various changes in form and details may be made within the scope of the appended claims.
Claims (6)
1. A method for determining and displaying an estimated pen-tip drop point, characterized by comprising the following steps:
acquiring pressure sensing data, wherein the pressure sensing data is generated by a pressure sensing device sensing pressure on its surface;
identifying and determining hand pressure sensing data within the pressure sensing data, wherein the hand pressure sensing data is generated by pressure applied to the pressure sensing device by the ulnar side of the palm of the hand holding the pen;
identifying and determining nib sensing-point data within the pressure sensing data, wherein the nib sensing-point data is generated by pressure applied by the nib to the pressure sensing device;
when the nib sensing-point data and the hand pressure sensing data are present simultaneously, determining and storing hand-to-nib relative position information;
acquiring the current user's hand pressure sensing data;
and, if no nib sensing-point data is currently detected, determining estimated drop-point data of the current nib drop point from the hand-to-nib relative position information.
2. The method for determining and displaying an estimated pen-tip drop point according to claim 1, characterized in that the step of determining estimated drop-point data of the current nib drop point from the relative position information is followed by: marking or flashing the image pixel at the image coordinate corresponding to the estimated drop-point data in the display area.
3. The method for determining and displaying an estimated pen-tip drop point according to claim 1, characterized in that, after the step of identifying and determining hand pressure sensing data within the pressure sensing data, the method further comprises: identifying and determining little-finger outer-knuckle pressure sensing data within the hand pressure sensing data, wherein the little-finger outer-knuckle pressure sensing data is generated by pressure applied to the pressure sensing device by the outer knuckle of the little finger;
the hand-to-nib relative position information is the relative position between the little-finger outer-knuckle pressure sensing data and the nib sensing-point data.
4. The method for determining and displaying an estimated pen-tip drop point according to claim 1, characterized in that the step of identifying and determining hand pressure sensing data within the pressure sensing data comprises:
determining, within the pressure sensing data, each sensing block exceeding a preset area;
determining knuckle gap lines that satisfy preset conditions within the pressure sensing data;
and determining the hand pressure sensing data as the overall sensing region formed by all sensing blocks joined together through knuckle gap lines.
5. The method for determining and displaying an estimated pen-tip drop point according to claim 4, characterized in that the step of identifying and determining little-finger outer-knuckle pressure sensing data within the hand pressure sensing data comprises:
determining the areas of the sensing blocks at the left and right ends of the hand pressure sensing data;
comparing the sizes of the left-end and right-end sensing blocks;
and determining the smaller of the two blocks as the little-finger outer-knuckle pressure sensing data.
6. A processing terminal comprising a memory for storing a program and a processor for executing the program, characterized in that the program, when executed by the processor, implements the steps of the method of any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011636045.XA CN112486344B (en) | 2020-12-31 | 2020-12-31 | Judgment and display method for pen point estimated falling point |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112486344A true CN112486344A (en) | 2021-03-12 |
CN112486344B CN112486344B (en) | 2022-05-27 |
Family
ID=74916043
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011636045.XA Active CN112486344B (en) | 2020-12-31 | 2020-12-31 | Judgment and display method for pen point estimated falling point |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112486344B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113762045A (en) * | 2021-05-06 | 2021-12-07 | 腾讯科技(深圳)有限公司 | Click-to-read position identification method and device, click-to-read equipment and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6434581B1 (en) * | 1991-03-20 | 2002-08-13 | Microsoft Corporation | Script character processing method for interactively adjusting space between writing element |
JP2006171854A (en) * | 2004-12-13 | 2006-06-29 | Canon Inc | Coordinate input device, coordinate input method, program, and storage medium |
CN104076951A (en) * | 2013-03-25 | 2014-10-01 | 崔伟 | Hand cursor system, finger lock, finger action detecting method and gesture detection method |
US20150338949A1 (en) * | 2014-05-21 | 2015-11-26 | Apple Inc. | Stylus tilt and orientation estimation from touch sensor panel images |
US20170017323A1 (en) * | 2015-07-17 | 2017-01-19 | Osterhout Group, Inc. | External user interface for head worn computing |
CN110377214A (en) * | 2019-06-04 | 2019-10-25 | 昆山龙腾光电有限公司 | Method for detecting angle and angle detection device |
Non-Patent Citations (1)
Title |
---|
LIU Yang et al.: "Design of a Handwriting Dynamic Feature Acquisition System", Chinese Journal of Scientific Instrument (《仪器仪表学报》) * |
Also Published As
Publication number | Publication date |
---|---|
CN112486344B (en) | 2022-05-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5347589A (en) | System and method for displaying handwriting parameters for handwriting verification | |
US5627348A (en) | Electronic stylus with writing feel | |
CN103268166B (en) | The former handwriting information collection of handwriting input device and display packing | |
EP1457870A3 (en) | System and method for differentiating between pointers used to contact touch surface | |
CN108762579B (en) | Method and system for generating handwriting of multiple pressure-sensitive pens and related components | |
CN104238768A (en) | Handwriting input apparatus and control method | |
CN112486344B (en) | Judgment and display method for pen point estimated falling point | |
CN107943324A (en) | A kind of man-machine interactive system and method based on writing | |
US12272118B2 (en) | Classifying pressure inputs | |
CN104156163A (en) | Method for displaying handwriting in PDF file | |
CN106843650A (en) | The touch identification method and system of a kind of touch screen integrated machine | |
CN111782131A (en) | Pen point implementation method, device, equipment and readable storage medium | |
JP5861818B2 (en) | Electronic writing device | |
CN101364271B (en) | Method for recognizing hand-written Chinese character strokes and recognition device | |
CN113703577B (en) | Drawing method, drawing device, computer equipment and storage medium | |
CN104951811B (en) | Row style line recognition methods and device applied to brush writing | |
US7542607B2 (en) | Digital pen and paper | |
JP2003015815A (en) | Handwriting input information acquisition method and handwriting input device | |
US9996256B2 (en) | Method for erasing electronic handwriting on a clipboard | |
US12429961B2 (en) | Writing instrument | |
CN110348427A (en) | A kind of face hand-written input system and method based on Image Acquisition | |
CN106959772B (en) | Input method and electronic equipment | |
JP4727698B2 (en) | Writing input device | |
CN114691009B (en) | Method, system, device and medium for always displaying handwriting data in forward direction | |
US20220404937A1 (en) | Maintaining Pressure Input Paths |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||