HK1179374B - Optical touchpad for touch and gesture recognition - Google Patents
Optical touchpad for touch and gesture recognition
- Publication number
- HK1179374B (application HK13106160.6A)
- Authority
- HK
- Hong Kong
- Prior art keywords
- interface
- light
- touch
- prism
- image sensor
Abstract
The subject application relates to an optical touchpad for touch and gesture recognition. The optical touchpad includes a prism having a four-sided cross section with a light entry interface, a light exit interface, a touch interface, and a back interface substantially parallel to and spaced apart from the touch interface. Collimated light enters the prism through the light entry interface, is reflected from the touch interface by total internal reflection, and exits the prism through the light exit interface. A first image sensor detects the collimated light exiting from the light exit interface, and a second image sensor detects the image of an object positioned over the touch interface.
Description
Technical Field
The present invention relates generally to optical devices, and particularly, but not exclusively, to optical touch pads for touch and gesture recognition.
Background
Many kinds of input devices are used with electronic devices such as computers. The mouse is the most common pointing device used with computers, but the touchpad is also common, and almost all laptop computers are currently equipped with one. A touchpad detects the position at which a user's finger touches it, and uses the detected position to control a cursor on a computer screen and to select commands for the computer.
Two types of electronic touchpads (resistive and capacitive) are the most common, but optical fingerprint sensors have also recently become available and are often used as security keys. For example, a fingerprint sensor may be used to unlock a computer so that the computer may be operated, or even to unlock a physical door to access a building or room.
Optical gesture recognizers have also recently become available. Gesture recognizers use cameras to detect objects such as hands. The detected image of the hand is then analyzed by a computer or processor to recognize the gesture made by the hand. The detected gesture may be used, for example, to select a command in a computer and/or video game.
Disclosure of Invention
One aspect of the present invention relates to an optical touch pad, comprising: a prism having a four-sided cross-section including a light entry interface, a light exit interface, a touch interface, and a back interface substantially parallel to and spaced apart from the touch interface; a source of collimated light, wherein the collimated light enters the prism via the entry interface, reflects from the touch interface by total internal reflection, and exits the prism via the light exit interface; a first image sensor for detecting the collimated light exiting from the light exit interface; a lens for forming an image of an object positioned over the touch interface via the touch interface and the back interface; and a second image sensor optically coupled to the lens for detecting the image of the object.
Another aspect of the invention relates to a method for fingertip and gesture imaging, the method comprising: directing collimated light into a prism having a four-sided cross-section, the prism comprising a light entry interface, a light exit interface, a touch interface, and a backside interface substantially parallel to and spaced apart from the touch interface, wherein the collimated light enters the prism via the light entry interface, reflects from the touch interface by total internal reflection, and exits the prism via the light exit interface; imaging the collimated light exiting the light exit interface to obtain an image of the touch interface; and imaging an object positioned over the touch interface.
Yet another aspect of the present invention relates to a method for performing multi-touch using an optical touch pad, the method comprising: capturing a first image frame at a first time, the first image frame comprising one or more touch locations; capturing a second image frame at a second, consecutive time, the second image frame comprising a same number of touch locations as the first image frame; correlating the first image frame with the second image frame to produce correlation peak locations; and determining the location of each finger touch by separating the correlation peak locations, wherein each correlation peak corresponds to a touch location.
Yet another aspect of the present invention relates to an optical touch pad, comprising: a prism having a light entry interface, a light exit interface, and a touch interface; a source of collimated light, wherein the collimated light enters the prism via the light entry interface, reflects from the touch interface by total internal reflection, and exits the prism via the light exit interface; a first image sensor for detecting the collimated light exiting from the light exit interface; a lens positioned adjacent to the prism for forming an image of an object; and a second image sensor optically coupled to the lens for detecting the image of the object, wherein the image of the object is not formed via the prism.
Drawings
Non-limiting and non-exhaustive embodiments of the present invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
FIG. 1 is a cross-sectional view of an embodiment of a prism illustrating total internal reflection.
FIG. 2 is a cross-sectional view of an embodiment of a prism illustrating frustrated total internal reflection when a fingertip touches the prism.
FIG. 3 is a schematic diagram of an embodiment of an optical touch detector and/or fingerprint sensor.
FIG. 4 is a block diagram of an embodiment of an optical sensing system.
FIG. 5 is a schematic diagram of an embodiment of fingertip positions being sensed by an optical touch detector.
FIG. 6 is a schematic diagram of an embodiment of a cursor position on a computer screen translated from a fingertip position on an optical touch detector such as that shown in FIG. 5.
FIG. 7 is a schematic diagram of an embodiment of a pair of fingertips sensed by an optical touch detector.
FIG. 8 is a schematic diagram of an embodiment of multiple cursor positions on a computer screen translated from fingertip positions on an optical touch detector such as that shown in FIG. 7.
FIG. 9 is a flow chart of an embodiment of a process for translating a fingertip position on an optical touchpad to a cursor position on a computer screen such as shown in FIGS. 5-6 or 7-8.
FIG. 10 is a schematic diagram of an embodiment of an optical touchpad for touch detection and gesture recognition.
FIG. 11 is a schematic diagram of an alternative embodiment of an optical touchpad for touch detection and gesture recognition.
Detailed Description
Embodiments of apparatuses, systems, and methods for optical touch and gesture recognition are described. Numerous specific details are described to provide a thorough understanding of embodiments of the invention, but one skilled in the relevant art will recognize that the invention may be practiced without one or more of the specific details, or with other methods, components, materials, etc. In some instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases "in one embodiment" or "in an embodiment" in this specification do not necessarily all refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Similar to commonly used electronic touchpads, optical touch sensors sense one or more fingertips touching the surface of the sensor; that is, there is typically physical contact between a finger and the sensor. An optical touch sensor can sense the position of a fingertip on the surface and, with sufficient resolution, can also detect the fingerprint of the fingertip. A gesture recognizer, by contrast, uses a camera to image the gesturing hand, and the hand cannot be too close to the camera. Thus, fingerprint sensors may be used as optical touchpads, but such optical touchpads cannot be used for gesture recognition, since the gesturing hand does not touch the touchpad. The following describes an optical device that can be used as both an optical touchpad and a gesture recognizer.
Fig. 1 illustrates the physical principle of total internal reflection in the prism 102. Prism 102 has a triangular cross-section, is made of a material having a refractive index n2, and is surrounded by a medium having a refractive index n1 different from n2. Due to its triangular cross-section, prism 102 includes three interfaces with the surrounding medium: surface 104, surface 106, and surface 108. Collimated light 110 is normally incident at interface 104, such that collimated light 110 passes through interface 104 and enters prism 102 substantially unaltered. Collimated light 110 is incident at the interface 106 at an incident angle θi, e.g., at points 114, 116, 118, 120, etc., wherein the angle of incidence θi is greater than the critical angle θc at the interface 106. The critical angle θc is defined as follows:

θc = arcsin(n1 / n2)    equation (1)

In the example where the prism 102 is made of glass and surrounded by air, n2 = 1.5 (glass) and n1 = 1 (air), which means the critical angle θc = 41.81 degrees. If the angle of incidence θi is greater than the critical angle, that is, if the condition θi > θc is satisfied, the collimated light 110 will be reflected by the interface 106 rather than being transmitted into the surrounding medium via the interface 106. The reflected light 112 remains collimated and has a reflection angle θr that, according to the law of reflection, is equal to the angle of incidence θi (θr = θi). The reflected collimated light 112 is normal to the interface 108 of the prism 102, and thus the collimated light 112 exits the prism 102 unchanged via the interface 108.
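The glass/air example above can be checked with a short calculation. The snippet below is an illustrative sketch, not part of the patent; it simply evaluates equation (1):

```python
import math

def critical_angle(n1: float, n2: float) -> float:
    """Critical angle (in degrees) for total internal reflection at an
    interface from a dense medium (index n2) into a rarer medium (index n1),
    per equation (1): theta_c = arcsin(n1 / n2)."""
    if n2 <= n1:
        raise ValueError("total internal reflection requires n2 > n1")
    return math.degrees(math.asin(n1 / n2))

# Glass prism (n2 = 1.5) surrounded by air (n1 = 1), as in the example:
theta_c = critical_angle(1.0, 1.5)
print(f"critical angle = {theta_c:.2f} degrees")  # prints: critical angle = 41.81 degrees
```

Any incidence angle at interface 106 steeper than this value keeps the collimated light inside the prism.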
Fig. 2 illustrates the principle of frustrated total internal reflection (FTIR) in the prism 102 when a fingertip touches the prism. As in FIG. 1, prism 102 has a triangular cross-section, has a refractive index n2, and is surrounded by a medium, e.g., air, having a refractive index n1 different from n2. Collimated light 110 is normally incident at interface 104, such that collimated light 110 enters prism 102 substantially unchanged. After entering prism 102, collimated light 110, or a portion thereof, reflects from interface 106 and exits prism 102 unaltered via interface 108. In other embodiments, the collimated light 110 and the reflected collimated light 112 are not necessarily normal to the interfaces 104 and 108, respectively. In these embodiments, the collimated light is refracted as it enters and exits the prism, provided that the total internal reflection condition (θi > θc) is still maintained at the interface 106.
The tip of finger 202 may touch interface 106 of prism 102. Typically, a fingertip includes both ridges and valleys, which together form a person's fingerprint. When the tip of finger 202 is pressed against interface 106, the ridges at the tip of finger 202 physically touch points 114 and 118 of interface 106. Due to this contact, there is no longer a prism-air interface at points 114 and 118. Therefore, the critical angle given by equation (1) is no longer defined, and the total internal reflection condition (θi > θc) may no longer be satisfied. Because the total internal reflection condition cannot be met, the incident collimated light 110 is not reflected at points 114 and 118 (where the fingerprint ridges contact interface 106) but is instead scattered and/or absorbed by finger 202. Even if there is a thin air gap between the prism and the ridges of the fingertip, some or all of the light from inside the prism tunnels through the thin air gap into the fingertip, so the energy of the reflected light is reduced or eliminated; this is known as frustrated total internal reflection. Total internal reflection is considered frustrated because light can now pass into the contacting material (typically the skin), so the internal reflection at the interface 106 is no longer total. In contrast, the valleys of the tip of finger 202 are not in contact with interface 106, which means that there is still a prism-air interface at points 116 and 120. The incident collimated light 110 is therefore reflected at points 116 and 120, just as the collimated light 112 would be if no finger 202 were on the prism 102.
FIG. 3 shows an embodiment of a touch sensor 300 that relies on frustrated total internal reflection in a prism, such as prism 102. Sensor 300 may include a source of collimated light comprising a light source 302 and a lens 304, the lens 304 collimating light emitted by light source 302 into collimated light beam 110. The image sensor 306 may detect the collimated light 112 reflected from the interface 106. The detected reflected collimated light 112 indicates the presence of the tip of the finger 202 on the interface 106. In embodiments where image sensor 306 has sufficient resolution, image sensor 306 may also detect and image the fingerprint of finger 202 as light portions corresponding to the valleys of the fingerprint and dark portions corresponding to the ridges of the fingerprint. A lens 308 may be included to form an image of the fingertip or fingerprint on the interface 106 onto the image sensor 306.
Fig. 4 illustrates an embodiment of an image sensing system 400, which includes an image sensor 402. A processor or computer 404 is coupled to the image sensor 402 and a display unit, such as a computer screen 406, is in turn coupled to the computer 404. In one embodiment of the system 400, the image sensor 402 may be the image sensor 306 of the touch sensor 300, but in other embodiments, the image sensor 402 may be one or both of the image sensors 1032 and 1038 of the touch and gesture recognition system 1000 (see FIG. 10).
Fig. 5-6 collectively illustrate an embodiment of an application of optical touch sensor 300 for detecting the position of a single fingertip and/or fingerprint. In FIG. 5, at time t1, the image sensor 306 detects an image frame 502 having a fingerprint pattern 506 at a first location in the image frame. In one embodiment including a security feature, fingerprint pattern 506 may have been previously registered as a legitimate input fingerprint, and the fingerprint pattern first detected before time t1 may also be used as a security key to unlock the computer. Alternatively, the fingerprint pattern 506 may simply be recognized as any fingerprint input before time t1. At time t2, the image sensor 306 detects an image frame 504 having a fingerprint pattern 508, the fingerprint pattern 508 being the same as the fingerprint pattern 506 but shifted to a second location within the image frame.
Various computational methods may be used to find the second location relative to the first location. In one embodiment, the first location and the second location may be determined by correlating frame 502 with frame 504. From the correlation, a movement from the first fingerprint position 506 to the second fingerprint position 508 may be calculated. The correlation C (p, q) of two functions f (x, y) and g (x, y) can be defined as:
C(p, q) = ∫∫ f(x, y) g(x − p, y − q) dx dy    equation (2)
Where f(x, y) is the first image frame 502, g(x, y) is the second image frame 504, and p and q are the x and y movements of the fingerprint position between frame 502 and frame 504. Using this correlation, the correlation peak location (p, q) can be defined as the location where the correlation C(p, q) shows a peak. The correlation peak position (p, q) corresponds to the change from the first fingerprint position 506 to the second fingerprint position 508. Thus, as shown in FIG. 6, at times t1 and t2, respectively, computer screens 602 and 604 will display a cursor that moves from location 606 to location 608. The position of the cursor may be used to select commands for the computer. The correlation of frames, the generation and display of cursors, and other calculations may be performed by a computer or processor (see, e.g., FIG. 4), although other calculation methods may also be used.
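The correlation of equation (2) can be evaluated efficiently in the frequency domain. The Python sketch below is illustrative only and is not the patent's implementation: it assumes equal-size frames containing a single moving fingerprint pattern, uses circular (FFT-based) correlation, and the function name is hypothetical:

```python
import numpy as np

def touch_shift(frame1: np.ndarray, frame2: np.ndarray):
    """Estimate the (p, q) movement of a single fingerprint pattern between
    two consecutive image frames by locating the peak of their
    cross-correlation (equation (2)), computed via the FFT."""
    F = np.fft.fft2(frame1)
    G = np.fft.fft2(frame2)
    # Circular cross-correlation; its peak sits at the shift of frame2
    # relative to frame1
    corr = np.fft.ifft2(G * np.conj(F)).real
    q, p = np.unravel_index(np.argmax(corr), corr.shape)  # (row, col) peak
    h, w = frame1.shape
    # Convert wrapped FFT indices to signed shifts
    if p > w // 2:
        p -= w
    if q > h // 2:
        q -= h
    return int(p), int(q)  # x and y movement of the fingerprint
```

For example, a frame whose fingerprint pattern has moved 5 pixels in x and 3 pixels in y between captures yields (p, q) = (5, 3), which a processor could then map to a cursor movement on the screen.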
Fig. 7-8 collectively illustrate an embodiment of an application of an optical touch sensor 300 for detecting the locations of multiple fingerprints. FIG. 7 shows two fingers touching the touchpad simultaneously. At time t1, the image sensor 306 detects a frame 702 having a fingerprint pattern 706 at a first location and a fingerprint pattern 708 at a second location. At time t2, image sensor 306 detects a frame 704, which has a fingerprint pattern 710 that is the same as fingerprint pattern 706 but shifted to a third location, and a fingerprint pattern 712 that is the same as fingerprint pattern 708 but shifted to a fourth location.
In this embodiment, correlation may also be used to determine the changes in the locations of the fingerprints. Correlating frame 702 with frame 704, for example using equation (2), produces two correlation peaks. One correlation peak position (p1, q1) corresponds to the change of the fingerprint pattern 706 from the first position to the third position, and the other correlation peak position (p2, q2) corresponds to the change of the fingerprint pattern 708 from the second position to the fourth position. Thus, as shown in FIG. 8, at times t1 and t2, respectively, the computer screens 802 and 804 will display independently moving cursors 806 and 808. The cursor 806 may represent the fingerprint pattern 706 and the cursor 808 may represent the fingerprint pattern 708. The correlation of frames, the generation and display of cursors, and other calculations may be performed by a computer or processor (see, e.g., FIG. 4), although other calculation methods may also be used.
FIG. 9 illustrates an embodiment of a process 900 for translating a location on a sensor to a cursor location on a computer screen, as illustrated in FIGS. 5-8. At block 902, a processor, such as processor 404 (see FIG. 4), calculates the correlation of two consecutive frames detected by the image sensor at times t1 and t2. At block 904, the processor identifies and separates the calculated correlation peak locations; the number of correlation peaks corresponds to the number of independent touches on the touchpad. At block 906, the processor 404 generates one or more cursors on the computer screen 406 (see FIG. 4), with one cursor corresponding to each correlation peak location.
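The multi-touch case can be sketched in the same style. The function below, an illustrative sketch rather than the patent's implementation, correlates two consecutive frames and then repeatedly takes the strongest remaining correlation peak and suppresses its neighbourhood, yielding one peak per independent touch; `num_touches` and `min_separation` are hypothetical parameters introduced for illustration:

```python
import numpy as np

def correlation_peaks(frame1: np.ndarray, frame2: np.ndarray,
                      num_touches: int, min_separation: int = 4):
    """Sketch of process 900: correlate two consecutive image frames and
    separate the strongest correlation peaks, one per independent touch.
    Returns a list of signed (p, q) movements, one per touch."""
    h, w = frame1.shape
    # Circular cross-correlation of the two frames via the FFT
    corr = np.fft.ifft2(np.fft.fft2(frame2) *
                        np.conj(np.fft.fft2(frame1))).real
    work = corr.copy()
    peaks = []
    for _ in range(num_touches):
        q, p = np.unravel_index(np.argmax(work), work.shape)
        # Convert wrapped FFT indices to signed (x, y) movements
        peaks.append((int(p) - w if p > w // 2 else int(p),
                      int(q) - h if q > h // 2 else int(q)))
        # Suppress a (wrapping) neighbourhood around the found peak so the
        # next iteration locates a different touch's correlation peak
        qs = [(q + dq) % h for dq in range(-min_separation, min_separation + 1)]
        ps = [(p + dp) % w for dp in range(-min_separation, min_separation + 1)]
        work[np.ix_(qs, ps)] = -np.inf
    return peaks
```

Each returned (p, q) pair would then drive one of the independently moving cursors of FIG. 8.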
FIG. 10 illustrates an embodiment of an optical touchpad 1000 that is capable of performing both touch detection and gesture recognition. The touch pad 1000 includes a prism 1002 having a four-sided cross section. Prism 1002 is coupled to both a source of collimated light that directs collimated light 1024 into the prism and an image sensor 1032 that receives the light exiting prism 1002. An additional image sensor 1038 is coupled to the prism 1002 to capture an image of the hand 1014 positioned over the prism.
Prism 1002 has a four-sided cross-section that includes four interfaces: a light entry interface 1004, a touch interface 1008, a light exit interface 1006, and a backside interface 1010. The touch interface 1008 and the back interface 1010 are spaced apart from and substantially parallel to each other. The optical axis 1012 of the prism is substantially normal to the touch interface 1008 and the back interface 1010. The light entry interface 1004 and the light exit interface 1006 connect the touch interface and the backside interface and are positioned at an angle relative to the optical axis 1012 such that the cross-sectional shape of the prism 1002 is substantially trapezoidal. However, in other embodiments, the prism 1002 may have a different quadrilateral cross-sectional shape or may have a non-polygonal cross-sectional shape.
The light source 1020 is optically coupled to an optical element, such as a lens 1022, the lens 1022 may shape light from the light source 1020 into a beam of collimated light 1024 that is directed toward the light entry interface 1004. In one embodiment, light source 1020 emits light in the infrared wavelength range. In other embodiments, however, the light source 1020 may emit light in other wavelength ranges (e.g., visible or ultraviolet ranges). In the illustrated embodiment, the optical element 1022 is a refractive lens, but in other embodiments, the optical element may be a reflective or diffractive optical element.
Image sensor 1032 is positioned to capture one or more images of the touch interface based on light reflected from the touch interface 1008. The collimated light 1024 reflects from the touch interface 1008 and becomes reflected collimated light 1026. The reflected collimated light 1026 exits the prism 1002 via the light exiting interface 1006. Optical element 1028 is positioned in the path of reflected collimated light 1026 to focus an image of touch interface 1008 onto image sensor 1032. In the illustrated embodiment, the optical element 1028 is a refractive lens, but in other embodiments, the optical element may be a reflective or diffractive optical element.
An additional image sensor 1038 is positioned along the optical axis 1012 to capture one or more images of an object, such as a hand 1014, positioned over the touch interface 1008. In the illustrated embodiment, the hand 1014 is illuminated by a light source 1018, but in other embodiments the hand 1014 may be illuminated by ambient light. Optical element 1034 is positioned along optical axis 1012 to focus an image of hand 1014 onto image sensor 1038. In the illustrated embodiment, optical element 1034 is a refractive lens, but in other embodiments, the optical element may be a reflective or diffractive optical element.
To enhance the signal-to-noise ratio of the signals detected by image sensors 1032 and 1038, in one embodiment, light source 1020 and light source 1018 may have different wavelengths. For example, the light source 1020 may emit infrared light in a wavelength range λ1 and the light source 1018 may emit visible light in a wavelength range λ2, wherein λ1 and λ2 do not overlap. Thus, in one embodiment of the optical touchpad 1000, a filter 1030 that passes the wavelength range λ1 may be disposed in front of the image sensor 1032, before or after the optical element 1028, and a filter 1036 that passes λ2 may be positioned in front of image sensor 1038, before or after optical element 1034. In embodiments where the hand 1014 is illuminated by ambient light rather than light from the light source 1018, the filters 1030 and 1036 may be adjusted accordingly.
The touch sensor 1000 operates as both a touch detector and a gesture recognizer. During touch detection, light emitted by the light source 1020 is collimated by the optical element 1022 into collimated light 1024, which is then directed into the prism 1002 via the light entry interface 1004. In the illustrated embodiment, the collimated light 1024 is incident on the light entry interface 1004 at an angle of incidence of substantially 0 degrees (i.e., substantially normal to the entry interface 1004), but in other embodiments, the collimated light 1024 may be incident on the light entry interface 1004 at other angles.
After entering the prism 1002, the collimated light 1024 reflects from the touch interface 1008 and becomes reflected collimated light 1026. In the case where no fingertip is in contact with the touch surface 1008, the collimated light 1024 is fully reflected by total internal reflection at the interface 1008, but in the case where there is fingertip contact with the touch interface 1008, then the collimated light 1024 is only partially reflected according to frustrated total internal reflection. The reflected collimated light 1026 exits the prism 1002 via the light exiting interface 1006. In the illustrated embodiment, the reflected collimated light 1026 is incident on the light exit interface 1006 at an angle of incidence of substantially 0 degrees (i.e., substantially normal to the exit interface 1006), but in other embodiments, the reflected collimated light 1026 may be incident on the light exit interface 1006 at other angles.
After exiting prism 1002, the reflected collimated light 1026 is focused by optical element 1028, filtered by filter 1030 (if present), and directed onto image sensor 1032 for image capture. One or more images captured by image sensor 1032 may then be processed by circuitry and logic (see fig. 4) coupled to image sensor 1032. In one embodiment, the image may be processed as described in connection with fig. 5-8, but in other embodiments other types of processing may be used.
For gesture recognition, the hand 1014 is positioned over the touch interface 1008 and illuminated by ambient light or, if present, by a light source 1018. Light 1016 from the hand 1014 enters the prism 1002 via the touch interface 1008, passes through the prism and exits via the rear interface 1010. After exiting via the backside interface 1010, the light 1016 is focused by the optical element 1034, filtered by the filter 1036 (if present), and directed onto the image sensor 1038 for image capture. One or more images captured by image sensor 1038 may be processed by circuitry and logic (see fig. 4) coupled to the image sensor to determine the gesture being made by the hand. Light 1016 from hand 1014 does not undergo total internal reflection as it propagates through prism 1002 to image sensor 1038.
FIG. 11 illustrates an alternative embodiment of an optical touchpad 1100 for touch and gesture detection and recognition. Optical touchpad 1100 shares many similarities with optical touchpad 1000. The main difference between the two is that in optical touchpad 1100 the imaging lens 1034 and the image sensor 1038 are disposed at another location, so that light 1016 from object 1014 does not pass through prism 1002. The imaging lens 1034 is positioned adjacent to the prism 1002 for forming an image of the object 1014. As a result, the back interface 1010 parallel to the touch interface 1008 is not required. In addition, the optical touchpad 1100 can optionally use an optical filter 1036 to improve the signal-to-noise ratio, but in other embodiments, the optical filter 1036 need not be used.
The above description of illustrated embodiments of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various equivalent modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize. These modifications can be made to the invention in light of the above detailed description.
The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification and the claims. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.
Claims (14)
1. An optical touch pad, comprising:
a prism having a four-sided cross-section including a light entry interface, a light exit interface, a touch interface, and a back interface substantially parallel to and spaced apart from the touch interface;
a source of collimated light, wherein the collimated light enters the prism via the light entry interface, reflects from the touch interface by total internal reflection, and exits the prism via the light exit interface;
a first image sensor for detecting the collimated light exiting from the light exit interface;
a lens for forming an image of an object positioned over the touch interface via the touch interface and the back interface, wherein the object does not touch the touch interface;
a second image sensor optically coupled to the lens for detecting the image of the object;
a first filter disposed between the light exit interface and the first image sensor, wherein the first filter passes a first range of wavelengths; and
a second filter disposed between the prism and the second image sensor, wherein the second filter passes a second range of wavelengths;
wherein the first wavelength range is infrared light and the second wavelength range is visible light.
2. The optical touch pad of claim 1, wherein the prism has a substantially trapezoidal cross section.
3. The optical touch pad of claim 1, wherein the source of collimated light comprises:
a light source; and
a collimating lens coupled to the light source.
4. The optical touch pad of claim 1, wherein the source of collimated light emits light at a first wavelength range.
5. The optical touch pad of claim 4, further comprising a second light source that emits light in a second wavelength range for illuminating the object.
6. The optical touch pad of claim 1, further comprising a second lens that images the touch interface of the prism onto the first image sensor.
7. The optical touch pad of claim 1, further comprising a first processor coupled to the first image sensor and a second processor coupled to the second image sensor.
8. A method for fingertip and gesture imaging, the method comprising:
directing collimated light into a prism having a four-sided cross-section, the prism comprising a light entry interface, a light exit interface, a touch interface, and a backside interface substantially parallel to and spaced apart from the touch interface, wherein the collimated light enters the prism via the light entry interface, reflects from the touch interface by total internal reflection, and exits the prism via the light exit interface;
imaging the collimated light exiting the light exit interface to obtain an image of the touch interface;
imaging an object positioned above the touch interface, wherein the object does not touch the touch interface;
filtering light exiting through the light exit interface using a first filter that passes a first range of wavelengths; and
filtering light from the object with a second filter, the second filter passing a second range of wavelengths;
wherein the first wavelength range is infrared light and the second wavelength range is visible light.
9. The method of claim 8, wherein the prism has a substantially trapezoidal cross-section.
10. The method of claim 8, wherein imaging the collimated light exiting from the light exit interface includes:
directing the collimated light onto a first image sensor using a first lens; and
capturing an image of the touch interface using the first image sensor.
11. The method of claim 8, wherein imaging an object positioned over the touch interface includes:
directing light from the object through the touch interface and a back interface of the prism to a second image sensor using a second lens; and
capturing an image of the object using the second image sensor.
12. The method of claim 8, wherein the collimated light is in a first wavelength range.
13. The method of claim 12, wherein the light from the object is in a second wavelength range.
14. The method of claim 8, further comprising:
capturing a first image frame at a first time, the first image frame comprising one or more touch locations;
capturing a second image frame at a second, consecutive time, the second image frame comprising a same number of touch locations as the first image frame;
calculating a correlation function of the first image frame and the second image frame to generate a correlation peak position; and
the location of each finger touch is determined by separating the correlation peak locations, where each correlation peak corresponds to a touch location.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/151,583 | 2011-06-02 | ||
| US13/151,583 US9213438B2 (en) | 2011-06-02 | 2011-06-02 | Optical touchpad for touch and gesture recognition |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| HK1179374A1 HK1179374A1 (en) | 2013-09-27 |
| HK1179374B true HK1179374B (en) | 2017-03-03 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| TWI461991B (en) | Optical touchpad for touch and gesture recognition | |
| US20090267919A1 (en) | Multi-touch position tracking apparatus and interactive system and image processing method using the same | |
| US9696853B2 (en) | Optical touch apparatus capable of detecting displacement and optical touch method thereof | |
| EP3066551B1 (en) | Multi-modal gesture based interactive system and method using one single sensing system | |
| CN102597936B (en) | Touch surface with compensated signal profile | |
| US20150078586A1 (en) | User input with fingerprint sensor | |
| TW201040850A (en) | Gesture recognition method and interactive input system employing same | |
| WO2008017077A2 (en) | Multi-touch sensing display through frustrated total internal reflection | |
| CN101582001A (en) | Touch screen, touch module and control method | |
| CN102073392A (en) | Hybrid pointing device | |
| TW201421322A (en) | Hybrid pointing device | |
| CN101776971A (en) | Multi-point touch screen device and positioning method | |
| TWI529572B (en) | Method for detecting operation object and touch device | |
| US9477348B2 (en) | Focus-based touch and hover detection | |
| US20130335334A1 (en) | Multi-dimensional image detection apparatus | |
| TW201337649A (en) | Optical input device and input detection method thereof | |
| KR101397938B1 (en) | Touch pannel and pointing detection apparatus having the pannel | |
| HK1179374B (en) | Optical touchpad for touch and gesture recognition | |
| US8878820B2 (en) | Optical touch module | |
| KR20100116267A (en) | Touch panel and touch display apparatus having the same | |
| KR20130090210A (en) | Input device | |
| TWI573043B (en) | The virtual two - dimensional positioning module of the input device | |
| US8760403B2 (en) | Hybrid human-interface device | |
| TWI488090B (en) | Optical information sampling method and touch information identification method | |
| CN114138162A (en) | Intelligent transparent office table interaction method |