
US20130342442A1 - Input system - Google Patents

Input system

Info

Publication number
US20130342442A1
Authority
US
United States
Prior art keywords
command signal
input system
gesture
signal
detection unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/907,182
Other languages
English (en)
Inventor
Yu-Hao Huang
Yi-Fang Lee
Ming-Tsan Kao
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Assigned to PixArt Imaging Incorporation, R.O.C. Assignment of assignors interest (see document for details). Assignors: HUANG, YU-HAO; KAO, MING-TSAN; LEE, YI-FANG
Publication of US20130342442A1
Priority to US15/168,825 (now US10372224B2)
Priority to US16/449,366 (now US10606366B2)
Priority to US16/790,783 (now US10824241B2)
Legal status: Abandoned

Classifications

    • G — PHYSICS
    • G06 — COMPUTING OR CALCULATING; COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 — Detection arrangements using opto-electronic means
    • G06F 3/0325 — Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 2203/00 — Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 — Indexing scheme relating to G06F3/048
    • G06F 2203/04806 — Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • the present invention relates to an input system; particularly, to an input system capable of detecting different gestures so as to generate a combination command.
  • the user usually either directly touches the target icon displayed on the touch screen or selects it with an input device.
  • keyboards, mice or touch panels are typical tools for such selections in desktop PCs or tablet PCs.
  • a user may also perform selections in a non-contact manner, wherein the device senses upward, downward, leftward and rightward movements of the user's hand, as well as movement of the user's hand approaching the device screen, for selecting a function and confirming the selection.
  • An objective of the present invention is to provide an input system which is capable of detecting different gestures so as to generate a combination command.
  • the present invention provides an input system comprising a first gesture detection unit and a second gesture detection unit.
  • the first gesture detection unit has a first sensible range and includes a first light emitting device, a first light sensing device and a first processing unit.
  • the first light emitting device is for emitting a first light beam.
  • the first light sensing device is for receiving the first light beam which is reflected by a first motion trajectory generated by a user and outputting a first image signal accordingly.
  • the first processing unit is for processing the first image signal and outputting a first command signal accordingly.
  • the second gesture detection unit has a second sensible range and includes a second light emitting device, a second light sensing device and a second processing unit.
  • the second light emitting device is for emitting a second light beam.
  • the second light sensing device is for receiving the second light beam which is reflected by a second motion trajectory generated by the user and outputting a second image signal accordingly.
  • the second processing unit is for processing the second image signal and outputting a second command signal accordingly.
  • the first command signal includes a gesture command signal or a cursor movement command signal.
  • the gesture command signal includes an upward command signal, a downward command signal, a leftward command signal, a rightward command signal, a clockwise command signal, a counterclockwise command signal, a push forward command signal, a pull backward command signal, or a waving-hand command signal.
  • the cursor movement command signal includes a cursor motion signal or an object motion signal.
  • the second command signal includes a gesture command signal or a cursor movement command signal.
  • the gesture command signal includes an upward command signal, a downward command signal, a leftward command signal, a rightward command signal, a clockwise command signal, a counterclockwise command signal, a push forward command signal, a pull backward command signal, or a waving-hand command signal.
  • the cursor movement command signal includes a cursor motion signal or an object motion signal.
  • the input system is for transmitting the first command signal and the second command signal to an electronic device in a serial or parallel manner, so that the electronic device generates a corresponding action.
  • the action includes a copy, paste, zoom-in, zoom-out or object dragging action.
  • the first light beam and the second light beam are invisible light beams.
  • the first light beam and the second light beam have the same wavelength range. In one embodiment, the first sensible range and the second sensible range do not overlap with each other.
  • the first light beam and the second light beam have different wavelength ranges. In one embodiment, the first sensible range and the second sensible range partially overlap with each other.
  • the present invention provides an input system comprising a light emitting device, a first gesture detection unit and a second gesture detection unit.
  • the light emitting device is for emitting a light beam.
  • the first gesture detection unit has a first sensible range and includes a first light sensing device and a first processing unit.
  • the first light sensing device is for receiving the light beam which is reflected by a first motion trajectory generated by a user and outputting a first image signal accordingly.
  • the first processing unit is for processing the first image signal and outputting a first command signal accordingly.
  • the second gesture detection unit has a second sensible range and includes a second light sensing device and a second processing unit.
  • the second light sensing device is for receiving the light beam which is reflected by a second motion trajectory generated by the user and outputting a second image signal accordingly.
  • the second processing unit is for processing the second image signal and outputting a second command signal accordingly.
  • the input system of the present invention at least has the following advantages: first, the input system can detect different gestures with the first gesture detection unit and the second gesture detection unit so as to generate a combination command; second, both the first gesture detection unit and the second gesture detection unit can operate in the gesture mode and the cursor motion mode, so a greater variety of combination commands can be generated, and the input system of the present invention is therefore superior to and can replace a conventional input system (e.g., the mouse).
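
For readers who prefer a concrete picture of this architecture, the following is a minimal Python sketch of the data flow summarized above. It is an illustration only, not the patent's implementation; all identifiers (CommandSignal, GestureDetectionUnit, classify_trajectory, combine) are assumptions introduced here.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

def classify_trajectory(image_signal: str) -> Optional[str]:
    # Stand-in for the processing unit's trajectory classification;
    # here the "image signal" is already reduced to a trajectory label.
    known = {"up", "down", "left", "right", "clockwise", "counterclockwise",
             "push_forward", "pull_backward", "wave"}
    return image_signal if image_signal in known else None

@dataclass
class CommandSignal:
    kind: str   # "gesture" mode output or "cursor" mode output
    value: str  # e.g. "push_forward", "clockwise", or a cursor trajectory

class GestureDetectionUnit:
    """One detection unit: light sensing device plus processing unit."""

    def __init__(self, mode: str = "gesture"):
        self.mode = mode  # "gesture" or "cursor"

    def process(self, image_signal: str) -> Optional[CommandSignal]:
        trajectory = classify_trajectory(image_signal)
        if trajectory is None:
            return None
        return CommandSignal(self.mode, trajectory)

def combine(first: CommandSignal, second: CommandSignal) -> Tuple[str, str]:
    """Merge the two units' command signals into one combination command."""
    return (first.value, second.value)

unit1 = GestureDetectionUnit("gesture")
unit2 = GestureDetectionUnit("gesture")
cmd = combine(unit1.process("push_forward"), unit2.process("clockwise"))
# cmd == ("push_forward", "clockwise"), e.g. the Ctrl/"+" zoom-in pairing
```

The point of the sketch is the division of labor: each unit independently reduces its image signal to a command signal, and only the last step joins the two into a combination command.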
  • FIGS. 1A-1C are schematic diagrams showing actions by an input system to input commands to an electronic device according to an embodiment of the present invention.
  • FIGS. 2A-2B illustrate that the image displayed on the screen is varied in response to a command inputted by the input system to the electronic device as shown in FIGS. 1A-1C.
  • FIGS. 3A-3C are schematic diagrams showing actions by an input system to input commands to an electronic device according to another embodiment of the present invention.
  • FIGS. 4A-4B illustrate that the image displayed on the screen is varied in response to a command inputted by the input system to the electronic device as shown in FIGS. 3A-3C.
  • FIG. 5 illustrates an input system according to yet another embodiment of the present invention.
  • FIG. 6 illustrates an input system according to still another embodiment of the present invention.
  • FIGS. 1A-1C are schematic diagrams showing actions by an input system to input commands to an electronic device according to an embodiment of the present invention.
  • the input system 100 of this embodiment comprises a first gesture detection unit 110 and a second gesture detection unit 120.
  • the first gesture detection unit 110 and the second gesture detection unit 120 detect the user's different gestures to control the corresponding operation of the electronic device 101.
  • the two gestures are integrated to become a combination command indicative of controlling a zoom-in operation or a zoom-out operation for the screen of the electronic device 101.
  • when the first gesture detection unit 110 detects a gesture indicative of outputting an "object selection" command corresponding to left-clicking a mouse, and the second gesture detection unit 120 detects a gesture indicative of moving an object along a trajectory, the two gestures are integrated to become a combination command indicative of moving the object.
  • the first gesture detection unit 110 and the second gesture detection unit 120 can also be used to respectively detect the gestures of different users who are playing a game on the electronic device 101 (e.g., in a double-player mode). The details of how the first gesture detection unit 110 and the second gesture detection unit 120 detect the user's different gestures so as to control the corresponding operations of the electronic device 101 are explained below.
  • the first gesture detection unit 110 has a first sensible range 110a and includes a first light emitting device 112, a first light sensing device 114 and a first processing unit 116.
  • the first light emitting device 112 is for emitting a first light beam L1.
  • the first light beam L1 is for example but not limited to infrared light, which is invisible light.
  • the first light sensing device 114 is for receiving the first light beam L1 when it is reflected by a first motion trajectory 132 generated by the user, and outputting a first image signal S114 accordingly.
  • the first processing unit 116 is for processing the first image signal S114 and outputting a first command signal S116 accordingly.
  • the first command signal S116 for example can be a gesture command signal S1162 or a cursor movement command signal (not shown), etc.
  • the gesture command signal S1162 for example is an upward command signal, a downward command signal, a leftward command signal, a rightward command signal, a clockwise command signal, a counterclockwise command signal, a push forward command signal, a pull backward command signal, or a waving-hand command signal, etc.
  • the cursor movement command signal includes a cursor motion signal or an object motion signal.
  • the first gesture detection unit 110 detects the gesture under the gesture mode. That is, when detecting a gesture of a pushing forward movement (as shown by the first motion trajectory 132 in FIG. 1A), the first gesture detection unit 110 outputs a corresponding push forward command signal as the gesture command signal S1162.
  • the push forward command signal can represent, for example, the command of the "control" key ("Ctrl") on the keyboard.
  • the second gesture detection unit 120 has a second sensible range 120a and includes a second light emitting device 122, a second light sensing device 124 and a second processing unit 126.
  • the second light emitting device 122 is for emitting a second light beam L2.
  • the second light beam L2 for example is invisible light, and in this embodiment is infrared light having the same wavelength range as the first light beam L1, but it is not limited thereto.
  • the second light sensing device 124 is for receiving the second light beam L2 which is reflected by a second motion trajectory 134 generated by the user and outputting a second image signal S124 accordingly.
  • the second processing unit 126 is for processing the second image signal S124 and outputting a second command signal S126 accordingly.
  • the second command signal S126 for example can be a gesture command signal S1262 or a cursor movement command signal (not shown), etc.
  • the gesture command signal S1262 for example is an upward command signal, a downward command signal, a leftward command signal, a rightward command signal, a clockwise command signal, a counterclockwise command signal, a push forward command signal, a pull backward command signal, or a waving-hand command signal.
  • the cursor movement command signal includes a cursor motion signal or an object motion signal.
  • the second gesture detection unit 120 is also under the gesture mode.
  • when detecting a gesture of a clockwise movement (as shown by the second motion trajectory 134 in FIG. 1B), the second gesture detection unit 120 outputs a corresponding clockwise command signal as the gesture command signal S1262.
  • the clockwise command signal can represent, for example, the command of the plus sign ("+").
  • a combination command combining the command of "Ctrl" and the command of "+" is thereby inputted to the electronic device 101, whereby the screen 101a of the electronic device 101 is zoomed in, as shown in FIGS. 2A-2B.
  • when detecting a gesture of a counterclockwise movement, the second gesture detection unit 120 outputs a corresponding counterclockwise command signal as the gesture command signal S1262.
  • the counterclockwise command signal can represent, for example, the command of the minus sign ("−").
  • a combination command combining the command of "Ctrl" and the command of "−" is thereby inputted to the electronic device 101, whereby the screen 101a of the electronic device 101 is zoomed out, as shown in FIGS. 2A-2B.
  • the user can move the gesture away from the first gesture detection unit 110, whereby the first gesture detection unit 110 detects a gesture of a backward movement (as shown by the first motion trajectory 132a in FIG. 1C) and outputs a corresponding pull backward command signal as the gesture command signal S1162a.
  • the pull backward command signal can represent, for example, "not outputting the Ctrl command".
  • the input system 100 of this embodiment can detect different gestures generated by the user with the first gesture detection unit 110 and the second gesture detection unit 120 under the gesture mode so as to generate a combination command (e.g., as described previously, the combination of the command "Ctrl" and the command "+", or the combination of the command "Ctrl" and the command "−"), for controlling the operation of the electronic device 101 (e.g., the zoom-in/zoom-out action).
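
The Ctrl/plus-minus pairing just described can be summarized in a short sketch. Again this is a hypothetical Python illustration; the mapping tables and the zoom_action function are assumptions drawn from the examples above, not code from the patent.

```python
from typing import Optional

# First unit (gesture mode): push forward holds Ctrl, pull backward releases it.
FIRST_UNIT_MAP = {"push_forward": "Ctrl", "pull_backward": None}
# Second unit (gesture mode): clockwise is "+", counterclockwise is "-".
SECOND_UNIT_MAP = {"clockwise": "+", "counterclockwise": "-"}

def zoom_action(first_gesture: str, second_gesture: str) -> Optional[str]:
    """Return the screen action for a (first unit, second unit) gesture pair."""
    modifier = FIRST_UNIT_MAP.get(first_gesture)
    key = SECOND_UNIT_MAP.get(second_gesture)
    if modifier == "Ctrl" and key == "+":
        return "zoom_in"    # combination command "Ctrl" + "+"
    if modifier == "Ctrl" and key == "-":
        return "zoom_out"   # combination command "Ctrl" + "-"
    return None             # no combination command is formed

assert zoom_action("push_forward", "clockwise") == "zoom_in"
assert zoom_action("push_forward", "counterclockwise") == "zoom_out"
assert zoom_action("pull_backward", "clockwise") is None  # Ctrl released
```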
  • the above-mentioned examples are for illustrative purposes, and are not intended to limit the scope of the present invention.
  • the user can combine multiple gestures in any combination.
  • the user also can combine a gesture (e.g., an upward command signal, a downward command signal, a leftward command signal, a rightward command signal, a clockwise command signal, a counterclockwise command signal, a push forward command signal, a pull backward command signal, or a waving-hand command signal) with one or more hardware inputs such as a keyboard input (e.g., the "control" key, the "shift" key, the "Alt" key, the "+" key, the "−" key, the "upward arrow" key, the "downward arrow" key, the "leftward arrow" key, or the "rightward arrow" key, etc.) so as to generate other combination commands (e.g., page up, page down, scroll-up, scroll-down, copy, paste, cut, open or close, etc.), for controlling the electronic device 101.
  • the input system can transmit the first command signal and the second command signal to the electronic device in a serial or parallel manner.
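
The patent does not specify the transport, so the following sketch only contrasts the two delivery orders; the queue-based "device" and the function names are assumptions made for illustration.

```python
import queue
import threading

def send(signal: str, device: "queue.Queue[str]") -> None:
    device.put(signal)  # stand-in for the actual link to the electronic device

def transmit_serial(first: str, second: str, device: "queue.Queue[str]") -> None:
    # Serial manner: one command signal after the other on the same channel.
    send(first, device)
    send(second, device)

def transmit_parallel(first: str, second: str, device: "queue.Queue[str]") -> None:
    # Parallel manner: both command signals dispatched concurrently;
    # the electronic device merges them into the combination command.
    threads = [threading.Thread(target=send, args=(sig, device))
               for sig in (first, second)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

device: "queue.Queue[str]" = queue.Queue()
transmit_serial("Ctrl", "+", device)    # device sees "Ctrl", then "+"
transmit_parallel("Ctrl", "+", device)  # device sees both; order unspecified
```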
  • the input system 100 can also detect different gestures and motion trajectories generated by the user with the first gesture detection unit 110 and the second gesture detection unit 120 under the cursor mode so as to generate a combination command, hence controlling an object motion of the electronic device 101, as shown in FIGS. 3A-3C.
  • FIGS. 3A-3C are schematic diagrams showing actions by an input system to input commands to an electronic device according to another embodiment of the present invention.
  • the first gesture detection unit 110 detects the gesture under the gesture mode. That is, when detecting a gesture of pushing forward (as shown by the first motion trajectory 132 in FIG. 3A), the first gesture detection unit 110 outputs a corresponding push forward command signal as the gesture command signal S1162.
  • the push forward command signal can represent, for example, the command corresponding to a left-click on a mouse.
  • an object on the screen 101a of the electronic device 101 is thus selected, as shown in FIG. 4A.
  • the input system 100 of this embodiment detects the motion trajectory of another gesture (as shown by the motion trajectory 134 in FIG. 3B) with the second gesture detection unit 120. Because the second gesture detection unit 120 detects the motion trajectory of the gesture under the cursor mode, it outputs a corresponding cursor motion signal as the cursor movement command signal S1264.
  • the cursor motion signal can represent, for example, a trajectory corresponding to the motion trajectory of the gesture. Consequently, the selected object on the screen 101a of the electronic device 101 is dragged or moved to a desired position, as shown in FIG. 4B.
  • the user can move the gesture away from the first gesture detection unit 110, whereby the first gesture detection unit 110 detects a gesture of pulling backward (as shown by the first motion trajectory 132a in FIG. 3C) and outputs a corresponding pull backward command signal as the gesture command signal S1162a.
  • the pull backward command signal can represent, for example, "not outputting the left-click command".
  • the input system 100 of this embodiment can detect a gesture of the user with the first gesture detection unit 110 under the gesture mode and detect the motion trajectory of another gesture of the user with the second gesture detection unit 120 under the cursor mode, so as to generate a combination command for controlling the object motion operation of the electronic device 101.
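
A small state machine captures this select-drag-release sequence. As before, this is an illustrative Python sketch under assumed names (DragController and its methods), not the patent's implementation.

```python
class DragController:
    """Select with unit 1 (gesture mode), move with unit 2 (cursor mode)."""

    def __init__(self) -> None:
        self.button_down = False   # virtual left mouse button
        self.position = (0, 0)     # position of the selected object

    def on_first_unit(self, gesture: str) -> None:
        if gesture == "push_forward":
            self.button_down = True    # "object selection" (left-click held)
        elif gesture == "pull_backward":
            self.button_down = False   # stop outputting the left-click command

    def on_second_unit(self, dx: int, dy: int) -> None:
        # Cursor motion signal: moves the object only while it is selected.
        if self.button_down:
            x, y = self.position
            self.position = (x + dx, y + dy)

drag = DragController()
drag.on_first_unit("push_forward")   # select the object (FIG. 3A / FIG. 4A)
drag.on_second_unit(5, -3)           # drag it along the trajectory (FIG. 3B)
drag.on_first_unit("pull_backward")  # release the selection (FIG. 3C)
assert drag.position == (5, -3)
```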
  • the above-mentioned examples are for illustrative purposes, and are not intended to limit the scope of the present invention.
  • the user can combine multiple gestures in any combination.
  • the user also can combine a gesture (e.g., an upward command signal, a downward command signal, a leftward command signal, a rightward command signal, a clockwise command signal, a counterclockwise command signal, a push forward command signal, a pull backward command signal, or a waving-hand command signal) with one or more hardware inputs such as a keyboard input (e.g., the "control" key, the "shift" key, the "Alt" key, the "+" key, the "−" key, the "upward arrow" key, the "downward arrow" key, the "leftward arrow" key, or the "rightward arrow" key, etc.) so as to generate other combination commands (e.g., page up, page down, scroll-up, scroll-down, copy, paste, cut, open or close, etc.), for controlling the electronic device 101.
  • the first gesture detection unit 110 and the second gesture detection unit 120 can concurrently detect different gestures generated by the same user or by different users under the cursor mode, for example in a game. For example, in a game which is originally designed for one player, if the first gesture detection unit 110 and the second gesture detection unit 120 are concurrently operated under the cursor mode, another player is allowed to join the game, making the game more engaging through multiplayer interaction.
  • the first sensible range 110a and the second sensible range 120a preferably do not overlap with each other, to avoid mis-control or mis-operation. Nevertheless, if the first light beam L1 and the second light beam L2 have different wavelength ranges, mis-control or mis-operation is less likely, and in this case the first sensible range 110a and the second sensible range 120a may overlap with each other.
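
This design rule lends itself to a simple configuration check: overlapping sensible ranges are acceptable only when the two beams occupy different wavelength ranges. The sketch below is a hedged illustration with assumed types; in particular, the one-dimensional sensible ranges stand in for real sensing zones.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DetectionUnit:
    wavelength_nm: Tuple[float, float]    # emitted/sensed wavelength band
    sensible_range: Tuple[float, float]   # simplified 1-D sensing zone

def intervals_overlap(a: Tuple[float, float], b: Tuple[float, float]) -> bool:
    return a[0] < b[1] and b[0] < a[1]

def configuration_is_safe(u1: DetectionUnit, u2: DetectionUnit) -> bool:
    # Non-overlapping sensible ranges are always safe; overlapping ranges
    # require distinct wavelength bands to avoid mis-control.
    if not intervals_overlap(u1.sensible_range, u2.sensible_range):
        return True
    return not intervals_overlap(u1.wavelength_nm, u2.wavelength_nm)

same_band_apart = DetectionUnit((840, 860), (1.5, 2.5))
same_band_near = DetectionUnit((840, 860), (0.0, 1.0))
other_band_near = DetectionUnit((930, 950), (0.5, 1.5))

assert configuration_is_safe(same_band_near, same_band_apart)  # disjoint zones
assert configuration_is_safe(same_band_near, other_band_near)  # distinct bands
assert not configuration_is_safe(same_band_near, same_band_near)
```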
  • FIG. 5 illustrates an input system according to yet another embodiment of the present invention. Please refer to both FIG. 1A and FIG. 5.
  • the input system 200 of this embodiment is substantially the same as the above-mentioned input system 100, but is different in that the input system 200 of this embodiment includes only one single light emitting device 240, and the first gesture detection unit 210 and the second gesture detection unit 220 do not include the above-mentioned first light emitting device 112 and second light emitting device 122, respectively. That is, both the first gesture detection unit 210 and the second gesture detection unit 220 detect the light beam 242 emitted from the same single light emitting device 240.
  • the input system 200 of this embodiment has substantially the same advantages and efficacies as the above-mentioned input system 100, which are not redundantly repeated here.
  • FIG. 6 illustrates an input system according to still another embodiment of the present invention. Please refer to both FIG. 1A and FIG. 6.
  • the input system 300 of this embodiment is substantially the same as the above-mentioned input system 100, but is different in that the input system 300 of this embodiment does not include any light emitting device. That is, the first gesture detection unit 310 and the second gesture detection unit 320 of the input system 300 do not include the above-mentioned first light emitting device 112 and second light emitting device 122, respectively. In other words, both the first gesture detection unit 310 and the second gesture detection unit 320 detect visible light provided from the environment.
  • the input system 300 of this embodiment detects visible light, whereas the above-mentioned input system 100 detects invisible light.
  • the input system 300 of this embodiment has substantially the same advantages and efficacies as the above-mentioned input system 100, which are not redundantly repeated here.
  • the input system of the present invention at least has the following advantages: first, the input system can detect different gestures with the first gesture detection unit and the second gesture detection unit so as to generate a combination command; second, both the first gesture detection unit and the second gesture detection unit can operate in the gesture mode and the cursor motion mode, so a greater variety of combination commands can be generated, and the input system of the present invention is therefore superior to and can replace a conventional input system (e.g., the mouse).

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

US13/907,182 — Input system — priority date 2012-06-20, filed 2013-05-31, status Abandoned — US20130342442A1 (en)

Priority Applications (3)

Application Number | Priority Date | Filing Date | Title
US15/168,825 (now US10372224B2) | 2012-06-20 | 2016-05-31 | Input system
US16/449,366 (now US10606366B2) | 2012-06-20 | 2019-06-22 | Input system
US16/790,783 (now US10824241B2) | 2012-06-20 | 2020-02-14 | Input system

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
TW101121988A (TWI490755B) | 2012-06-20 | 2012-06-20 | Input system
TW101121988 | 2012-06-20 | — | —

Related Child Applications (1)

Application Number | Title | Priority Date | Filing Date
US15/168,825 (Continuation; now US10372224B2) | Input system | 2012-06-20 | 2016-05-31

Publications (1)

Publication Number | Publication Date
US20130342442A1 | 2013-12-26

Family

ID=49774000

Family Applications (4)

Application Number | Title | Priority Date | Filing Date
US13/907,182 (Abandoned) | Input system | 2012-06-20 | 2013-05-31
US15/168,825 (Active; now US10372224B2) | Input system | 2012-06-20 | 2016-05-31
US16/449,366 (Active; now US10606366B2) | Input system | 2012-06-20 | 2019-06-22
US16/790,783 (Active; now US10824241B2) | Input system | 2012-06-20 | 2020-02-14

Family Applications After (3)

Application Number | Title | Priority Date | Filing Date
US15/168,825 (Active; now US10372224B2) | Input system | 2012-06-20 | 2016-05-31
US16/449,366 (Active; now US10606366B2) | Input system | 2012-06-20 | 2019-06-22
US16/790,783 (Active; now US10824241B2) | Input system | 2012-06-20 | 2020-02-14

Country Status (2)

Country Link
US (4) US20130342442A1 (zh)
TW (1) TWI490755B (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190310717A1 (en) * 2012-06-20 2019-10-10 PixArt Imaging Incorporation, R.O.C. Input system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040193413A1 (en) * 2003-03-25 2004-09-30 Wilson Andrew D. Architecture for controlling a computer using hand gestures
US20090109178A1 (en) * 2006-04-28 2009-04-30 Kim Dong-Jin Non-contact selection device

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8086971B2 (en) 2006-06-28 2011-12-27 Nokia Corporation Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications
CN101650520A (zh) * 2008-08-15 2010-02-17 Sony Ericsson Mobile Communications Co., Ltd. Visible laser touchpad of mobile phone and method
TW201042507A (en) * 2009-05-19 2010-12-01 Pixart Imaging Inc Interactive image system and operating method thereof
TW201104494A (en) * 2009-07-20 2011-02-01 J Touch Corp Stereoscopic image interactive system
US8547327B2 (en) * 2009-10-07 2013-10-01 Qualcomm Incorporated Proximity object tracker
TWI431538B (zh) * 2010-04-30 2014-03-21 Acer Inc Image-based motion gesture recognition method and system
US9213438B2 (en) * 2011-06-02 2015-12-15 Omnivision Technologies, Inc. Optical touchpad for touch and gesture recognition
US20130314377A1 (en) * 2012-05-25 2013-11-28 Oleg Los Optical touch sensor apparatus
TWI490755B (zh) * 2012-06-20 2015-07-01 Pixart Imaging Inc Input system
US9423886B1 (en) * 2012-10-02 2016-08-23 Amazon Technologies, Inc. Sensor connectivity approaches
TWI479363B (zh) * 2012-11-26 2015-04-01 Pixart Imaging Inc Portable computer having pointing function and pointing system
US9958954B2 (en) * 2012-12-13 2018-05-01 3M Innovative Properties Company System and methods for calibrating a digitizer system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040193413A1 (en) * 2003-03-25 2004-09-30 Wilson Andrew D. Architecture for controlling a computer using hand gestures
US20090109178A1 (en) * 2006-04-28 2009-04-30 Kim Dong-Jin Non-contact selection device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190310717A1 (en) * 2012-06-20 2019-10-10 PixArt Imaging Incorporation, R.O.C. Input system
US10606366B2 (en) * 2012-06-20 2020-03-31 Pixart Imaging Incorporation Input system
US10824241B2 (en) * 2012-06-20 2020-11-03 Pixart Imaging Incorporation Input system

Also Published As

Publication number Publication date
US10606366B2 (en) 2020-03-31
US10372224B2 (en) 2019-08-06
US20160274672A1 (en) 2016-09-22
US10824241B2 (en) 2020-11-03
US20200183500A1 (en) 2020-06-11
US20190310717A1 (en) 2019-10-10
TWI490755B (zh) 2015-07-01
TW201401132A (zh) 2014-01-01

Similar Documents

Publication Publication Date Title
CN102576279B (zh) User interface
CN106292859B (zh) Electronic device and operating method thereof
US9990062B2 (en) Apparatus and method for proximity based input
US10114485B2 (en) Keyboard and touchpad areas
CN103123543B (zh) 多点触控鼠标
US20110018806A1 (en) Information processing apparatus, computer readable medium, and pointing method
US20110227947A1 (en) Multi-Touch User Interface Interaction
TWI451309B (zh) Touch device and its control method
US20150193023A1 (en) Devices for use with computers
JP2007334870A (ja) Method and system for mapping the position of a direct input device
CN102163096A (zh) Information processing device, information processing method, and program
CN101438229A (zh) Multi-function key with scrolling
JP6194355B2 (ja) Improvements in devices for use with computers
US20170285908A1 (en) User interface through rear surface touchpad of mobile device
US20220413634A1 (en) Computer mouse providing a touchless input interface
JP5275429B2 (ja) Information processing device, program, and pointing method
TW201218036A (en) Method for combining at least two touch signals in a computer system
US10824241B2 (en) Input system
US20130021367A1 (en) Methods of controlling window display on an electronic device using combinations of event generators
CN102207817A (zh) Electronic reading device and cursor control method thereof
US10338692B1 (en) Dual touchpad system
KR20140086805A (ko) 전자 장치, 그 제어 방법 및 컴퓨터 판독가능 기록매체
KR101136327B1 (ko) 휴대 단말기의 터치 및 커서 제어방법 및 이를 적용한 휴대 단말기

Legal Events

Date Code Title Description
AS Assignment

Owner name: PIXART IMAGING INCORPORATION, R.O.C., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HUANG, YU-HAO;LEE, YI-FANG;KAO, MING-TSAN;REEL/FRAME:030525/0847

Effective date: 20130530

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION