US20160187998A1 - Systems And Methods For Processing Data Entered Using An Eye-Tracking System - Google Patents
- Publication number
- US20160187998A1 (U.S. patent application Ser. No. 15/066,288)
- Authority
- US
- United States
- Prior art keywords
- value
- key
- input
- input data
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/0227—Cooperation and interconnection of the input arrangement with other functional units of a computer
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Input From Keyboards Or The Like (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A method for processing data entered using a keyboard, and a keyboard capable of implementing the method, is described. In one embodiment, the keyboard includes keys associated with at least one input data value and at least one multi-value key associated with at least two input data values. Each key includes a unit, each multi-value key includes a plurality of units, and each input data value corresponds to at least one unit. The keyboard also includes an eye-tracking system that determines an area where a user is gazing by monitoring eye movements of the user, and an input value selection module coupled to the units and to the eye-tracking system. The input value selection module determines probabilistically which input data value was entered based on a value received from at least one unit and on the area where the user is gazing.
Description
- This application is a continuation of U.S. patent application Ser. No. 14/062,414 titled “Systems And Methods For Processing Data Entered Using An Eye-Tracking System”, filed Oct. 24, 2013, now issued as U.S. Pat. No. 9,285,891, which is a continuation of U.S. patent application Ser. No. 12/748,642 titled “Systems And Methods For Processing Data Entered Using An Eye-Tracking System,” filed Mar. 29, 2010, now issued as U.S. Pat. No. 8,576,175, which is a continuation of U.S. patent application Ser. No. 11/206,596 titled “Systems And Methods For Processing Data Entered Using An Eye-Tracking System,” filed Aug. 18, 2005, now issued as U.S. Pat. No. 7,719,520, the entire disclosures of which are hereby incorporated by reference in their entireties.
- The present invention relates to electronic devices, and more particularly to a system for processing data entered at an electronic device.
- Handheld electronic devices, such as mobile phones, cameras and personal digital assistants (PDAs), offer a multitude of services and functions. For example, with the development of wireless technologies, many handheld devices are capable of connecting a user to a wireless network, such as the Internet, and allowing the user to send and receive information to and from other users via the wireless network. Moreover, many handheld devices can load and run software applications that allow the user to perform computing tasks.
- While handheld devices can offer performance and versatility, the available services can sometimes be awkward to use because of the device's size. Of particular concern is data entry, e.g., inputting text for an electronic message. Most handheld devices lack the space to provide a full keyboard for entering data and instead utilize several known techniques to allow the user to create words or number patterns. Nevertheless, each technique has its disadvantages.
- For example, mobile phones typically provide a conventional number-key pad where each number key, i.e., keys 2-9, is associated with three to four alphanumeric characters. By pressing specific keys, the user can form words. One method for creating words using the number-key pad uses a prediction algorithm that relies on an electronic dictionary of common words to predict a word based on the key(s) pressed. This technique, referred to as the T-9 technique, allows the user to tap a key, and an input value selection module in the device checks the dictionary to resolve ambiguities between the letters associated with the tapped key. Although the T-9 technique is a single-tap method that can allow fast data entry, it is not intuitive and can be deficient when two or more words are formed by the same sequence of tapped keys. In that instance, the T-9 technique cannot disambiguate between the words themselves and the user must resolve the ambiguity.
- Another word forming technique using the number-key pad is referred to as multi-tapping. In multi-tapping, the user enters a particular letter by pressing the number key associated with the letter at least one time to scroll to the desired letter. Once the desired letter is found, e.g., displayed on the screen, the user must wait until the selection is committed and the letter is inputted before proceeding to the next letter. While multi-tapping allows the user to form words without ambiguity, it is time-consuming and awkward.
- Other handheld devices, such as PDAs, can display a full keyboard on a touch screen, and the user selects letters by touching or tapping on the touch screen with a stylus. In other similar devices, a full miniature keyboard is provided in the device itself. In both instances, the size of the displayed and physical keyboard is reduced to accommodate the space limitations. Because the keys are small, they are difficult to see or press, making text entry awkward and slow as well as error prone.
- Other text entry techniques exist but suffer from the same and other shortcomings described above. Most of these techniques, in addition to being awkward and slow, are not intuitive and/or require the user to spend much time practicing in order to become somewhat proficient.
- The present invention provides a method and system for entering data using an eye-tracking system in combination with a plurality of manually activated keys. In one embodiment, a keyboard includes keys each associated with at least one input data value and at least one multi-value key associated with at least two input data values. Each multi-value key includes a plurality of units. Each input data value corresponds to at least one unit. The keyboard also includes an eye-tracking system that determines an area where a user is gazing by monitoring eye movements of the user, and an input value selection module coupled to the units and to the eye-tracking system. The input value selection module determines probabilistically which input data value was entered based on a value received from at least one unit and on the area where the user is gazing.
- In another embodiment, the keyboard includes a plurality of data input keys. Each input key is associated with at least one input data value and at least one data input key is a multi-value key associated with at least two input data values. The keyboard also includes an eye-tracking system that determines an area where a user is gazing by monitoring eye movements of the user and that determines an area covering portions of at least two data input keys. The keyboard also includes an input value selection module coupled to the plurality of data input keys and to the eye-tracking system that determines which input data value was entered based on a manually pressed data input key and on the area where the user is gazing.
- In another embodiment, a method for processing data entered at a keyboard having a plurality of keys, wherein each key includes a unit, each key is associated with an input data value, and the input data value corresponds to the unit, the plurality of keys including at least one multi-value key associated with at least two input data values, includes assigning a plurality of units to each multi-value key, where each input data value corresponds to at least one unit, determining an area where a user is gazing by tracking the user's eye movement, detecting a manual selection of one of a key and a multi-value key, and determining probabilistically which input data value was entered based on a value received from at least one unit and on the area where the user is gazing.
- In another embodiment, a method for processing data entered at an electronic device having a plurality of data input keys, where each data input key is associated with at least one input data value and at least one data input key is a multi-value key associated with at least two input data values, includes determining an area where a user is gazing by tracking the user's eye movement, where the area covers portions of at least two data input keys, detecting a manual selection of a data input key, and determining which input data value was entered based on the manually pressed data input key and on the area where the user is gazing.
- The various features of the present invention and the manner of attaining them will be described in greater detail with reference to the following description, claims and drawings, wherein reference numerals are reused, where appropriate, to indicate a correspondence between the referenced items, and wherein:
- FIG. 1A is a schematic diagram of an exemplary electronic device according to an embodiment;
- FIG. 1B is a system block diagram of the electronic device according to an embodiment;
- FIGS. 2A, 2B and 2C illustrate an exemplary multi-value key that is associated with four (4) input values according to an embodiment;
- FIG. 3 is a flowchart illustrating a process for processing data entered at the electronic device according to an embodiment shown in FIGS. 1A, 1B, and 2A-2C;
- FIG. 4 is a flowchart illustrating a process for calculating the probability factor for an input value according to one embodiment;
- FIG. 5 is a schematic diagram of an exemplary electronic device according to another embodiment;
- FIG. 6 is an illustration for a group of four (4) multi-value input keys according to an embodiment; and
- FIG. 7 is a flowchart illustrating a process for processing data entered at the electronic device according to the embodiment shown in FIGS. 5 and 6.
- The present invention relates to electronic devices, and more particularly to methods and systems for processing data entered at an electronic device. The following description is presented to enable one of ordinary skill in the art to make and use the invention and is provided in the context of a patent application and its requirements. Various modifications to the preferred embodiments and the generic principles and features described herein will be readily apparent to those skilled in the art. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features described herein.
- According to one embodiment, an eye-gaze or eye-tracking system is incorporated into an electronic device that has data input keys that are associated with two or more input values. The electronic device uses the eye-tracking system to determine where a user is gazing while the user presses a data input key. Based on the eye-tracking and key press data, the electronic device is able to determine which of the two or more input values associated with the pressed key was entered.
- FIG. 1A is a schematic diagram of an exemplary electronic device according to an embodiment, and FIG. 1B is a system block diagram of the electronic device. Referring to FIGS. 1A and 1B, the electronic device 10 includes a plurality of data input keys 20, an eye-tracking system 40, an input value selection module 30, and a display screen 50. The data input keys 20 can be arranged in rows and columns to form a keypad 25 on a face 12 of the electronic device 10. Each data input key 20 is associated with at least one input value 22, and at least one of the data input keys is a multi-value key 20a that is associated with at least two data input values 22a, 22b, 22c.
- In a preferred embodiment, the eye-tracking system 40 is located on the same face 12 of the electronic device as the keypad 25. The eye-tracking system 40 tracks and records the movement of a user's eye(s) to determine an area where the user is gazing. The eye-tracking system 40 can use many different known techniques to monitor and track the user's eye movements. For example, the eye-tracking system 40 can utilize a technique known as corneal reflection, which directs an infrared light beam at the user's eye and measures the angular difference between the operator's mobile pupil and the stationary light beam reflection. Alternatively, the eye-tracking system 40 can scan the user's eye region with a scanning apparatus, e.g., a television camera, and analyze the resulting image. Commercially available eye-tracking systems 40 that can be appropriate for the present invention include, for example, the QUICK GLANCE product developed by EyeTech Digital Systems of Mesa, Ariz.
- In order to enter an input value, particularly one that is associated with a multi-value key 20a, e.g., 22a, the user locates the key 20a with which the input value 22a is associated and gazes at or near a corresponding representation of input value 22a on the face of the key while pressing the key 20a. The input value selection module 30 receives the eye-tracking data as well as the data related to the selected, e.g., manually pressed, key 20a, and analyzes the received data to determine which input value 22a was entered. The entered value 22a is then preferably displayed on a screen 50.
- In one embodiment, the accuracy with which the input value selection module 30 can determine the input value 22a is improved by dividing each multi-value key 20a into a plurality of units and associating each of the input values 22a-22c with at least one of the units. In a preferred embodiment, a representation of each input value is provided on a portion of the multi-value key that coincides with the associated unit(s). Each unit, in turn, is associated with at least one pressure sensor that is capable of measuring a pressure asserted on the associated unit.
- FIG. 2A is an illustration of an exemplary multi-value key 200 that is associated with four (4) input values 202a-202d. As is shown in FIG. 2B, the multi-value key 200 is divided into four (4) units 210a-210d and each of the input values 202a-202d is associated with one of the four units 210a-210d. In FIG. 2B, the multi-value key 200 is divided into four (4) substantially equal units 210a-210d merely for the sake of clarity. The number of units can be greater than four (4) and, in such circumstances, each input value, e.g., 202a, can be associated with more than one unit 210a. Similarly, the shape and size of each unit 210 can vary so long as each input value 202a-202d is associated with at least one unit 210.
- As is shown in FIG. 2C, each unit 210a-210d may be associated with a pressure sensor 220a-220d that is capable of quantifying the amount of pressure asserted on the associated unit 210a-210d. Although not shown in FIG. 2C, each unit, e.g., 210a, can be associated with more than one pressure sensor 220a, and the shape and size of the pressure sensors 220a-220d can vary.
- According to this embodiment, during data entry, the user can press the portion of the multi-value key on which the desired input value, e.g., 202a, is provided, while gazing at the desired input value 202a. Each pressure sensor 220a-220d measures the pressure asserted on its associated unit 210a-210d. The input value selection module 30 receives the pressure measurements from each of the pressure sensors 220a-220d along with the eye-tracking information and determines which of the associated input values 202a-202d was entered.
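- For illustration, here is a minimal sketch of how the FIG. 2 arrangement might be modeled in software. The class and method names (Unit, MultiValueKey, read_pressures) are hypothetical and not taken from the patent, and the assignment of letters a, b, c to key 200 is an assumption based on a conventional phone keypad:

```python
from dataclasses import dataclass, field

@dataclass
class Unit:
    """One region of a multi-value key, backed by one or more pressure sensors."""
    input_value: str   # the character this unit represents, e.g. "2"
    sensor_ids: list   # the pressure sensor(s) covering this unit

@dataclass
class MultiValueKey:
    """A key divided into units, with one or more units per input value."""
    units: list = field(default_factory=list)

    def read_pressures(self, sensors):
        """Return the measured pressure for each input value on this key.

        `sensors` maps sensor id -> latest pressure reading; a value backed
        by several units/sensors gets the sum of its readings.
        """
        readings = {}
        for unit in self.units:
            readings[unit.input_value] = readings.get(unit.input_value, 0.0) + \
                sum(sensors[sid] for sid in unit.sensor_ids)
        return readings

# The key of FIG. 2A: four units 210a-210d, one pressure sensor 220a-220d each.
key_200 = MultiValueKey(units=[
    Unit("2", ["220a"]), Unit("a", ["220b"]),
    Unit("b", ["220c"]), Unit("c", ["220d"]),
])
print(key_200.read_pressures({"220a": 0.8, "220b": 0.3, "220c": 0.1, "220d": 0.05}))
```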
- FIG. 3 is a flowchart illustrating a process for processing data entered at the electronic device 10 shown in FIGS. 1 and 2. The process begins by activating the keypad 25 (step 300), for example, by unlocking the input keys 20. The activation of the keypad 25 can also activate the eye-tracking system 40, which then monitors and records the user's eye movements (step 302). Activation of the keypad 25 and/or eye-tracking system 40 does not necessarily require an additional action or input, but can occur automatically when the electronic device 10 is powered up. According to a preferred embodiment, the eye-tracking system 40 monitors and records, among other things, an area where the user is gazing, which input value(s) are in the area, and the amount of time the user's gaze rests on the input value(s).
- When the user presses a multi-value key 200 (step 304) to enter a desired input value, e.g., 202a, each pressure sensor 220a-220d measures the amount of pressure asserted on the associated unit 210a-210d. The pressure measurements associated with each unit 210a-210d are received by the input value selection module 30, which assigns to each input value 202a-202d a pressure weight, P, based on the corresponding pressure measurement(s) (step 306). In one embodiment, the pressure weight, P, can be a fraction ranging from zero (0), corresponding to the lowest pressure measurement, to one (1.0), corresponding to the highest pressure measurement.
- Thus, in the example above where the user wants to enter input value “2” (202a), the user preferably presses at or near the upper left-hand corner of the input key 200, which is the portion of the key on which the “2” is provided and which substantially coincides with the unit 210a associated with the input value 202a. The pressure sensors 220a-220d measure the pressure asserted on each of the units 210a-210d. Because the user presses the portion of the key 200 substantially coinciding with the unit 210a corresponding to the desired input value 202a, the pressure weight, P, assigned to that input value 202a will naturally be greater than the pressure weight assigned to an input value, e.g., 202d, associated with a portion of the key 200 that was not pressed directly.
- In addition to processing the pressure measurements, the input value selection module 30 also analyzes the eye-tracking data to determine how long the user was gazing at any of the input values 202a-202d prior to pressing the key 200 and assigns a gaze weight, G, to each input value 202a-202d based on the amount of time the user was gazing at the input value 202a-202d (step 308). Like the pressure weight, P, the gaze weight, G, can be a fraction ranging from zero (0), corresponding to the least amount of gaze time, to one (1.0), corresponding to the greatest amount of gaze time. Once the pressure weight, P, and gaze weight, G, for each input value 202a-202d are assigned, the input value selection module 30 calculates a probability factor, PF, for each input value (step 310).
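- The patent fixes only the range of P and G (zero for the lowest measurement, 1.0 for the highest), not a particular mapping. One plausible sketch, assuming a simple linear scaling by the peak reading, is shown below; the function name and the numeric readings are illustrative only:

```python
def normalize(raw):
    """Map raw per-value measurements to weights in [0, 1.0].

    The largest reading maps to 1.0 and the rest scale linearly, matching
    the stated range for P (step 306) and G (step 308). Other monotone
    mappings would fit the description equally well.
    """
    peak = max(raw.values())
    if peak == 0:
        return {k: 0.0 for k in raw}
    return {k: v / peak for k, v in raw.items()}

# Illustrative raw data for the four input values of key 200:
pressures = {"2": 0.8, "a": 0.3, "b": 0.1, "c": 0.05}   # sensor readings
gaze_ms   = {"2": 420, "a": 60, "b": 0, "c": 15}         # gaze dwell, ms

P = normalize(pressures)   # pressure weights (step 306)
G = normalize(gaze_ms)     # gaze weights (step 308)
```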
- FIG. 4 is a flowchart illustrating a process for calculating the probability factor for an input value, e.g., 202a, according to one embodiment. For each input value 202a-202d, the assigned pressure weight, P, and gaze weight, G, are added to form a combined weight, C (step 400). The combined weights, C, for the input values 202a-202d are summed to form a total weight, T (step 402). The probability factor, PF, for each input value 202a-202d is calculated by dividing the combined weight, C, of the input value 202a by the total weight, T (step 404). Thus, for an input value, i, the probability factor, PF(i), is determined by:
PF(i) = C(i)/T
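- A minimal, self-contained sketch of the FIG. 4 computation follows; select_input_value is a hypothetical name, and the weights are invented for illustration:

```python
def select_input_value(P, G):
    """Pick the entered value per FIG. 4.

    Step 400: C(i) = P(i) + G(i) for each input value i.
    Step 402: T = sum of all C(i).
    Step 404: PF(i) = C(i)/T; the value with the greatest PF is entered.
    """
    C = {i: P[i] + G[i] for i in P}
    T = sum(C.values())
    PF = {i: c / T for i, c in C.items()}
    return max(PF, key=PF.get), PF

# Illustrative pressure and gaze weights for the four values of key 200:
P = {"2": 1.0, "a": 0.4, "b": 0.1, "c": 0.05}
G = {"2": 1.0, "a": 0.1, "b": 0.0, "c": 0.0}
value, PF = select_input_value(P, G)   # value == "2"
```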
- Referring again to FIG. 3, after the probability factor for each of the input values 202a-202d is calculated (step 310), the input value selection module 30 selects the input value having the greatest probability factor and displays that input value (step 312).
- Based on the embodiment described above, the methods and systems disclosed take advantage of how the user intuitively enters data. That is, the user searches the keypad 25 for the key 20 that includes the input value 22 the user desires to enter. When the user's gaze falls upon the desired input value 22, the user instinctively focuses on the input value 22 and presses the portion of the key 20 displaying the input value 22. The various components of the electronic device 10 according to this embodiment monitor these actions to allow the user to enter the desired input value. The user can enter the desired input value with a single tap and need not wait a prescribed time before entering another input value. Accordingly, this embodiment is easy to use and provides fast data entry.
- In another embodiment, illustrated in FIG. 5, the accuracy with which the input value selection module 30 can determine the input value 22a is improved by defining a plurality of gaze areas 14 on the face 12 of the electronic device 10a. According to this embodiment, each defined gaze area 14 covers a portion of at least two data input keys 20, and preferably is associated with one input value in each of the at least two data input keys 20. Moreover, in this embodiment, the input value selection module 30 can determine the input value 22a even when each input key 20 includes only one input sensor that is associated with the multiple input values 202a assigned to the input key 20. The input sensor is activated when the associated input key 20 is pressed. The input sensor may be a simple momentary switch providing two binary values (0 or 1) or may be a pressure sensor as described above.
- For example, FIG. 6 is an illustration for a group of four (4) multi-value input keys 200a-200d according to this embodiment of the present invention. As is shown, one gaze area 14a includes an area corresponding to one input value of each of the four (4) keys 200a-200d. Another gaze area 14b includes a different area corresponding to a different set of input values in each of two (2) keys 200c, 200d. The gaze areas 14a, 14b do not overlap, and therefore, any particular input value, e.g., 202d, is associated with only one gaze area 14a.
- Referring again to FIG. 5, each gaze area 14 can also include a focal point 16, which is preferably near the center of the gaze area 14. The focal point 16 can be a feature, such as a dot, so that the user can focus on the point 16. Alternatively, the focal point 16 can be unmarked and merely defined as near or around the center of the gaze area 14. Other ways of designating a gaze area 14 and its focal point 16 are available, as those skilled in the art would readily appreciate.
- Referring again to FIG. 6, during data entry, the user can press the multi-value key, e.g., 200a, associated with the desired input value, e.g., 202d, while gazing at or near the focal point 16a associated with the gaze area 14a. The input value selection module 30 receives the key press data along with the eye-tracking information and determines which of the input values 202a-202d associated with the pressed key 200a was entered by selecting the input value 202d associated with the gaze area 14a.
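- A minimal sketch of this second embodiment, assuming the (pressed key, gaze area) pairing can be represented as a simple lookup table. Because gaze areas do not overlap and each area covers one value per key, the pair identifies exactly one input value; the table contents below are invented to echo the FIG. 6 arrangement and are not taken from the patent:

```python
# Each (key, gaze_area) pair identifies exactly one input value.
VALUE_TABLE = {
    ("200a", "14a"): "2",
    ("200b", "14a"): "3",
    ("200c", "14a"): "5",
    ("200d", "14a"): "6",
    ("200c", "14b"): "j",   # a hypothetical second value on key 200c
    ("200d", "14b"): "m",   # a hypothetical second value on key 200d
}

def resolve(pressed_key: str, gaze_area: str) -> str:
    """Return the input value selected by a key press combined with the
    gaze area reported around the time of the press (steps 706-710)."""
    return VALUE_TABLE[(pressed_key, gaze_area)]

print(resolve("200a", "14a"))  # -> "2"
```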
- FIG. 7 is a flowchart illustrating a process for processing entered data using the electronic device 10 according to the embodiment shown in FIGS. 5 and 6. The process begins by activating the keypad 25 (step 700), for example, by unlocking the input keys 20 or by powering up the electronic device 10. The activation of the keypad 25 can also activate the eye-tracking system 40, which then monitors and records the user's eye movements (step 702). According to a preferred embodiment, the eye-tracking system 40 monitors and records, among other things, at which gaze area 14 the user is gazing and, optionally, an amount of time the user is gazing at a particular gaze area 14. Preferably, the user focuses on the focal point 16 of the gaze area 14 so that the eye-tracking system 40 can determine the gaze area 14 more accurately by increasing the margin for error.
- When the user presses a multi-value key 200a (step 704) to enter a desired input value, e.g., 202d, the input value selection module 30 determines which input key 200a was pressed (step 706) and determines at which gaze area 14a the user is gazing (step 708) during a time period either immediately before and/or after the key 200a was pressed. The time period can be short such that the gazing action is substantially simultaneous with the key pressing event. On the other hand, the time period can be longer to ensure that the input value selection module 30 detects the correct gaze area 14a. Once the gaze area 14a and the pressed key 200a are determined, the input value selection module 30 displays the entered input value 202d associated with both the pressed key 200a and the gaze area 14a (step 710).
- This embodiment also takes advantage of how the user intuitively enters data. That is, the user searches for the key 200a that is associated with the desired input value 202d and presses the key 200a while gazing at a designated focal point 16a near the desired input value 202d. The user can enter the desired input value 202d with a single tap and need not wait a prescribed time before entering another input value. Accordingly, this embodiment is also easy to use and provides fast data entry.
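- The patent leaves open how the gaze area is picked from the eye-tracking record during the time period around the press (step 708). One plausible sketch, in which the window length and the majority-vote rule are both assumptions, is:

```python
def gaze_area_near_press(samples, press_time, window=0.25):
    """Pick the gaze area observed most often within `window` seconds on
    either side of the key press.

    `samples` is a list of (timestamp_seconds, gaze_area) tuples as recorded
    by the eye-tracking system (step 702); roughly uniform sampling is
    assumed. A longer window trades responsiveness for robustness, as the
    description notes.
    """
    dwell = {}
    for t, area in samples:
        if press_time - window <= t <= press_time + window:
            dwell[area] = dwell.get(area, 0) + 1
    return max(dwell, key=dwell.get) if dwell else None

samples = [(0.90, "14b"), (0.95, "14a"), (1.00, "14a"), (1.05, "14a")]
print(gaze_area_near_press(samples, press_time=1.0))  # -> "14a"
```

The result can then be fed, together with the pressed key, to a lookup like the one sketched above.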
- Although the embodiments described above each utilize a keypad 25, the principles described herein can also be used with a standard-sized keyboard that includes multi-value keys. For example, in a typical QWERTY keyboard, the key associated with the input value “2” is also associated with the character “@.” Utilizing the systems described herein, the user can enter the character “@” by gazing at the “@” and typing the key. In another example, foreign language keyboards often require the user to press a sequence of keys to enter a symbol. With the system of the present invention, the symbol can be associated with a key and can be entered with a single tap.
- Moreover, the aspects described herein can also be combined with other letter prediction techniques, such as a word-based letter prediction technique that is based on a dictionary or an application-specific set of words. In this aspect, a word-based technique yields a probability W for a given letter, where the larger W is, the more probable the given letter. The probability factor PF calculated by the input value selection module 30 can be combined with W to produce a combined probability weight CW. Depending on how PF and W are combined, either technique can be given priority. For example, let CW = PF × W. Here, both techniques are given equal priority. Alternatively, let CW = PF^(1/n) × W. Here, the key-press-gaze technique is given higher priority than the word-based technique.
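- A short sketch of this combination step; the choice of n and the source of W are unspecified in the patent, so both are assumptions here:

```python
def combined_weight(PF: float, W: float, n: int = 1) -> float:
    """Blend the key-press-gaze probability factor PF with a word-based
    letter probability W.

    n == 1 reproduces CW = PF * W (equal priority); n > 1 gives
    CW = PF**(1/n) * W, which the description characterizes as giving the
    key-press-gaze technique higher priority than the word-based technique.
    """
    return (PF ** (1.0 / n)) * W

# Equal priority vs. the key-press-gaze-weighted combination:
print(combined_weight(0.6, 0.3))        # CW = PF * W
print(combined_weight(0.6, 0.3, n=2))   # CW = PF^(1/2) * W
```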
- The present invention has been described in accordance with the embodiments shown, and one of ordinary skill in the art will readily recognize that there could be variations to the embodiments, and any variations would be within the spirit and scope of the present invention. Software written according to the present invention is to be stored in some form of computer-readable medium, such as memory or CD-ROM, or transmitted over a network, and executed by an input value selection module. Consequently, a computer-readable medium is intended to include a computer readable signal which, for example, may be transmitted over a network. Accordingly, many modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the appended claims.
Claims (23)
1. A keyboard comprising:
a plurality of keys, each associated with an input data value wherein each key includes a unit and the input data value corresponds to the unit, the plurality of keys including at least one multi-value key associated with at least two input data values, wherein each multi-value key includes a plurality of units and each input data value corresponds to at least one of the plurality of units and at least one input data value of the at least two input data values of one or more of the at least one multi-value keys corresponds to a symbol;
an eye-tracking system for determining an area where a user of the keyboard is gazing by monitoring an eye movement of the user; and
an input value selection module coupled to the units of the plurality of keys and to the eye-tracking system;
wherein the input value selection module determines probabilistically which input data value was entered based on a value received from at least one unit of the units and on the area where the user is gazing.
2. The keyboard of claim 1 further comprising:
a plurality of pressure sensors coupled to the input value selection module, wherein each unit of the plurality of units of the at least one multi-value key is associated with at least one of the plurality of pressure sensors and each unit of the plurality of keys is associated with at least one of the plurality of pressure sensors and wherein each of the pressure sensors is configured to measure a pressure asserted on the associated unit of the plurality of keys including the at least one multi-value key;
wherein the input value selection module concurrently analyzes the pressure measurements and the area at which the user is gazing to determine which input data value was entered.
3. The keyboard of claim 2 wherein the input value selection module is configured for:
assigning a pressure weight to each of the input data values associated with the plurality of keys including the at least one multi-value key, the pressure weight corresponding to measured pressure asserted on the unit corresponding to the input data value;
assigning a gaze weight to each of the input data values based on an amount of time that the area at which the user is gazing coincides with the unit corresponding to the input data value;
combining the pressure weight and the gaze weight of each input data value; and
entering the input data value based on the combined pressure and gaze weights.
4. The keyboard of claim 2 wherein the area at which the user is gazing substantially coincides with a unit associated with an input data value.
5. The keyboard of claim 1 wherein the keyboard is associated with a handheld device.
6. The keyboard of claim 5 wherein the handheld device is either a camera, a telephone, or a personal digital assistant.
7. A keyboard comprising:
a plurality of data input keys, wherein each input key is associated with at least one input data value and at least one data input key is a multi-value key associated with at least two input data values and at least one input data value of the at least two input data values of one or more of the at least one data input key that is a multi-value key corresponds to a symbol;
an eye-tracking system for determining an area where a user is gazing by monitoring an eye movement of the user, wherein the eye-tracking system determines an area covering portions of at least two of the plurality of data input keys; and
an input value selection module coupled to the plurality of data input keys and to the eye-tracking system;
wherein the input value selection module determines which input data value was entered based on a manually pressed data input key and on the area where the user is gazing.
8. The keyboard of claim 7 comprising a plurality of defined gaze areas, at least one gaze area overlapping at least two multi-value keys.
9. The keyboard of claim 8 wherein each defined gaze area is associated with one input data value in each of a first and a second multi-value key.
10. The keyboard of claim 8 wherein each defined gaze area is associated with one input data value in each of a first, a second, a third and a fourth multi-value key.
11. The keyboard of claim 9 wherein the input value selection module enters the input data value that is associated with the gaze area and that is associated with the manually pressed multi-value key.
12. The keyboard of claim 7 wherein the keyboard is associated with a handheld device.
13. The keyboard of claim 12 wherein the handheld device is either a camera, a telephone, or a personal digital assistant.
14. A method for processing data entered at a keyboard having a plurality of keys each associated with an input data value, the plurality of keys including at least one multi-value key associated with at least two input data values, the method comprising:
assigning a unit to each key;
assigning a plurality of units to each multi-value key, wherein each input data value corresponds to at least one unit, wherein at least one input data value of the at least two input data values of one or more of the at least one multi-value key corresponds to a symbol;
determining an area where a user is gazing by tracking the user's eye movement;
detecting a manual selection of one of a key and a multi-value key; and
determining probabilistically an input data value entered based on a value received from at least one unit and on the area where the user is gazing.
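Claim 14 leaves the probabilistic determination unspecified; one plausible reading, sketched below, scores each candidate value by combining the sensor readings of its assigned units with the units' overlap with the gaze area. The product-of-evidence scoring and the input structures are assumptions, not the claimed algorithm.

```python
# Sketch of the probabilistic determination in claim 14 under the
# assumptions stated above.

def determine_value(unit_readings, gaze_overlap, value_units):
    """Score each candidate input data value and return the most likely one.

    unit_readings -- dict: unit id -> sensor value from the manual selection
    gaze_overlap  -- dict: unit id -> fraction of the gaze area covering the unit
    value_units   -- dict: input data value -> list of unit ids assigned to it
    """
    scores = {}
    for value, units in value_units.items():
        touch = sum(unit_readings.get(u, 0.0) for u in units)
        gaze = sum(gaze_overlap.get(u, 0.0) for u in units)
        scores[value] = touch * gaze  # both evidence sources must agree
    return max(scores, key=scores.get)

# A multi-value key carries '7' (unit u1) and the symbol '&' (unit u2);
# the press registers on both units but the gaze sits mostly over u2,
# so '&' is entered.
print(determine_value({"u1": 0.4, "u2": 0.5},
                      {"u1": 0.1, "u2": 0.7},
                      {"7": ["u1"], "&": ["u2"]}))  # -> '&'
```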
15. The method of claim 14 further comprising:
assigning each unit of the plurality of keys including the at least one multi-value key to at least one pressure sensor, wherein each of the pressure sensors is configured to measure and quantify a pressure asserted on the associated unit of the plurality of keys including the at least one multi-value key; and
concurrently analyzing the pressure measurements and the area at which the user is gazing to determine which input data value was entered.
16. The method of claim 15 further comprising defining the area at which the user is gazing as substantially coinciding with a unit associated with an input data value.
17. The method of claim 15 further comprising:
assigning a pressure weight to each of the input data values associated with the plurality of keys including the at least one multi-value key, the pressure weight corresponding to the measured pressure asserted on the unit corresponding to the input data value;
assigning a gaze weight to each of the input data values based on an amount of time that the area at which the user is gazing coincides with the unit corresponding to the input data value;
combining the pressure weight and the gaze weight of each input data value; and
entering the input data value based on the combined pressure and gaze weights.
18. A method for processing data entered at a keyboard having a plurality of data input keys, each data input key being associated with at least one input data value, at least one data input key being a multi-value key associated with at least two input data values, and at least one input data value of the at least two input data values of one or more of the multi-value keys corresponding to a symbol, the method comprising:
determining an area where a user is gazing by tracking the user's eye movement, wherein the area covers portions of at least two data input keys;
detecting a manual selection of a data input key; and
determining which input data value was entered based on the manually pressed data input key and on the area where the user is gazing.
19. The method of claim 18 further comprising defining a plurality of gaze areas, at least one gaze area overlapping at least two multi-value keys.
20. The method of claim 19 wherein each defined gaze area is associated with one input data value in each of a first and a second multi-value key.
21. The method of claim 20 wherein determining which input data value was entered includes identifying the input data value that is associated with the gaze area and that is associated with the manually pressed multi-value key.
22. A computer readable medium containing program instructions for entering data using a keyboard having a plurality of data input keys, wherein each data input key is associated with at least one input data value, at least one data input key is a multi-value key associated with at least two input data values, and at least one input data value of the at least two input data values of one or more of the multi-value keys corresponds to a symbol, the computer readable medium comprising program instructions for:
determining an area where a user is gazing by tracking the user's eye movement, wherein the area covers portions of at least two data input keys;
detecting a manual selection of a data input key; and
determining which input data value was entered based on the manually pressed data input key and on the area where the user is gazing.
23. A computer readable medium containing program instructions for entering data using a keyboard having at least one multi-value key associated with at least two input data values, the computer readable medium comprising program instructions for:
assigning a unit to each key;
assigning a plurality of units to each multi-value key, wherein each input data value corresponds to at least one unit, wherein at least one input data value of the at least two input data values of one or more of the at least one multi-value key corresponds to a symbol;
determining an area where a user is gazing by tracking the user's eye movement;
detecting a manual selection of one of a key and a multi-value key; and
determining probabilistically which input data value was entered based on a value received from at least one unit and on the area where the user is gazing.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/066,288 US20160187998A1 (en) | 2005-08-18 | 2016-03-10 | Systems And Methods For Processing Data Entered Using An Eye-Tracking System |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US11/206,596 US7719520B2 (en) | 2005-08-18 | 2005-08-18 | Systems and methods for processing data entered using an eye-tracking system |
| US12/748,642 US8576175B2 (en) | 2005-08-18 | 2010-03-29 | Systems and methods for processing data entered using an eye-tracking system |
| US14/062,414 US9285891B2 (en) | 2005-08-18 | 2013-10-24 | Systems and methods for processing data entered using an eye-tracking system |
| US15/066,288 US20160187998A1 (en) | 2005-08-18 | 2016-03-10 | Systems And Methods For Processing Data Entered Using An Eye-Tracking System |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/062,414 Continuation US9285891B2 (en) | 2005-08-18 | 2013-10-24 | Systems and methods for processing data entered using an eye-tracking system |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20160187998A1 true US20160187998A1 (en) | 2016-06-30 |
Family
ID=37766936
Family Applications (4)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US11/206,596 Expired - Fee Related US7719520B2 (en) | 2005-08-18 | 2005-08-18 | Systems and methods for processing data entered using an eye-tracking system |
| US12/748,642 Expired - Fee Related US8576175B2 (en) | 2005-08-18 | 2010-03-29 | Systems and methods for processing data entered using an eye-tracking system |
| US14/062,414 Expired - Fee Related US9285891B2 (en) | 2005-08-18 | 2013-10-24 | Systems and methods for processing data entered using an eye-tracking system |
| US15/066,288 Abandoned US20160187998A1 (en) | 2005-08-18 | 2016-03-10 | Systems And Methods For Processing Data Entered Using An Eye-Tracking System |
Family Applications Before (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US11/206,596 Expired - Fee Related US7719520B2 (en) | 2005-08-18 | 2005-08-18 | Systems and methods for processing data entered using an eye-tracking system |
| US12/748,642 Expired - Fee Related US8576175B2 (en) | 2005-08-18 | 2010-03-29 | Systems and methods for processing data entered using an eye-tracking system |
| US14/062,414 Expired - Fee Related US9285891B2 (en) | 2005-08-18 | 2013-10-24 | Systems and methods for processing data entered using an eye-tracking system |
Country Status (1)
| Country | Link |
|---|---|
| US (4) | US7719520B2 (en) |
Families Citing this family (43)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7881493B1 (en) | 2003-04-11 | 2011-02-01 | Eyetools, Inc. | Methods and apparatuses for use of eye interpretation information |
| US8437729B2 (en) * | 2005-05-10 | 2013-05-07 | Mobile Communication Technologies, Llc | Apparatus for and system for enabling a mobile communicator |
| US20070270122A1 (en) | 2005-05-10 | 2007-11-22 | Ewell Robert C Jr | Apparatus, system, and method for disabling a mobile communicator |
| US8385880B2 (en) * | 2005-05-10 | 2013-02-26 | Mobile Communication Technologies, Llc | Apparatus for and system for enabling a mobile communicator |
| US7953448B2 (en) * | 2006-05-31 | 2011-05-31 | Research In Motion Limited | Keyboard for mobile device |
| US7719520B2 (en) * | 2005-08-18 | 2010-05-18 | Scenera Technologies, Llc | Systems and methods for processing data entered using an eye-tracking system |
| US20070088714A1 (en) * | 2005-10-19 | 2007-04-19 | Edwards Gregory T | Methods and apparatuses for collection, processing, and utilization of viewing data |
| US7760910B2 (en) * | 2005-12-12 | 2010-07-20 | Eyetools, Inc. | Evaluation of visual stimuli using existing viewing data |
| US10437459B2 (en) | 2007-01-07 | 2019-10-08 | Apple Inc. | Multitouch data fusion |
| US20090104928A1 (en) * | 2007-10-22 | 2009-04-23 | Sony Ericsson Mobile Communications Ab | Portable electronic device and a method for entering data on such a device |
| JP5241278B2 (en) * | 2008-03-12 | 2013-07-17 | アルパイン株式会社 | Touch panel input device and process execution method |
| US20100007507A1 (en) * | 2008-07-14 | 2010-01-14 | Maurice William Thompson | Electronic eye-pointing assisted alternative and augmented communications device |
| WO2010071928A1 (en) * | 2008-12-22 | 2010-07-01 | Seeing Machines Limited | Automatic calibration of a gaze direction algorithm from user behaviour |
| US9039419B2 (en) * | 2009-11-06 | 2015-05-26 | International Business Machines Corporation | Method and system for controlling skill acquisition interfaces |
| US9507418B2 (en) * | 2010-01-21 | 2016-11-29 | Tobii Ab | Eye tracker based contextual action |
| US8384566B2 (en) * | 2010-05-19 | 2013-02-26 | Mckesson Financial Holdings | Pressure-sensitive keyboard and associated method of operation |
| US20120093358A1 (en) * | 2010-10-15 | 2012-04-19 | Visteon Global Technologies, Inc. | Control of rear-view and side-view mirrors and camera-coordinated displays via eye gaze |
| US20120119999A1 (en) * | 2010-11-11 | 2012-05-17 | Harris Scott C | Adaptive Keyboard for portable device |
| US10139900B2 (en) | 2011-04-12 | 2018-11-27 | Mobile Communication Technologies, Llc | Mobile communicator device including user attentiveness detector |
| US9026779B2 (en) | 2011-04-12 | 2015-05-05 | Mobile Communication Technologies, Llc | Mobile communicator device including user attentiveness detector |
| US9026780B2 (en) | 2011-04-12 | 2015-05-05 | Mobile Communication Technologies, Llc | Mobile communicator device including user attentiveness detector |
| US8995945B2 (en) | 2011-08-30 | 2015-03-31 | Mobile Communication Technologies, Llc | Mobile communicator and system |
| US20150084864A1 (en) * | 2012-01-09 | 2015-03-26 | Google Inc. | Input Method |
| CN102662473B (en) * | 2012-04-16 | 2016-08-24 | 维沃移动通信有限公司 | Device and method for realizing human-machine information interaction based on eye-movement recognition |
| JP2013225226A (en) * | 2012-04-23 | 2013-10-31 | Kyocera Corp | Information terminal, display control program and display control method |
| JP5783957B2 (en) * | 2012-06-22 | 2015-09-24 | 株式会社Nttドコモ | Display device, display method, and program |
| US9007308B2 (en) * | 2012-08-03 | 2015-04-14 | Google Inc. | Adaptive keyboard lighting |
| DE102012215407A1 (en) * | 2012-08-30 | 2014-05-28 | Bayerische Motoren Werke Aktiengesellschaft | Providing an input for a control |
| US9064168B2 (en) * | 2012-12-14 | 2015-06-23 | Hand Held Products, Inc. | Selective output of decoded message data |
| US9265458B2 (en) | 2012-12-04 | 2016-02-23 | Sync-Think, Inc. | Application of smooth pursuit cognitive testing paradigms to clinical drug development |
| CN103870164A (en) * | 2012-12-17 | 2014-06-18 | 联想(北京)有限公司 | Processing method and electronic device |
| US9380976B2 (en) | 2013-03-11 | 2016-07-05 | Sync-Think, Inc. | Optical neuroinformatics |
| US20150049023A1 (en) * | 2013-08-16 | 2015-02-19 | Omnivision Technologies, Inc. | Keyboard Camera Device |
| US9760696B2 (en) * | 2013-09-27 | 2017-09-12 | Excalibur Ip, Llc | Secure physical authentication input with personal display or sound device |
| JP2015090569A (en) * | 2013-11-06 | 2015-05-11 | ソニー株式会社 | Information processing device and information processing method |
| GB201322873D0 (en) * | 2013-12-23 | 2014-02-12 | Tobii Technology Ab | Eye gaze determination |
| WO2015127325A1 (en) * | 2014-02-21 | 2015-08-27 | Drnc Holdings, Inc. | Methods for facilitating entry of user input into computing devices |
| US9804753B2 (en) * | 2014-03-20 | 2017-10-31 | Microsoft Technology Licensing, Llc | Selection using eye gaze evaluation over time |
| JP6696119B2 (en) * | 2015-05-01 | 2020-05-20 | 富士通株式会社 | Conversion device, conversion method, and conversion program |
| US10713501B2 (en) | 2015-08-13 | 2020-07-14 | Ford Global Technologies, Llc | Focus system to enhance vehicle vision performance |
| US10275023B2 (en) * | 2016-05-05 | 2019-04-30 | Google Llc | Combining gaze input and touch surface input for user interfaces in augmented and/or virtual reality |
| US10223067B2 (en) * | 2016-07-15 | 2019-03-05 | Microsoft Technology Licensing, Llc | Leveraging environmental context for enhanced communication throughput |
| US10262157B2 (en) * | 2016-09-28 | 2019-04-16 | International Business Machines Corporation | Application recommendation based on permissions |
Family Cites Families (28)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| FR2680420B1 (en) * | 1991-08-14 | 1994-06-17 | Truchet Philippe | ELECTRONIC DEVICE FOR TRANSFORMING A STANDARD COMPUTER SYSTEM INTO A SO-CALLED KEYBOARDLESS COMPUTER IN ORDER TO SIMPLIFY ITS USE. |
| US5528235A (en) * | 1991-09-03 | 1996-06-18 | Edward D. Lin | Multi-status multi-function data processing key and key array |
| CA2126142A1 (en) * | 1994-06-17 | 1995-12-18 | David Alexander Kahn | Visual communications apparatus |
| US5818437A (en) | 1995-07-26 | 1998-10-06 | Tegic Communications, Inc. | Reduced keyboard disambiguating computer |
| US6127990A (en) * | 1995-11-28 | 2000-10-03 | Vega Vista, Inc. | Wearable display and methods for controlling same |
| US5912721A (en) | 1996-03-13 | 1999-06-15 | Kabushiki Kaisha Toshiba | Gaze detection apparatus and its method as well as information display apparatus |
| US5953541A (en) | 1997-01-24 | 1999-09-14 | Tegic Communications, Inc. | Disambiguating system for disambiguating ambiguous input sequences by displaying objects associated with the generated input sequences in the order of decreasing frequency of use |
| RU2214620C2 (en) | 1997-09-25 | 2003-10-20 | Теджик Коммьюникейшнз, Инк. | Reduced-keyboard system for ambiguity elimination |
| US6377685B1 (en) | 1999-04-23 | 2002-04-23 | Ravi C. Krishnan | Cluster key arrangement |
| US7286115B2 (en) * | 2000-05-26 | 2007-10-23 | Tegic Communications, Inc. | Directional input system with automatic correction |
| JP2001136492A (en) * | 1999-11-09 | 2001-05-18 | Fuji Photo Film Co Ltd | Image reproducing device |
| US20020126097A1 (en) * | 2001-03-07 | 2002-09-12 | Savolainen Sampo Jussi Pellervo | Alphanumeric data entry method and apparatus using reduced keyboard and context related dictionaries |
| US20020163504A1 (en) * | 2001-03-13 | 2002-11-07 | Pallakoff Matthew G. | Hand-held device that supports fast text typing |
| US6847706B2 (en) | 2001-03-20 | 2005-01-25 | Saied Bozorgui-Nesbat | Method and apparatus for alphanumeric data entry using a keypad |
| US7385591B2 (en) * | 2001-03-31 | 2008-06-10 | Microsoft Corporation | Out-of-vocabulary word determination and user interface for text input via reduced keypad keys |
| US6886137B2 (en) | 2001-05-29 | 2005-04-26 | International Business Machines Corporation | Eye gaze control of dynamic information presentation |
| ATE470897T1 (en) * | 2001-06-12 | 2010-06-15 | Research In Motion Ltd | PORTABLE ELECTRONIC DEVICE WITH KEYBOARD |
| US6765556B2 (en) * | 2001-11-16 | 2004-07-20 | International Business Machines Corporation | Two-key input per character text entry apparatus and method |
| US7014099B2 (en) * | 2001-12-31 | 2006-03-21 | Hewlett-Packard Development Company, L.P. | Data entry device |
| GB2396001B (en) * | 2002-10-09 | 2005-10-26 | Canon Kk | Gaze tracking system |
| US7084858B2 (en) * | 2002-11-19 | 2006-08-01 | Microsoft Corporation | System and method for inputting characters using a directional pad |
| US7306337B2 (en) * | 2003-03-06 | 2007-12-11 | Rensselaer Polytechnic Institute | Calibration-free gaze tracking under natural head movement |
| US20040186729A1 (en) * | 2003-03-11 | 2004-09-23 | Samsung Electronics Co., Ltd. | Apparatus for and method of inputting Korean vowels |
| CN1221884C (en) | 2003-03-27 | 2005-10-05 | 刘二中 | Device possessing character input function and method |
| US20050047629A1 (en) * | 2003-08-25 | 2005-03-03 | International Business Machines Corporation | System and method for selectively expanding or contracting a portion of a display using eye-gaze tracking |
| US7555732B2 (en) * | 2004-03-12 | 2009-06-30 | Steven Van der Hoeven | Apparatus method and system for a data entry interface |
| US7719520B2 (en) * | 2005-08-18 | 2010-05-18 | Scenera Technologies, Llc | Systems and methods for processing data entered using an eye-tracking system |
| US7760910B2 (en) * | 2005-12-12 | 2010-07-20 | Eyetools, Inc. | Evaluation of visual stimuli using existing viewing data |
- 2005-08-18: US11/206,596 filed; issued as US7719520B2 (status: Expired - Fee Related)
- 2010-03-29: US12/748,642 filed; issued as US8576175B2 (status: Expired - Fee Related)
- 2013-10-24: US14/062,414 filed; issued as US9285891B2 (status: Expired - Fee Related)
- 2016-03-10: US15/066,288 filed; published as US20160187998A1 (status: Abandoned)
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN110059666A (en) * | 2019-04-29 | 2019-07-26 | 北京市商汤科技开发有限公司 | Attention detection method and device |
| US11327651B2 (en) * | 2020-02-12 | 2022-05-10 | Facebook Technologies, Llc | Virtual keyboard based on adaptive language model |
| US20220261150A1 (en) * | 2020-02-12 | 2022-08-18 | Facebook Technologies, Llc | Virtual keyboard based on adaptive language model |
| US11899928B2 (en) * | 2020-02-12 | 2024-02-13 | Meta Platforms Technologies, Llc | Virtual keyboard based on adaptive language model |
Also Published As
| Publication number | Publication date |
|---|---|
| US20140049474A1 (en) | 2014-02-20 |
| US8576175B2 (en) | 2013-11-05 |
| US20100182243A1 (en) | 2010-07-22 |
| US7719520B2 (en) | 2010-05-18 |
| US20070040799A1 (en) | 2007-02-22 |
| US9285891B2 (en) | 2016-03-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9285891B2 (en) | | Systems and methods for processing data entered using an eye-tracking system |
| US10642933B2 (en) | | Method and apparatus for word prediction selection |
| US8390583B2 (en) | | Pressure sensitive user interface for mobile devices |
| US7508324B2 (en) | | Finger activated reduced keyboard and a method for performing text input |
| KR101695174B1 (en) | | Ergonomic motion detection for receiving character input to electronic devices |
| US10747334B2 (en) | | Reduced keyboard disambiguating system and method thereof |
| CN106445369A (en) | | Input method and device |
| JP5177158B2 (en) | | Input device, input button display method, and input button display program |
| US20100110002A1 (en) | | Communication device with combined input and display device |
| US20140215397A1 (en) | | Apparatus and Method Pertaining to Predicted-Text Derivatives |
| Dearman et al. | | Multi-modal text entry and selection on a mobile device |
| CA2846561C (en) | | Method and apparatus for word prediction selection |
| US9250728B2 (en) | | Apparatus and method pertaining to predicted-text entry |
| JPH0954646A (en) | | Virtual keyboard device and key input control method |
| Kim et al. | | Hybrid evaluation method for Korean Character input system in mobile phones |
| CA2840803C (en) | | Apparatus and method pertaining to predicted-text entry |
| EP2759912B1 (en) | | Apparatus and method pertaining to predicted-text entry |
| JP2002091671A (en) | | Method for recognizing depressed key |
| JP2000155641A (en) | | Keyboard controller and its control method |
| EP2759911A1 (en) | | Apparatus and method pertaining to predicted-text derivatives |
| US20130293475A1 (en) | | Typing efficiency enhancement system and method |
| CA2840817A1 (en) | | Apparatus and method pertaining to predicted-text derivatives |
| CA2793436A1 (en) | | Method of facilitating input at an electronic device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: SCENERA TECHNOLOGIES, LLC, NEW HAMPSHIRE. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: IPAC ACQUISITION SUBSIDIARY I, LLC; REEL/FRAME: 038599/0841; effective date: 20061102. Owner name: IPAC ACQUISITION SUBSIDIARY I, LLC, NEW HAMPSHIRE. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: SINGH, MONA; THOMAS, THEODOSIOS; SIGNING DATES FROM 20050817 TO 20050818; REEL/FRAME: 038599/0683 |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |