Coutinho et al., 2013 - Google Patents
Improving head movement tolerance of cross-ratio based eye trackers (Coutinho et al., 2013)
- Document ID
- 9383850503930953425
- Authors
- Coutinho F
- Morimoto C
- Publication year
- 2013
- Publication venue
- International Journal of Computer Vision
Snippet
When first introduced, the cross-ratio (CR) based remote eye tracking method offered many attractive features for natural human gaze-based interaction, such as simple camera setup, no user calibration, and invariance to head motion. However, due to many simplification …
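The cross-ratio (CR) method referred to in the snippet estimates the point of gaze from the corneal reflections (glints) of IR lights placed at the screen corners, relying on the invariance of cross-ratios under projective transformations. The sketch below is a minimal illustration of the commonly used homography formulation of this idea, not the authors' implementation; the function names and all point values are hypothetical.

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography H mapping src -> dst from 4+ point pairs via DLT."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography vector is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def cr_point_of_gaze(glints, pupil_center, screen_corners):
    """Map the pupil-center image point to screen coordinates through the
    glint-to-screen homography (the projective approximation behind the CR method)."""
    H = homography_from_points(glints, screen_corners)
    p = H @ np.array([pupil_center[0], pupil_center[1], 1.0])
    return p[:2] / p[2]

# Hypothetical image measurements (pixels): the four glints produced by the
# corner LEDs, in the same order as the screen corners, plus the pupil center.
glints = [(310.2, 240.5), (342.8, 241.1), (341.9, 266.4), (309.7, 265.8)]
pupil = (326.0, 254.0)
screen = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]  # screen corners in px

print(cr_point_of_gaze(glints, pupil, screen))
```

As the truncated sentence in the snippet suggests, the simplifications behind this mapping (treating the glints and pupil as coplanar and the whole transformation as a single projective map) introduce estimation error; the paper's focus, per its title, is keeping CR-based tracking accurate under head movement.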
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
            - G06F3/013—Eye tracking input arrangements
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
            - G06F3/0304—Detection arrangements using opto-electronic means
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
            - G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterized by the transducing means
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
      - G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
        - G06K9/00597—Acquiring or recognising eyes, e.g. iris verification
          - G06K9/00604—Acquisition
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T7/00—Image analysis
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T2207/00—Indexing scheme for image analysis or image enhancement
- A—HUMAN NECESSITIES
  - A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
    - A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
      - A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
        - A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
          - A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
- G—PHYSICS
  - G02—OPTICS
    - G02B—OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
      - G02B27/00—Other optical systems; Other optical apparatus
        - G02B27/01—Head-up displays
          - G02B27/0179—Display position adjusting means not related to the information to be displayed
- G—PHYSICS
  - G02—OPTICS
    - G02B—OPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
      - G02B27/00—Other optical systems; Other optical apparatus
        - G02B27/01—Head-up displays
          - G02B27/0101—Head-up displays characterised by optical features
Similar Documents
Publication | Title |
---|---|
Coutinho et al. | Improving head movement tolerance of cross-ratio based eye trackers |
Lai et al. | Hybrid method for 3-D gaze tracking using glint and contour features |
US6659611B2 (en) | System and method for eye gaze tracking using corneal image mapping |
Wang et al. | 3D gaze estimation without explicit personal calibration |
Shih et al. | A novel approach to 3-D gaze tracking using stereo cameras |
Nitschke et al. | Corneal imaging revisited: An overview of corneal reflection analysis and applications |
Nitschke et al. | Display-camera calibration using eye reflections and geometry constraints |
WO2023011339A1 (en) | Line-of-sight direction tracking method and apparatus |
Mestre et al. | Robust eye tracking based on multiple corneal reflections for clinical applications |
Cho et al. | Long range eye gaze tracking system for a large screen |
EP2306891A1 (en) | Eye gaze tracking |
Arar et al. | A regression-based user calibration framework for real-time gaze estimation |
Cheng et al. | Gazing point dependent eye gaze estimation |
US11181978B2 (en) | System and method for gaze estimation |
CN102125422A (en) | Pupil center-corneal reflection (PCCR) based sight line evaluation method in sight line tracking system |
Takemura et al. | Estimation of a focused object using a corneal surface image for eye-based interaction |
Toivanen et al. | Probabilistic approach to robust wearable gaze tracking |
Liu et al. | Iris feature-based 3-D gaze estimation method using a one-camera-one-light-source system |
JP2018099174A (en) | Pupil detector and pupil detection method |
Wen et al. | Accurate real-time 3D gaze tracking using a lightweight eyeball calibration |
Coutinho et al. | Augmenting the robustness of cross-ratio gaze tracking methods to head movement |
Lu et al. | Neural 3D gaze: 3D pupil localization and gaze tracking based on anatomical eye model and neural refraction correction |
Morimoto et al. | Screen-light decomposition framework for point-of-gaze estimation using a single uncalibrated camera and multiple light sources |
Li et al. | An efficient method for eye tracking and eye-gazed FOV estimation |
Nitschke et al. | I see what you see: point of gaze estimation from corneal images |