Antonya, 2012 - Google Patents
Accuracy of gaze point estimation in immersive 3D interaction interface based on eye tracking
- Document ID
- 15259690169381223163
- Author
- Antonya C
- Publication year
- 2012
- Publication venue
- 2012 12th International Conference on Control Automation Robotics & Vision (ICARCV)
Snippet
The gaze point and gaze line, measured with an eye tracking device, can be used in various interaction interfaces, like mobile robot programming in immersive virtual environment. Path generation of the robot should be made without any tedious eye gestures, but rather it …
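The snippet describes feeding the measured gaze point and gaze line into a 3D interaction interface, but does not show how the gaze point itself is computed. A common approach with binocular eye trackers, sketched below purely as an illustration (the paper's actual estimation method is not reproduced here, and all names in the code are hypothetical), is to triangulate the two eyes' gaze rays and take the midpoint of the shortest segment between them as the 3D gaze point.

```python
import numpy as np

def gaze_point_3d(p_left, d_left, p_right, d_right, eps=1e-9):
    """Estimate a 3D gaze point as the midpoint of the shortest
    segment between the two eyes' gaze rays (binocular vergence).

    p_* : 3D eye positions; d_* : gaze direction vectors
    (need not be unit length). Returns the estimated gaze point,
    or None when the rays are (near-)parallel.
    """
    p1, d1 = np.asarray(p_left, float), np.asarray(d_left, float)
    p2, d2 = np.asarray(p_right, float), np.asarray(d_right, float)

    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b          # ~0 when the rays are parallel
    if abs(denom) < eps:
        return None                # gaze at infinity: no vergence point

    t = (b * e - c * d) / denom    # parameter along the left-eye ray
    s = (a * e - b * d) / denom    # parameter along the right-eye ray
    q1, q2 = p1 + t * d1, p2 + s * d2
    return (q1 + q2) / 2.0         # midpoint of the shortest segment


# Example: eyes 6.5 cm apart, both converging on a point ~1 m ahead.
left_eye, right_eye = np.array([-0.0325, 0, 0]), np.array([0.0325, 0, 0])
target = np.array([0.0, 0.1, 1.0])
print(gaze_point_3d(left_eye, target - left_eye,
                    right_eye, target - right_eye))  # ~ [0.0, 0.1, 1.0]
```

In a head-tracked immersive setup, the eye positions and gaze directions would first have to be transformed from tracker coordinates into the scene's world frame. Note that the rays become nearly parallel when the user fixates distant objects, making the depth estimate ill-conditioned; this is one source of the gaze-point error that accuracy studies like this one quantify.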
Concepts (machine-extracted)
| ID | Concept | Sections | Count |
|---|---|---|---|
| 230000003993 | interaction | title, abstract, description | 9 |
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
            - G06F3/012—Head tracking input arrangements
          - G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
          - G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
            - G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
            - G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterized by the transducing means
          - G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
            - G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
              - G06F3/04815—Interaction with three-dimensional environments, e.g. control of viewpoint to navigate in the environment
      - G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
    - G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
      - G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T7/00—Image analysis
  - G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    - G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
      - G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
        - G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
          - G09B23/285—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for injections, endoscopy, bronchoscopy, sigmoidoscopy, insertion of contraceptive devices or enemas
Similar Documents
| Publication | Title |
|---|---|
| Lemaignan et al. | From real-time attention assessment to “with-me-ness” in human-robot interaction |
| CN110167421B (en) | System for integrally measuring clinical parameters of visual function |
| Mobini et al. | Accuracy of Kinect’s skeleton tracking for upper body rehabilitation applications |
| Held et al. | Telepresence, time delay, and adaptation |
| Kasahara et al. | JackIn: integrating first-person view with out-of-body vision generation for human-human augmentation |
| Berton et al. | Eye-gaze activity in crowds: impact of virtual reality and density |
| Essig et al. | A neural network for 3D gaze recording with binocular eye trackers |
| CN104838337A (en) | Touchless input for a user interface |
| Lin et al. | Eye movement parameters for performance evaluation in projection-based stereoscopic display |
| Essig et al. | Automatic analysis of 3D gaze coordinates on scene objects using data from eye-tracking and motion-capture systems |
| Placidi et al. | Data integration by two-sensors in a LEAP-based Virtual Glove for human-system interaction |
| Di Vincenzo et al. | A natural human-drone embodied interface: Empirical comparison with a traditional interface |
| Li et al. | A flexible technique to select objects via convolutional neural network in VR space |
| Antonya | Accuracy of gaze point estimation in immersive 3D interaction interface based on eye tracking |
| La Scaleia et al. | Visuomotor interactions and perceptual judgments in virtual reality simulating different levels of gravity |
| Kreiensieck et al. | A comprehensive evaluation of OpenFace 2.0 gaze tracking |
| Invitto et al. | Interactive entertainment, virtual motion training and brain ergonomy |
| Leo et al. | Mental rotation skill shapes haptic exploration strategies |
| Lukashova-Sanz et al. | Context matters during pick-and-place in VR: Impact on search and transport phases |
| Bao et al. | Real-time wide-view eye tracking based on resolving the spatial depth |
| Angrisani et al. | A method for the metrological characterization of eye- and head-tracking interfaces for human–machine interaction through eXtended Reality head-mounted displays |
| Kang et al. | Putting vision and touch into conflict: results from a multimodal mixed reality setup |
| Raees et al. | THE-3DI: Tracing head and eyes for 3D interactions: An interaction technique for virtual environments |
| Wu et al. | Hybrid Swin Transformer-Based Classification of Gaze Target Regions |
| Antonya et al. | Path generation in virtual reality environment based on gaze analysis |