
Accuracy of gaze point estimation in immersive 3D interaction interface based on eye tracking

Antonya, 2012

Document ID: 15259690169381223163
Author: Antonya C
Publication year: 2012
Publication venue: 2012 12th International Conference on Control Automation Robotics & Vision (ICARCV)

Snippet

The gaze point and gaze line, measured with an eye tracking device, can be used in various interaction interfaces, such as mobile robot programming in an immersive virtual environment. Path generation for the robot should be done without any tedious eye gestures, but rather it …
Continue reading at ieeexplore.ieee.org
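
As an illustration only (not taken from the paper, whose full text is linked above): in immersive 3D setups, a gaze point is commonly estimated by intersecting the tracked gaze ray with scene geometry, and estimation accuracy is commonly reported as angular error. A minimal Python sketch under those assumptions follows; all function and variable names are hypothetical.

    # Hypothetical sketch: estimate a 3D gaze point as the intersection of a
    # tracked gaze ray with a scene plane, and score accuracy as angular error.
    import numpy as np

    def gaze_point_on_plane(eye_pos, gaze_dir, plane_point, plane_normal):
        """Intersect the gaze ray (eye_pos + t * gaze_dir) with a plane."""
        gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
        denom = np.dot(plane_normal, gaze_dir)
        if abs(denom) < 1e-9:
            return None  # gaze ray is parallel to the plane
        t = np.dot(plane_normal, plane_point - eye_pos) / denom
        if t < 0:
            return None  # plane lies behind the viewer
        return eye_pos + t * gaze_dir

    def angular_error_deg(eye_pos, estimated_point, true_point):
        """Accuracy as the angle between estimated and true gaze directions."""
        v1 = estimated_point - eye_pos
        v2 = true_point - eye_pos
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

    # Example: viewer 2 m from a virtual wall, measured gaze slightly off-target.
    eye = np.array([0.0, 1.6, 0.0])
    target = np.array([0.3, 1.4, 2.0])
    measured_dir = (target - eye) + np.array([0.01, 0.0, 0.0])  # simulated noise
    hit = gaze_point_on_plane(eye, measured_dir,
                              np.array([0.0, 0.0, 2.0]), np.array([0.0, 0.0, -1.0]))
    if hit is not None:
        print(f"estimated gaze point: {hit}, "
              f"angular error: {angular_error_deg(eye, hit, target):.2f} deg")

In practice the plane would be replaced by ray casting against the virtual scene's geometry, but the accuracy metric stays the same.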

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with three-dimensional environments, e.g. control of viewpoint to navigate in the environment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterized by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
    • G09B23/285 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for injections, endoscopy, bronchoscopy, sigmoidscopy, insertion of contraceptive devices or enemas
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048

Similar Documents

Lemaignan et al. From real-time attention assessment to “with-me-ness” in human-robot interaction
CN110167421B (en) System for integrally measuring clinical parameters of visual function
Mobini et al. Accuracy of Kinect’s skeleton tracking for upper body rehabilitation applications
Held et al. Telepresence, time delay, and adaptation
Kasahara et al. JackIn: integrating first-person view with out-of-body vision generation for human-human augmentation
Berton et al. Eye-gaze activity in crowds: impact of virtual reality and density
Essig et al. A neural network for 3D gaze recording with binocular eye trackers
CN104838337A (en) Touchless input for a user interface
Lin et al. Eye movement parameters for performance evaluation in projection-based stereoscopic display
Essig et al. Automatic analysis of 3D gaze coordinates on scene objects using data from eye-tracking and motion-capture systems
Placidi et al. Data integration by two-sensors in a LEAP-based Virtual Glove for human-system interaction
Di Vincenzo et al. A natural human-drone embodied interface: Empirical comparison with a traditional interface
Li et al. A flexible technique to select objects via convolutional neural network in VR space
Antonya Accuracy of gaze point estimation in immersive 3D interaction interface based on eye tracking
La Scaleia et al. Visuomotor interactions and perceptual judgments in virtual reality simulating different levels of gravity
Kreiensieck et al. A comprehensive evaluation of OpenFace 2.0 gaze tracking
Invitto et al. Interactive entertainment, virtual motion training and brain ergonomy
Leo et al. Mental rotation skill shapes haptic exploration strategies
Lukashova-Sanz et al. Context matters during pick-and-place in VR: Impact on search and transport phases
Bao et al. Real-time wide-view eye tracking based on resolving the spatial depth
Angrisani et al. A method for the metrological characterization of eye-and head-tracking interfaces for human–machine interaction through eXtended Reality head-mounted displays
Kang et al. Putting vision and touch into conflict: results from a multimodal mixed reality setup
Raees et al. THE-3DI: Tracing head and eyes for 3D interactions: An interaction technique for virtual environments
Wu et al. Hybrid Swin Transformer-Based Classification of Gaze Target Regions
Antonya et al. Path generation in virtual reality environment based on gaze analysis