

An object tracking and visual servoing system for the visually impaired

Jacques et al., 2005

Document ID: 10327106808450306475
Authors: Jacques D, Rodrigo R, McIsaac K, Samarabandu J
Publication year: 2005
Publication venue: Proceedings of the 2005 IEEE International Conference on Robotics and Automation

Snippet

In this work, we have taken the first step towards the creation of a computerized seeing-eye guide dog. The system we presented extends the development of assistive technology for the visually impaired into a new area: object tracking and visual servoing. The system uses …
Continue reading at ieeexplore.ieee.org
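
The snippet names object tracking and visual servoing as the system's core techniques. For orientation only, the sketch below shows a generic image-based visual servoing (IBVS) step: a proportional control law on the image-plane error of one tracked point, using the standard point-feature interaction matrix. It is not the controller described in Jacques et al. (2005); the gain, target depth, and feature coordinates are hypothetical placeholders.

    # Minimal IBVS sketch (illustrative only; not the controller from the paper).
    import numpy as np

    def interaction_matrix(x, y, Z):
        """2x6 interaction (image Jacobian) matrix for a point feature at
        normalized image coordinates (x, y) observed at depth Z."""
        return np.array([
            [-1.0 / Z, 0.0,       x / Z, x * y,        -(1.0 + x * x),  y],
            [0.0,      -1.0 / Z,  y / Z, 1.0 + y * y,  -x * y,         -x],
        ])

    def ibvs_step(s, s_star, Z, gain=0.5):
        """One control step: camera twist v = -gain * pinv(L) @ (s - s_star)."""
        error = s - s_star
        L = interaction_matrix(s[0], s[1], Z)
        return -gain * np.linalg.pinv(L) @ error  # (vx, vy, vz, wx, wy, wz)

    if __name__ == "__main__":
        s = np.array([0.20, -0.10])      # tracked feature, normalized image coords (hypothetical)
        s_star = np.array([0.0, 0.0])    # desired location: image centre
        v = ibvs_step(s, s_star, Z=2.0)  # assumed target depth of 2 m
        print("commanded camera twist:", np.round(v, 4))

In a full servoing loop, a tracker would refresh s every frame and the commanded twist v would drive the camera mount (or, in an assistive setting, be converted into guidance cues); that loop structure is an assumption here, not taken from the paper.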

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012: Head tracking input arrangements
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06K: RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00: Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00221: Acquiring or recognising human faces, facial parts, facial sketches, facial expressions
    • G06K9/00268: Feature extraction; Face representation
    • G06K9/00281: Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G06K9/00335: Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
    • G06K9/00355: Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00: Program-control systems
    • G05B2219/30: Nc systems
    • G05B2219/40: Robotics, robotics mapping to robotics vision

Similar Documents

Waldherr et al. A gesture based interface for human-robot interaction
Pavlovic et al. Visual interpretation of hand gestures for human-computer interaction: A review
Van den Bergh et al. Real-time 3D hand gesture interaction with a robot for understanding directions from humans
Triesch et al. A gesture interface for human-robot-interaction
Matsumoto et al. Behavior recognition based on head pose and gaze direction measurement
Stiefelhagen et al. Gaze tracking for multimodal human-computer interaction
JP5001930B2 (en) Motion recognition apparatus and method
Lenz et al. Human workflow analysis using 3d occupancy grid hand tracking in a human-robot collaboration scenario
KR20220026186A (en) A Mixed Reality Telepresence System for Dissimilar Spaces Using Full-Body Avatar
Chen et al. A human–robot interface for mobile manipulator
Busam et al. A stereo vision approach for cooperative robotic movement therapy
Mühlbauer et al. A probabilistic approach to multi-modal adaptive virtual fixtures
JP7565848B2 (en) Robot remote operation control device, robot remote operation control system, robot remote operation control method, and program
Boulic et al. Evaluation of on-line analytic and numeric inverse kinematics approaches driven by partial vision input
Jacques et al. An object tracking and visual servoing system for the visually impaired
Perez et al. Robotic wheelchair controlled through a vision-based interface
Bergasa et al. Guidance of a wheelchair for handicapped people by face tracking
JP2021094085A (en) Control system and control program
Martin et al. Estimation of pointing poses for visually instructing mobile robots under real world conditions
Nölker et al. Illumination independent recognition of deictic arm postures
Mihara et al. A real‐time vision‐based interface using Motion Processor and applications to robotics
Hwang et al. Performance-based animation using constraints for virtual object manipulation
Do Hyoung Kim et al. A human-robot interface using eye-gaze tracking system for people with motor disabilities
Rett et al. Visual based human motion analysis: Mapping gestures using a puppet model
Haritaoglu et al. Attentive Toys.