
Iossifidis et al., 2003 - Google Patents

Anthropomorphism as a pervasive design concept for a robotic assistant


Document ID
10892760043037210075
Author
Iossifidis I
Theis C
Grote C
Faubel C
Schöner G
Publication year
2003
Publication venue
Proceedings 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003) (Cat. No. 03CH37453)

Snippet

CORA is a robotic assistant whose task is to collaborate with a human operator on simple manipulation or handling tasks. Its sensory channels comprising vision, audition, haptics, and force sensing are used to extract perceptual information about speech, gestures and …

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40053 Pick 3-D object from pile of objects
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/36 Nc in input of data, input key till input tape
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/50 Machine tool, machine tool null till machine tool work handling
    • G05B2219/50109 Soft approach, engage, retract, escape, withdraw path for tool to workpiece
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric

Similar Documents

Du et al. Markerless human–robot interface for dual robot manipulators using Kinect sensor
US9701015B2 (en) Vision-guided robots and methods of training them
US7353082B2 (en) Method and a system for programming an industrial robot
Petersson et al. Systems integration for real-world manipulation tasks
Nguyen et al. Merging physical and social interaction for effective human-robot collaboration
Bischoff et al. Integrating vision, touch and natural language in the control of a situation-oriented behavior-based humanoid robot
Iossifidis et al. Anthropomorphism as a pervasive design concept for a robotic assistant
Chen et al. A human–robot interface for mobile manipulator
Mühlbauer et al. A probabilistic approach to multi-modal adaptive virtual fixtures
Infantino et al. A cognitive architecture for robotic hand posture learning
Sumukha et al. Gesture controlled 6 DoF manipulator with custom gripper for pick and place operation using ROS2 framework
Infantino et al. Visual control of a robotic hand
Theis et al. Image processing methods for interactive robot control
Iossifidis et al. A cooperative robotic assistant for human environments
Chen et al. Gesture-speech based HMI for a rehabilitation robot
Inoue Vision Based Robot Behavior: Tools and Testbeds for Real-World AI Research
Faubel et al. A Cooperative Robotic Assistant for Human
Memmesheimer et al. Markerless visual robot programming by demonstration
Dillmann et al. Programming by demonstration: A machine learning approach to support skill acquisition for robots
Ferraz et al. Robotic actuator control through eye-tracking camera to assist people with physical limitations
Memmesheimer et al. Robotic imitation by markerless visual observation and semantic associations
Hwang et al. Real-time grasp planning based on motion field graph for human-robot cooperation
Wu et al. A VR-Based Robotic Teleoperation System With Haptic Feedback and Adaptive Collision Avoidance
Saini et al. The b-it-bots@ Home 2024 Team Description Paper
Ishihata et al. Teaching Method for Robot’s Gripper Posture with a Laser Sensor on a Pan-tilt Actuator: A Method for Specifying Posture Feature Curves and Posture Feature Point