Iossifidis et al., 2003 - Google Patents
Anthropomorphism as a pervasive design concept for a robotic assistant
- Document ID
- 10892760043037210075
- Author
- Iossifidis I
- Theis C
- Grote C
- Faubel C
- Schoner G
- Publication year
- 2003
- Publication venue
- Proceedings 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003)(Cat. No. 03CH37453)
Snippet
CORA is a robotic assistant whose task is to collaborate with a human operator on simple manipulation or handling tasks. Its sensory channels comprising vision, audition, haptics, and force sensing are used to extract perceptual information about speech, gestures and …
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40053—Pick 3-D object from pile of objects
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/36—Nc in input of data, input key till input tape
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/50—Machine tool, machine tool null till machine tool work handling
- G05B2219/50109—Soft approach, engage, retract, escape, withdraw path for tool to workpiece
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
Similar Documents
| Publication | Title | |
|---|---|---|
| Du et al. | Markerless human–robot interface for dual robot manipulators using Kinect sensor | |
| US9701015B2 (en) | Vision-guided robots and methods of training them | |
| US7353082B2 (en) | Method and a system for programming an industrial robot | |
| Petersson et al. | Systems integration for real-world manipulation tasks | |
| Nguyen et al. | Merging physical and social interaction for effective human-robot collaboration | |
| Bischoff et al. | Integrating vision, touch and natural language in the control of a situation-oriented behavior-based humanoid robot | |
| Iossifidis et al. | Anthropomorphism as a pervasive design concept for a robotic assistant | |
| Chen et al. | A human–robot interface for mobile manipulator | |
| Mühlbauer et al. | A probabilistic approach to multi-modal adaptive virtual fixtures | |
| Infantino et al. | A cognitive architecture for robotic hand posture learning | |
| Sumukha et al. | Gesture controlled 6 DoF manipulator with custom gripper for pick and place operation using ROS2 framework | |
| Infantino et al. | Visual control of a robotic hand | |
| Theis et al. | Image processing methods for interactive robot control | |
| Iossifidis et al. | A cooperative robotic assistant for human environments | |
| Chen et al. | Gesture-speech based HMI for a rehabilitation robot | |
| Inoue | Vision Based Robot Behavior: Tools and Testbeds for Real-World AI Research | |
| Faubel et al. | A Cooperative Robotic Assistant for Human | |
| Memmesheimer et al. | Markerless visual robot programming by demonstration | |
| Dillmann et al. | Programming by demonstration: A machine learning approach to support skill acquisition for robots | |
| Ferraz et al. | Robotic actuator control through eye-tracking camera to assist people with physical limitations | |
| Memmesheimer et al. | Robotic imitation by markerless visual observation and semantic associations | |
| Hwang et al. | Real-time grasp planning based on motion field graph for human-robot cooperation | |
| Wu et al. | A VR-Based Robotic Teleoperation System With Haptic Feedback and Adaptive Collision Avoidance | |
| Saini et al. | The b-it-bots@Home 2024 Team Description Paper | |
| Ishihata et al. | Teaching Method for Robot’s Gripper Posture with a Laser Sensor on a Pan-tilt Actuator: A Method for Specifying Posture Feature Curves and Posture Feature Point |