Došen et al., 2011 - Google Patents
Transradial prosthesis: artificial vision for control of prehension (Došen et al., 2011)
- Document ID
- 9035540588483114628
- Authors
- Došen S
- Popović D
- Publication year
- 2011
- Publication venue
- Artificial organs
Snippet
We present a practical system for controlling the prehension of a transradial prosthesis. The system is mounted on the artificial hand and comprises simple hardware and software that are convenient for real‐time implementation. The hardware consists of a standard web …
Concepts
- Wrist (abstract; description, 22 occurrences)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
Similar Documents
| Publication | Title |
|---|---|
| Došen et al. | Transradial prosthesis: artificial vision for control of prehension |
| Quivira et al. | Translating sEMG signals to continuous hand poses using recurrent neural networks |
| Antuvan et al. | Embedded human control of robots using myoelectric interfaces |
| Cerulo et al. | Teleoperation of the SCHUNK S5FH under-actuated anthropomorphic hand using human hand motion tracking |
| Li et al. | A dexterous hand-arm teleoperation system based on hand pose estimation and active vision |
| WO2019147928A1 (en) | Handstate reconstruction based on multiple inputs |
| Zandigohar et al. | Multimodal fusion of EMG and vision for human grasp intent inference in prosthetic hand control |
| Gallina et al. | Progressive co-adaptation in human-machine interaction |
| Tacca et al. | Wearable high-density EMG sleeve for complex hand gesture classification and continuous joint angle estimation |
| Figueredo et al. | Planning to minimize the human muscular effort during forceful human-robot collaboration |
| Ramadoss et al. | Computer Vision for Human‐Computer Interaction Using Noninvasive Technology |
| Pasarica et al. | Remote control of a robotic platform based on hand gesture recognition |
| Mastinu et al. | Explorations of autonomous prosthetic grasping via proximity vision and deep learning |
| Skoglund et al. | Programming-by-Demonstration of reaching motions—A next-state-planner approach |
| Chen et al. | Hand tracking accuracy enhancement by data fusion using leap motion and myo armband |
| Phillips et al. | Endpoint control for a powered shoulder prosthesis |
| Olikkal et al. | Learning hand gestures using synergies in a humanoid robot |
| Heiwolt et al. | Automatic detection of myocontrol failures based upon situational context information |
| Rouse | A four-dimensional virtual hand brain–machine interface using active dimension selection |
| Li et al. | Unsupervised Recurrent Neural Network with Parametric Bias Framework for Human Emotion Recognition with Multimodal Sensor Data Fusion |
| Zhu et al. | Learning torsional eye movements through active efficient coding |
| James et al. | Realtime hand landmark tracking to aid development of a prosthetic arm for reach and grasp motions |
| Dwivedi | Analysis, development, and evaluation of muscle machine interfaces for the intuitive control of robotic devices |
| Gentili et al. | A cortically-inspired model for inverse kinematics computation of a humanoid finger with mechanically coupled joints |
| Bernat Iborra | Mimicking hand motion using sEMG-based techniques for controlling a prosthesis in a natural and intuitive way |