Yam-Viramontes et al., 2022 - Google Patents
Commanding a drone through body poses, improving the user experience
- Document ID
- 9465034459842081303
- Authors
- Yam-Viramontes B
- Cardona-Reyes H
- González-Trejo J
- Trujillo-Espinoza C
- Mercado-Ravell D
- Publication year
- 2022
- Publication venue
- Journal on Multimodal User Interfaces
Snippet
In this work, we propose the use of Multimodal Human-Computer Interfaces (MHCI) through body poses to command a drone in an easy and intuitive way. First, the human-user pose is recovered from a video stream with the help of the open-source library OpenPose. Then, a …
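The snippet outlines a two-stage pipeline: body pose is first estimated from a video stream and then mapped to drone commands. As a rough illustration only, and not the authors' implementation, the sketch below uses MediaPipe Pose as a stand-in for OpenPose and maps a simple raised-arm rule to hypothetical command strings; the library choice, the pose rule, and the command names are all assumptions.

```python
# Illustrative sketch: pose-to-command mapping in the spirit of the snippet.
# MediaPipe Pose is used here as a stand-in for OpenPose; the pose rules and
# command names are hypothetical, not taken from the paper.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def pose_to_command(landmarks) -> str:
    """Map body-pose landmarks to a drone command (toy rule)."""
    # MediaPipe returns normalized image coordinates; smaller y is higher in the frame.
    left_wrist = landmarks[mp_pose.PoseLandmark.LEFT_WRIST]
    right_wrist = landmarks[mp_pose.PoseLandmark.RIGHT_WRIST]
    nose = landmarks[mp_pose.PoseLandmark.NOSE]
    if left_wrist.y < nose.y and right_wrist.y < nose.y:
        return "TAKEOFF"      # both hands raised above the head
    if left_wrist.y < nose.y:
        return "MOVE_LEFT"    # only the left hand raised
    if right_wrist.y < nose.y:
        return "MOVE_RIGHT"   # only the right hand raised
    return "HOVER"            # default: hold position

cap = cv2.VideoCapture(0)  # video stream (webcam here; could be the drone camera)
with mp_pose.Pose(min_detection_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            command = pose_to_command(results.pose_landmarks.landmark)
            print(command)    # a real system would forward this to the drone's control API
cap.release()
```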
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N99/00—Subject matter not provided for in other groups of this subclass
- G06N99/005—Learning machines, i.e. computer in which a programme is changed according to experience gained by the machine itself during a complete run
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F19/00—Digital computing or data processing equipment or methods, specially adapted for specific applications
- G06F19/30—Medical informatics, i.e. computer-based analysis or dissemination of patient or disease data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06Q—DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Systems or methods specially adapted for a specific business sector, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/20—Education
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N5/00—Computer systems utilising knowledge based models
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/003—Repetitive work cycles; Sequence of movements
- G09B19/0038—Sports
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06Q—DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Systems or methods specially adapted for a specific business sector, e.g. utilities or tourism
- G06Q50/01—Social networking
Similar Documents
| Publication | Title | Publication Date |
|---|---|---|
| Szafir et al. | Designing planning and control interfaces to support user collaboration with flying robots | |
| Laut et al. | Increasing patient engagement in rehabilitation exercises using computer-based citizen science | |
| Zhou et al. | Human-behaviour-based social locomotion model improves the humanization of social robots | |
| Rybarczyk et al. | Recognition of physiotherapeutic exercises through DTW and low-cost vision-based motion capture | |
| Schreiter et al. | THÖR-MAGNI: A large-scale indoor motion capture recording of human movement and robot interaction | |
| Araya et al. | Automatic detection of gaze and body orientation in elementary school classrooms | |
| Kutbi et al. | Egocentric computer vision for hands-free robotic wheelchair navigation | |
| Liu et al. | State-of-the-art elderly service robot: Environmental perception, compliance control, intention recognition, and research challenges | |
| Pięta et al. | Automated classification of virtual reality user motions using a motion atlas and machine learning approach | |
| Qin et al. | Vision-based pointing estimation and evaluation in toddlers for autism screening | |
| Di Vincenzo et al. | A natural human-drone embodied interface: Empirical comparison with a traditional interface | |
| Adamo-Villani et al. | Two gesture recognition systems for immersive math education of the deaf | |
| Yam-Viramontes et al. | Implementation of a natural user interface to command a drone | |
| Yam-Viramontes et al. | Commanding a drone through body poses, improving the user experience | |
| Zhang et al. | Designing a training assistant system for badminton using artificial intelligence | |
| Guanoluisa et al. | GY MEDIC: analysis and rehabilitation system for patients with facial paralysis | |
| Tomari et al. | Enhancing wheelchair manoeuvrability for severe impairment users | |
| Finco et al. | Exergames, artificial intelligence and augmented reality: Connections to body and sensorial experiences | |
| Hincapié et al. | Using RGBD cameras for classifying learning and teacher interaction through postural attitude | |
| Nikiforov et al. | Pilot studies on Avrora Unior car-like robot control using gestures | |
| Widdowson et al. | VR environment for the study of collocated interaction between small UAVs and humans | |
| Mao et al. | Eliminating drift of the head gesture reference to enhance Google Glass-based control of an NAO humanoid robot | |
| Lakshantha et al. | A diagrammatic framework for intuitive human robot interaction | |
| Sanders et al. | Design and validation of a unity-based simulation to investigate gesture based control of semi-autonomous vehicles | |
| Ranjana et al. | Applications and Implications of Artificial Intelligence and Deep Learning in Computer Vision |