Onishi et al., 2014 - Google Patents
PopArm: A robot arm for embodying video-mediated pointing behaviors
- Document ID
- 13631044066504174784
- Author
- Onishi Y
- Tanaka K
- Nakanishi H
- Publication year
- 2014
- Publication venue
- 2014 International Conference on Collaboration Technologies and Systems (CTS)
Snippet
In a videoconferencing system, there is a problem that the direction to which a remote instructor points is unclear. To solve this problem, we developed the PopArm, a remote pointing robotic arm that seems to pop out from the video. The PopArm is able to move and …
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
          - G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
          - G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
            - G06F3/0304—Detection arrangements using opto-electronic means
          - G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
            - G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
              - G06F3/04815—Interaction with three-dimensional environments, e.g. control of viewpoint to navigate in the environment
            - G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
            - G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
      - H04N7/00—Television systems
        - H04N7/14—Systems for two-way working
          - H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
            - H04N7/147—Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
Similar Documents
| Publication | Title |
|---|---|
| Schäfer et al. | A survey on synchronous augmented, virtual, and mixed reality remote collaboration systems |
| Grønbæk et al. | Partially blended realities: Aligning dissimilar spaces for distributed mixed reality meetings |
| Zillner et al. | 3D-board: a whole-body remote collaborative whiteboard |
| Gauglitz et al. | Integrating the physical environment into mobile remote collaboration |
| EP1536645B1 (en) | Video conferencing system with physical cues |
| Tang et al. | Collaboration in 360 videochat: Challenges and opportunities |
| Kim et al. | Telehuman: effects of 3d perspective on gaze and pose estimation with a life-size cylindrical telepresence pod |
| US8395655B2 (en) | System and method for enabling collaboration in a video conferencing system |
| Yamazaki et al. | GestureLaser and Gesturelaser Car: Development of an embodied space to support remote instruction |
| JP5839427B2 (en) | 3D teleconferencing device capable of eye contact and method using the same |
| Lee et al. | A user study on view-sharing techniques for one-to-many mixed reality collaborations |
| Onishi et al. | Embodiment of video-mediated communication enhances social telepresence |
| Robinson et al. | Distributed tabletops: Supporting remote and mixed-presence tabletop collaboration |
| Kuzuoka et al. | Mediating dual ecologies |
| Kishore et al. | Beaming into the news: a system for and case study of tele-immersive journalism |
| Roberts et al. | Communicating eye-gaze across a distance: Comparing an eye-gaze enabled immersive collaborative virtual environment, aligned video conferencing, and being together |
| Lincoln et al. | Animatronic shader lamps avatars |
| Feick et al. | Perspective on and re-orientation of physical proxies in object-focused remote collaboration |
| Kunz et al. | Collaboard: a novel interactive electronic whiteboard for remote collaboration with people on content |
| Regenbrecht et al. | Mutual gaze support in videoconferencing reviewed |
| Zhang et al. | Remotetouch: Enhancing immersive 3d video communication with hand touch |
| Sakashita et al. | ReMotion: Supporting Remote Collaboration in Open Space with Automatic Robotic Embodiment |
| Mulder et al. | A modular system for collaborative desktop vr/ar with a shared workspace |
| Onishi et al. | PopArm: A robot arm for embodying video-mediated pointing behaviors |
| Nescher et al. | An interactive whiteboard for immersive telecollaboration |