WO2012120520A1 - Gestural interaction - Google Patents
Gestural interaction
- Publication number
- WO2012120520A1 (PCT/IN2011/000136)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- command
- gearshift
- user interface
- interactive
- graphical user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A method of gestural interaction comprises presenting a gearshift graphical user interface on a display device, the gearshift graphical user interface comprising a number of interactive commands, detecting, with a sensor, the position of a user's hand with respect to the graphical user interface, representing on the gearshift graphical user interface the position of the user's hand relative to the interactive commands with a cursor, and, if the cursor indicates selection of a command, then, with a processor, executing the selected command.
Description
GESTURAL INTERACTION
BACKGROUND
[0001] Gestural interactive systems are computing systems that sense the movements of a user and translate those movements into commands. The commands then cause an attached computing device to execute code based on the command. These gestural interactive systems may be used in situations wherein a user may desire a more intimate, efficient, and meaningful interface with a computing device, and, more specifically, within gaming, media playback and browsing, and remote control applications, for example.
[0002] However, when utilizing gestural interactive systems in media playback and browsing scenarios, the graphical user interfaces (GUIs) provided are designed for productivity applications that the user must concentrate on and become proficient with in order to properly perform desired tasks. Further, in using these graphical user interfaces, the user is expected to memorize the gestures rather than rely on a teaching system. Thus, in these types of GUIs, a novice user cannot utilize the gestural interactive system because he or she is unfamiliar with the techniques and gestures used to control the system through the GUI.
BRIEF DESCRIPTION OF THE DRAWINGS
[0003] The accompanying drawings illustrate various embodiments of the principles described herein and are a part of the specification. The illustrated embodiments are merely examples and do not limit the scope of the claims.
[0004] Fig. 1 is a diagram of a gestural interactive system, according to one example of the principles described herein.
[0005] Fig. 2 is a diagram of the computing device utilized within the gestural interactive system of Fig. 1, according to one example of the principles described herein.
[0006] Fig. 3 is a diagram of a gearshift graphical user interface (GUI) processed by a processor and displayed on a display device, according to one example of the principles described herein.
[0007] Fig. 4 is a diagram of a gearshift graphical user interface (GUI) in which a user initiates selection of a play/pause command, according to one example of the principles described herein.
[0008] Fig. 5 is a diagram of a gearshift graphical user interface (GUI) in which a user completes the selection of a play/pause command, according to one example of the principles described herein.
[0009] Fig. 6 is a diagram of a gearshift graphical user interface (GUI) in which a user initiates selection of a fast forward command, according to one example of the principles described herein.
[0010] Fig. 7 is a diagram of the gearshift graphical user interface (GUI) of Fig. 6 in which a user initiates selection of a fast forward command but partially deviates from a channel in the gearshift graphical user interface (GUI), according to one example of the principles described herein.
[0011] Fig. 8 is a diagram of the gearshift graphical user interface (GUI) of Fig. 6 in which a user initiates selection of a fast forward command, but fully deviates from a channel in the gearshift GUI, and terminates the selection of the fast forward command, according to one example of the principles described herein.
[0012] Fig. 9 is a diagram of the gearshift graphical user interface (GUI) of Fig. 6 in which a cursor returns to a neutral position after the user fully deviates from a channel in the gearshift graphical user interface (GUI) and terminates the selection of the fast forward command, according to one example of the principles described herein.
[0013] Fig. 10 is a flowchart showing an exemplary method of browsing media using single stroke gestural interaction, according to one example of the principles described herein.
[0014] Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements.
DETAILED DESCRIPTION
[0015] An individual unfamiliar with a gestural interactive system may find it difficult to interact with the system because the system has a particular mode of interaction, such as, for example, specific gestures to be performed to induce certain commands and the method by which the user is to interact with the system, of which the user may not have foreknowledge. However, utilization of a familiar metaphor such as, for example, a gearshift graphical user interface provides a user with a familiar setting in which the user may instruct the gestural interactive system to perform a command. In this example, the graphical user interface is analogized with shifting gears in an automobile, an action generally familiar to all users.
[0016] In order to provide a gestural interactive system with which a novice user may intuitively interact, the gestural interactive system should make two things immediately understandable to a user in one visualization: all available commands, and how to perform a given command. In contrast to the systems and methods of the present application, other menu-driven systems are not immediately understandable to a novice user. Thus, the gearshift graphical user interface and an associated sensor and display device of the present application provide for all available commands to be displayed to a user simultaneously. Further, the user can instinctively know or quickly learn how to perform a given command based on the displayed gearshift graphical user interface.
[0017] As used in the present specification and in the appended claims, the term "gestural interactive system" is meant to be understood broadly as any system that interprets and utilizes the gestures of a user to command a processor to execute code. Some examples in which a gestural interactive system is used may comprise computer gaming systems, computing systems in which a mouse and/or keyboard is replaced with gesture interaction, remote control of media devices such as televisions and media playback devices, and robotics in which the gesture of a user is used to control a robotic device, among others.
[0018] Further, as used in the present specification and in the appended claims, the terms "gearshift graphical user interface" or "gearshift GUI" are meant to be understood broadly as any graphical user interface that utilizes a single action to interact with a gestural interactive system. In one example, the graphical user interface may be presented to a user on a display device in the form of a gearshift pattern in which the terminals of the gearshift pattern represent a number of functions to be performed via the graphical user interface.
[0019] Still further, as used in the present specification and in the appended claims, the term "a number of" or similar language is meant to be understood broadly as any positive number comprising 1 to infinity; zero not being a number, but the absence of a number.
[0020] In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present systems and methods. It will be apparent, however, to one skilled in the art that the present apparatus, systems and methods may be practiced without these specific details. Reference in the specification to "an embodiment," "an example" or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment or example is included in at least that one embodiment, but not necessarily in other embodiments. Various instances of the phrase "in one embodiment" or similar phrases in various places in the specification are not necessarily all referring to the same embodiment.
[0021] Fig. 1 is a diagram of a gestural interactive system (100), according to one example of the principles described herein. In one example, the gestural interactive system (100) with which a number of users (120) interacts may comprise a computing device (115), a display device (105) communicatively coupled to the computing device (115), and a sensor (110) communicatively coupled to the computing device (115). In one example, the display device (105) may be any device from which the user (120) receives visual feedback while operating the gestural interactive system (100). For example, the display (105) may be a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light emitting diode (OLED) display, a plasma display panel, a television, a computer monitor, a high-definition television, a cathode ray tube (CRT) display, or a display that utilizes a projector system, among others.
[0022] The sensor (110) of Fig. 1 may be any device that detects the presence and motion of a number of users (120). In one example, the sensor (110) comprises an imaging device that captures full-body three dimensional (3D) motion of each user (120). In another example, the sensor (110) comprises a sensor bar that detects the presence and motion of a number of handheld remotes with which the users (120) interact with the gestural interactive system (100). In yet another example, the sensor (110) comprises a sensor bar that detects the presence and motion of a number of glove remotes with which the users (120) interact with the gestural interactive system (100).
[0023] In the present example, for the purposes of simplicity in illustration, the computing device (115), the display device (105), and the sensor (110) are separate devices communicatively coupled to each other. However, the principles set forth in the present specification extend equally to any alternative configuration, including configurations in which the computing device (115), the display device (105), and the sensor (110) are combined into one device, or into two devices with one device comprising one of these elements and the other device comprising the other two. As such, alternative examples within the scope of the principles of the present specification include, but are not limited to, examples in which the computing device (115), the display device (105), and the sensor (110) are implemented by the same computing device, examples in which the functionality of the computing device (115) is implemented by multiple interconnected computers, for example, a server in a data center and a user's client machine, and examples in which the computing device (115), the display device (105), and the sensor (110) communicate directly through a bus without intermediary network devices.
[0024] As mentioned above, the gestural interactive system (100) further comprises a computing device (115). The computing device will now be described in connection with Fig. 2. Fig. 2 is a diagram of the computing device (115) utilized within the gestural interactive system (100) of Fig. 1, according to one example of the principles described herein. The computing device (115) of the present example determines the selection of a command such as, for example, a playback function by the user based on the data or information detected by the sensor (110). In the present example, this is accomplished by the computing device (115) receiving data from the sensor (110), determining the position of a cursor within a gearshift graphical user interface (GUI), and, based on the position of the cursor, executing the function or command. Illustrative processes for determining the selection of a command are set forth in more detail below.
[0025] To achieve its desired functionality, the computing device (115) includes various hardware components. These hardware components may comprise a processor (125), a number of data storage devices (130), and peripheral device adapters (135), among others. These hardware components may be interconnected through the use of one or more busses and/or network connections. In one example, the processor (125), data storage device (130), and peripheral device adapters (135) are communicatively coupled via a bus (107).
[0026] The processor (125) may include the hardware architecture for retrieving executable code from the data storage device (130) and executing the executable code. The executable code, when executed by the processor (125), causes the processor (125) to implement at least the functionality of determining the selection of a command by the user (120) based on the data or information detected by the sensor (110), and executing that command as described herein. In the course of executing code, the processor (125) may receive input from and provide output to one or more of the remaining hardware units.
[0027] In one example, the computing device (115), and, specifically, the processor (125) receives data from the sensor (110), the data being indicative of the position of a cursor relative to the gearshift GUI. In this manner, the sensor (110) captures the position of a hand of a user or a handheld remote used by the user (120) relative to the gearshift GUI displayed on the display device (105).
[0028] The data storage device (130) may store data such as executable code as discussed above. This executable code is processed and produced by the processor (125). The data storage device (130) may include various types of memory devices, including volatile and nonvolatile memory. For example, the data storage device (130) of the present example includes Random Access Memory (RAM), Read Only Memory (ROM), and Hard Disk Drive (HDD) memory, among others. The present specification contemplates the use of many varying types of memory in the data storage device (130) as may suit a particular application of the principles described herein. In certain examples, different types of memory in the data storage device (130) may be used for different data storage needs. For example, in certain examples the processor (125) may boot from Read Only Memory (ROM), maintain nonvolatile storage in the Hard Disk Drive (HDD) memory, and execute program code stored in Random Access Memory (RAM).
[0029] Generally, the data storage device (130) may comprise a computer readable storage medium. For example, the data storage device (130) may be, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the computer readable storage medium may include, for example, the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
[0030] Turning again to Fig. 2, the peripheral device adapters (135) in the computing device (115) enable the processor (125) to interface with various other hardware elements, external and internal to the computing device (115). For example, the peripheral device adapters (135) may provide an interface to input/output devices, such as, for example, the display device (105) and the sensor (110), a keyboard, and a mouse, among others, to create a user interface and/or access external sources of memory storage.
[0031] Fig. 3 is a diagram of a gearshift graphical user interface (GUI) (300) processed by the processor (125) and displayed on the display device (105). The gearshift GUI (300) may comprise a number of channels (302). The channels comprise a number of terminals (304) that terminate the channels (302). The gearshift GUI (300) further comprises a number of commands (305 through 330) that, when selected, instruct the gestural interactive system (100) to perform the corresponding functions. Each command (305 through 330) is located juxtaposed to a respective terminal (304), indicating to a user that if that terminal (304) is selected, the gestural interactive system (100) will perform the command juxtaposed to that terminal (304).
[0032] The commands may comprise any function of the gestural interactive system (100). In one example, the commands comprise the following media playback and browsing commands: stop (305), pause/play (310), fast forward (325), rewind (330), next (song/photo/chapter) (320), and previous (song/photo/chapter) (315), among others. In this example, the commands (305 through 330) will cause the gestural interactive system (100) to affect the playback of various forms of media including, for example, music, video, and pictures and the browsing of these forms of media. Although the above commands relate to media playback and browsing, the commands may comprise any other functions used in connection with the gestural interactive system (100) including, for example, computer gaming environments, computing systems in which a mouse and/or keyboard is replaced with gesture interaction, remote control of media devices such as televisions, and robotics in which the gesture of a user is used to control a robotic device, among others.
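To make this layout concrete, the following is a minimal sketch of how the gearshift geometry might be modeled, assuming normalized screen coordinates with the neutral position (Fig. 4, 303) at the origin. The class, field names, channel directions, and lengths are all illustrative assumptions; the patent does not prescribe a data model.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Channel:
    command: str                     # command at this channel's terminal, e.g. "fast_forward"
    direction: Tuple[float, float]   # unit vector from the neutral position (303)
    length: float                    # distance from the neutral position to the terminal (304)

# One channel (302) per command, radiating from the neutral position. Directions
# use screen convention (+x right, +y down) and are assumptions loosely based on
# Figs. 3 through 6 (pause/play reached by a downward stroke, fast forward by a
# stroke to the right); the diagonal placements for next/previous are guesses.
GEARSHIFT_CHANNELS = [
    Channel("pause_play",   ( 0.0,    1.0),   0.4),
    Channel("stop",         ( 0.0,   -1.0),   0.4),
    Channel("fast_forward", ( 1.0,    0.0),   0.4),
    Channel("rewind",       (-1.0,    0.0),   0.4),
    Channel("next",         ( 0.707, -0.707), 0.4),
    Channel("previous",     (-0.707, -0.707), 0.4),
]
```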
[0033] The gearshift GUI (300) further comprises a cursor (340). In one example, the cursor (340) is a circle. However, the cursor (340) may be represented as any shape. Using the gestural interactive system (100), a user moves the cursor (340). In this manner, the movement of the cursor (340) corresponds to the movement of the user's hand. The cursor (340) provides feedback to the user regarding the position of the user's hand with respect to the sensor, display device, or a combination of these. In one example, the cursor (340) is visible within the displayed gearshift GUI (300) when the user's hand or controlling device such as a handheld remote is visible to the gestural interactive system (100), and, specifically, the sensor (110). In another example, the cursor (340) is visible within the displayed gearshift GUI (300) when the user is interacting with the gestural interactive system (100). In this example, the cursor (340) appears when the user moves his or her hand or controlling device.
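As a hedged illustration of this hand-to-cursor feedback, a sensed hand position might be scaled into display coordinates as below; the linear mapping and the coordinate frames are assumptions, since the patent does not specify how the sensor frame maps onto the screen.

```python
def hand_to_cursor(hand, sensor_size, screen_size):
    """Scale a sensed (x, y) hand position from the sensor's frame into
    cursor (340) coordinates on the display device (105)."""
    return (hand[0] / sensor_size[0] * screen_size[0],
            hand[1] / sensor_size[1] * screen_size[1])

# Example: the center of a 640x480 sensor frame maps to the center of the
# display; passing screen_size=(1.0, 1.0) keeps coordinates normalized, which
# matches the normalized gearshift layout sketched earlier.
assert hand_to_cursor((320, 240), (640, 480), (1.0, 1.0)) == (0.5, 0.5)
```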
[0034] As depicted in Fig. 3, the cursor (340) is located in a neutral position (Fig. 4, 303). While in the neutral position (Fig. 4, 303), the cursor (340) does not indicate the selection of a command (305 through 330). In the physical world, this is equivalent to the user (120) not gesturing in any direction, or not interacting with the gestural interactive system (100). In one example, the cursor (340) is presented as a relatively larger rendering when in the neutral position (Fig. 4, 303) than when it is not in the neutral position (Fig. 4, 303). As depicted in Figs. 4 through 8, the cursor (340) is presented as a relatively smaller rendering when not in the neutral position (Fig. 4, 303).
[0035] Turning now to Figs. 4 and 5, a diagram of a gearshift graphical user interface (GUI) (300) in which a user selects a pause/play command (310) is shown. In Figs. 4 and 5, in order for a user to select the pause/play command (310), the user (120) begins to move his or her hand, for example, in a downward direction. Accordingly, the sensor (110) identifies and tracks the movement of the user's hand, and the processor (125) and display device (105) provide feedback within the gearshift GUI (300) displayed on the display device (105) by moving the cursor (340) in a downward direction relative to the movement of the user's hand. When the cursor (340) reaches the terminal (304) juxtaposed to the pause/play command (310), the processor (125) of the gestural interactive system (100) receives this input, and commands the playback of the media to be paused or played/resumed, depending on the previous state of the pause/play command (310). In this manner, the gestural interactive system (100) provides for a single action or gesture to be used in interacting with the gestural interactive system to implement a command.
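A command-selection test in the spirit of this paragraph might look like the sketch below, reusing the Channel layout sketched earlier; the hit radius around each terminal is an illustrative tolerance, not a value from the patent.

```python
import math

TERMINAL_RADIUS = 0.05   # assumed tolerance around each terminal (304)

def selected_command(cursor, neutral, channels):
    """Return the command whose terminal (304) the cursor (340) has reached,
    or None if the cursor is still short of every terminal."""
    for ch in channels:
        tx = neutral[0] + ch.direction[0] * ch.length   # terminal position
        ty = neutral[1] + ch.direction[1] * ch.length
        if math.hypot(cursor[0] - tx, cursor[1] - ty) <= TERMINAL_RADIUS:
            return ch.command
    return None
```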
[0036] Fig. 6 is a diagram of a gearshift graphical user interface (GUI) (300) in which a user (120) initiates selection of a fast forward command (325), according to one example of the principles described herein. This is accomplished in a similar manner as described in connection with Figs. 4 and 5. Specifically, the user (120) moves his or her hand or handheld device to the right in order to select the fast forward command (325) on the gearshift GUI (300). The display device (105) displays the movement of the cursor (340) to the right and towards the fast forward command (325) of the gearshift GUI (300). The user (120) receives visual feedback from the gearshift GUI (300) that the cursor (340) is in the channel (302) that bridges the neutral position (303) and the terminal (304) juxtaposed to the fast forward command (325).
[0037] Fig. 7 is a diagram of the gearshift graphical user interface (GUI) (300) of Fig. 6 in which a user (120) initiates selection of a fast forward command (325) but partially deviates from the channel (302) that bridges the neutral position (303) and the terminal (304) juxtaposed to the fast forward command (325). In one example, if the cursor (340) deviates from a channel (302) of the gearshift GUI (300) to a predetermined degree, then the cursor (340) returns to the neutral position (303). In this manner, the user may cancel a gesture before a command (305 through 330) is selected by causing the cursor (340) to move perpendicular or approximately perpendicular to a channel (302). This method of canceling a gesture is described in more detail in connection with Figs. 8 and 9.
[0038] Fig. 8 is a diagram of the gearshift graphical user interface (GUI) (300) of Fig. 6 in which a user (120) initiates selection of a fast forward command (325), but fully deviates from a channel (302) in the gearshift GUI (300), and terminates the selection of the fast forward command (325), according to one example of the principles described herein. In Fig. 8, the cursor (340) is moved substantially away from the channel (302) that bridges the neutral position (303) and the terminal (304) juxtaposed to the fast forward command (325). In this manner, the gesture that the user (120) initiated is now canceled, and the fast forward command (325) is not selected. Fig. 9 is a diagram of the gearshift graphical user interface (GUI) (300) of Fig. 6 in which the cursor (340) returns to the neutral position (303) after the user (120) causes the cursor (340) to fully deviate from the channel (302) that bridges the neutral position (303) and the terminal (304) juxtaposed to the fast forward command (325).
[0039] In one example, full deviation of the cursor (340) from a channel (302) causes the cancelation of a gesture. In another example, if a portion of the cursor (340) deviates from a channel (302), then the gesture is canceled. In yet another example, the amount of deviation of the cursor (340) from a channel (302) that triggers cancelation of the gesture is a user-definable parameter set by the user (120).
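One way to implement this deviation test is to decompose the cursor's offset from the neutral position into components along and across the active channel, as in the sketch below; the default threshold stands in for the user-definable parameter and is purely illustrative.

```python
def gesture_canceled(cursor, neutral, channel, max_deviation=0.08):
    """True when the cursor (340) strays too far perpendicular to the
    channel (302), i.e. the user has moved across rather than along it."""
    dx = cursor[0] - neutral[0]
    dy = cursor[1] - neutral[1]
    ux, uy = channel.direction              # unit vector along the channel
    perpendicular = abs(dx * uy - dy * ux)  # 2D cross product = off-axis distance
    return perpendicular > max_deviation
```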
[0040] Fig. 10 is a flowchart showing an exemplary method of browsing media using single stroke gestural interaction, according to one example of the principles described herein. The method of Fig. 10 may begin by the processor (125) displaying a gearshift GUI (300) to a user (120) on the display device (105) (block 1005). The imaging device (110) images the scene including the user (120) to sense the movements of the user (120) (block 1010).
[0041] In block 1015, the movements of the user (120) are transmitted to the processor (125). The movements of the user are then translated into movement of the cursor (340), and displayed as feedback to the user (120) on the display device (105) (block 1020). In one example, the processor (125) continually monitors the user's movements and implements the display of these movements on the display device (105). The processor (125) determines, based on the imaged movements of the user (120) by the imaging device (110), whether the movements are perpendicular or approximately perpendicular to a channel (302) of the gearshift GUI (300) (block 1025), indicating the user's instruction to cancel the gesture.
[0042] If the processor (125) determines that the movements are perpendicular or approximately perpendicular to a channel (302) of the gearshift GUI (300) (block 1025, Determination YES), then the gesture is canceled at block 1030. If, however, the processor (125) determines that the movements are not perpendicular or approximately perpendicular to a channel (302) of the gearshift GUI (300) (block 1025, Determination NO), then the processor continues to monitor whether the movements are perpendicular or approximately perpendicular to a channel (302) of the gearshift GUI (300) at block 1035. In one example, block 1035 may comprise continually performing the method of block 1025.
[0043] At block 1040, it is determined whether the cursor (340) as interactively controlled by the user (120) has reached a terminal (304) indicating the selection of a command (305 through 330) by the user (120). If the cursor (340) has not reached the terminal (304) (block 1040, determination NO), then the method loops back to block 1035, and the gestural interactive system (100) continues to monitor the movements of the user (120) as described above.
[0044] If, however, the cursor (340) has reached the terminal (304) (block 1040, determination YES) indicating the selection of a command (305 through 330) by the user (120), then the processor (125) executes the command (305 through 330). In this manner, the user (120) can interact with the gestural interactive system (100) to select a playback function, for example.
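Tying these blocks together, a polling loop in the spirit of Fig. 10 might look like the sketch below, reusing the helpers sketched earlier. The `sensor`, `gui`, and `execute` objects are hypothetical stand-ins for the imaging device (110), the display device (105), and the processor's command dispatch; none of their methods come from a real API.

```python
def nearest_channel(cursor, neutral, channels):
    """Channel (302) whose direction best matches the cursor's offset from
    neutral; arbitrary (and harmless) when the cursor sits at neutral."""
    dx, dy = cursor[0] - neutral[0], cursor[1] - neutral[1]
    return max(channels, key=lambda ch: dx * ch.direction[0] + dy * ch.direction[1])

def run(sensor, gui, execute, neutral, channels, sensor_size, screen_size):
    gui.draw_gearshift(channels)                        # block 1005: display the GUI
    while True:
        hand = sensor.read_hand()                       # block 1010: sense (x, y) movement
        cursor = hand_to_cursor(hand, sensor_size, screen_size)
        gui.move_cursor(cursor)                         # blocks 1015-1020: feedback
        channel = nearest_channel(cursor, neutral, channels)
        if gesture_canceled(cursor, neutral, channel):  # block 1025, YES branch
            gui.move_cursor(neutral)                    # block 1030: snap back to neutral
            continue
        command = selected_command(cursor, neutral, channels)
        if command is not None:                         # block 1040, determination YES
            execute(command)                            # e.g. pause/play, fast forward
```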
[0045] The above-described system and method, although described in the context of media playback and browsing, may be applied in any command scenario. Specifically, the above-described gearshift GUI (300) and associated method of utilizing the gearshift GUI may be applied in any environment wherein a user (120) interactively selects commands via a gestural interactive system (100). For example, the above-described system and method may be used in connection with computer gaming environments, computing systems in which a mouse and/or keyboard is replaced with gesture interaction, remote control of media devices such as televisions, and robotics in which the gesture of a user is used to control a robotic device, among others.
[0046] The specification and figures describe single stroke gestural interaction. A gearshift graphical user interface (300) that provides intuitive know-how regarding its operation is presented to a user (120) on a display device (105). The gearshift user interface (300) comprises a number of interactive commands (305 through 330). A sensor (110) detects the position of a user's hand, and the position of the user's hand relative to the commands (305 through 330) of the gearshift user interface (300) is represented on the gearshift graphical user interface (300). If the representation of the user's hand indicates selection of a command (305 through 330), then a processor (125) implements the selected command (305 through 330).
[0047] This single stroke gestural interaction may have a number of advantages, including the following: (1) the entire set of commands understood by the gestural interactive system, and instructions for performing them, are always visible to the user; (2) the gearshift GUI is operated by tracking the coarse position of a single hand; (3) feedback about hand position is provided constantly, and it is possible to cancel out of a gesture at any time simply by moving perpendicular to the channel or groove; and (4) the system is more robust with respect to user and system errors. With regard to the fourth advantage, if a user moves his hand to the left, for example, but does not move it sufficiently upwards, he may end up rewinding the movie rather than going to the previous chapter; the consequence of such an error remains predictable, as opposed to a system with distinct gestures for the different commands, where the consequences of user or system errors may be unpredictable.
[0048] The preceding description has been presented only to illustrate and describe embodiments and examples of the principles described. This description is not intended to be exhaustive or to limit these principles to any precise form disclosed. Many modifications and variations are possible in light of the above teaching.
Claims
1. A method of gestural interaction comprising:
presenting a gearshift graphical user interface on a display device, the gearshift graphical user interface comprising a number of interactive commands; detecting, with a sensor, the position of a user's hand with respect to the graphical user interface;
representing, with a cursor on the gearshift graphical user interface, the position of the user's hand relative to the interactive commands; and
if the cursor indicates selection of a command, then, with a processor, executing the selected command.
2. The method of claim 1, further comprising determining whether the cursor indicates the cancellation of an interactive gesture.
3. The method of claim 2, in which the gearshift graphical user interface comprises a number of channels, and in which if the cursor deviates from a channel to a predetermined degree, then canceling the interactive gesture.
4. The method of claim 3, in which if the cursor deviates approximately perpendicular to a channel, then canceling the interactive gesture.
5. The method of claim 3, in which if the gesture is canceled, then returning the cursor to a neutral position.
6. The method of claim 1, in which the interactive commands comprise media playback commands.
7. The method of claim 3, in which the channels further comprise terminals, and in which the cursor indicates selection of a command if the cursor is on a terminal juxtaposed to the command.
8. A gestural interaction system comprising:
a display device that displays a gearshift graphical user interface, the gearshift graphical user interface comprising a number of interactive commands;
a sensor that detects the position of a user's hand relative to the interactive commands; and
a processor that:
instructs the display device to display a cursor on the gearshift graphical user interface based on the position of the user's hand relative to the interactive commands, and
instructs an application to perform a command if the user's hand indicates selection of the command.
9. The gestural interaction system of claim 8, in which the gearshift graphical user interface comprises:
a neutral position;
a number of channels branching off from the neutral position;
a number of terminals that terminate at the ends of the channels; and
a number of command indicators, each command indicator located in juxtaposition to a respective terminal.
10. The gestural interaction system of claim 9, in which a command is selected using a single stroke of the user's hand.
11. The gestural interaction system of claim 8, in which the sensor is an imaging device.
12. A computer program product for browsing media, the computer program product comprising:
a computer readable storage medium having computer usable program code embodied therewith, the computer usable program code comprising:
computer usable program code that displays a gearshift graphical user interface on a display device; and
computer usable program code that represents, with a cursor on the gearshift graphical user interface, the position of a user's hand relative to a number of interactive media browsing commands.
13. The computer program product of claim 12, further comprising computer usable program code that executes a media browsing command if the cursor indicates selection of the media browsing command.
14. The computer program product of claim 12, further comprising computer usable program code that determines the cancellation of an interactive gesture if the cursor deviates from a channel within the gearshift graphical user interface in an approximately perpendicular direction.
15. The computer program product of claim 14, further comprising computer usable program code that returns the cursor to a neutral position if the interactive gesture is canceled.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/IN2011/000136 WO2012120520A1 (en) | 2011-03-04 | 2011-03-04 | Gestural interaction |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2012120520A1 (en) | 2012-09-13 |
Family
ID=46797561
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IN2011/000136, WO2012120520A1 (en), Ceased | Gestural interaction | 2011-03-04 | 2011-03-04 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2012120520A1 (en) |
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20070046643A1 (en) * | 2004-08-06 | 2007-03-01 | Hillis W Daniel | State-Based Approach to Gesture Identification |
| US20090102800A1 (en) * | 2007-10-17 | 2009-04-23 | Smart Technologies Inc. | Interactive input system, controller therefor and method of controlling an appliance |
| CN101621609A (en) * | 2008-07-03 | 2010-01-06 | 深圳华为通信技术有限公司 | Method, system and set-top box for performing operations in set-top box |
| CN201689407U (en) * | 2010-03-31 | 2010-12-29 | 北京播思软件技术有限公司 | Device adopting user's gestures to replace exit key and enter key of terminal unit |
| CN101943947A (en) * | 2010-09-27 | 2011-01-12 | 鸿富锦精密工业(深圳)有限公司 | Interactive display system |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103019562A (en) * | 2012-12-07 | 2013-04-03 | 东莞宇龙通信科技有限公司 | Terminal and control tray configuration method |
| CN103019562B (en) * | 2012-12-07 | 2016-04-06 | 东莞宇龙通信科技有限公司 | Terminal and control tray configuration method |
| GB2513200A (en) * | 2013-04-21 | 2014-10-22 | Biogaming Ltd | Kinetic user interface |
| CN104536556A (en) * | 2014-09-15 | 2015-04-22 | 联想(北京)有限公司 | An information processing method and electronic apparatuses |
Similar Documents
| Publication | Title |
|---|---|
| US9009594B2 | Content gestures |
| US10261742B2 | Visual focus-based control of coupled displays |
| US9836146B2 | Method of controlling virtual object or view point on two dimensional interactive display |
| WO2022218146A1 | Devices, methods, systems, and media for an extended screen distributed user interface in augmented reality |
| Forlines et al. | Hybridpointing: fluid switching between absolute and relative pointing with a direct input device |
| TWI553540B | Highlighting of objects on a display |
| US8866781B2 | Contactless gesture-based control method and apparatus |
| US9041649B2 | Coordinate determination apparatus, coordinate determination method, and coordinate determination program |
| EP2656193B1 | Application-launching interface for multiple modes |
| US9007299B2 | Motion control used as controlling device |
| US20130061180A1 | Adjusting a setting with a single motion |
| US8605219B2 | Techniques for implementing a cursor for televisions |
| US20140178047A1 | Gesture drive playback control for chromeless media players |
| US20130314320A1 | Method of controlling three-dimensional virtual cursor by using portable electronic device |
| JP2024169556A | Method for Augmented Reality Applications Adding Annotations and Interfaces to Control Panels and Screens |
| US20120229392A1 | Input processing apparatus, input processing method, and program |
| CN106796351A | Head-mounted display device controlled by line of sight, control method therefor, and computer program for controlling the device |
| US20120304063A1 | Systems and Methods for Improving Object Detection |
| CN104685461A | Input device using input mode data from controlled device |
| US20130127731A1 | Remote controller, and system and method using the same |
| EP2960763A1 | Computerized systems and methods for cascading user interface element animations |
| CN104182035A | Method and system for controlling television application program |
| KR20190059727A | Interactive system for controlling complexed object of virtual reality environment |
| WO2012120520A1 | Gestural interaction |
| EP2698697A2 | Method of searching for playback location of multimedia application and electronic device thereof |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11860295; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 11860295; Country of ref document: EP; Kind code of ref document: A1 |