
HK1185676A - Method and apparatus for gesture based controls - Google Patents


Info

Publication number
HK1185676A
HK1185676A
Authority
HK
Hong Kong
Prior art keywords
video
gesture
playing
gestures
command
Prior art date
Application number
HK13112937.6A
Other languages
Chinese (zh)
Other versions
HK1185676B (en)
Inventor
R. Hayes
Original Assignee
TiVo Solutions Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TiVo Solutions Inc.
Publication of HK1185676A
Publication of HK1185676B


Description

Method and apparatus for gesture-based control
Technical Field
The present invention relates to the use of gestures. In particular, the present invention relates to gesture-based control of multimedia content.
Background
The approaches described in this section are approaches that could be pursued, but not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated, the approaches described in this section should not be assumed to qualify as prior art merely by virtue of their inclusion in this section.
Multimedia content, such as web pages, images, videos, slides, text, graphics, sound files, audio/video files, etc., may be displayed or played on a device. A user may submit commands related to playing or displaying content either on the device itself or on a separate device that acts as a remote control.
For example, a user may select a button on the remote control to play, pause, stop, rewind, or fast forward a video being played on a television.
Drawings
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements and in which:
FIG. 1 is a block diagram illustrating an example system in accordance with one or more embodiments;
FIG. 2 illustrates a flow diagram for detecting gestures in accordance with one or more embodiments;
FIG. 3 illustrates an example interface in accordance with one or more embodiments;
FIG. 4 shows a block diagram illustrating a system upon which embodiments of the invention may be implemented.
Detailed Description
In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It may be evident, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present invention.
Several features are described below, each of which can be used independently of the others or in combination with other features. However, any individual feature may not solve any of the problems discussed above or may only solve one of the problems discussed above. Some of the problems discussed above may not be fully solved by any of the features described herein. Although headings are provided, information related to a particular heading may not be found in the section having that heading, but may be found elsewhere in the specification.
Example features are described according to the following outline:
1.0 Overview
2.0 System Architecture
3.0 Gestures
4.0 Gesture Areas
5.0 Commands
6.0 Detecting Gestures Within Gesture Regions
7.0 Example Gestures and Commands
8.0 Remote Controller Use Example
9.0 Example Embodiments
10.0 Hardware Overview
11.0 Extensions and Alternatives
1.0 Overview
In one embodiment, a gesture is detected in a particular region of a touch screen interface of a device. The gesture need not select or move any visual object within the particular region. For example, the gesture may be detected in a blank box, over a video, over instructional information for performing gestures, and so on. A video playback command associated with the gesture can be identified, and an action corresponding to the video playback command can be determined. The action may be performed on the same device that detected the gesture, or on a different device communicatively coupled to the device that detected the gesture.
In one embodiment, parallel or identical gestures may be performed concurrently on a touch screen interface using a multi-input tool (e.g., multiple fingers). Based on the number of gestures detected, an action may be selected. For example, the number of gestures may be used to select a particular entry from a menu or to identify a command.
Although specific components are described herein to perform the method steps, in other embodiments, means or mechanisms representing specific components may also perform the method steps. Further, while aspects of the invention are described with components in one system, the invention may be practiced with components distributed across multiple systems. Embodiments of the present invention also include any system comprising means for performing the method steps described herein. Embodiments of the present invention also include computer-readable media having instructions that, when executed, cause performance of the methods described herein.
2.0 System Architecture
Although a particular computer architecture is described herein, other embodiments of the invention may be applied to any architecture that is capable of being used to perform the functions described herein.
FIG. 1 is a block diagram illustrating an example system (100) in accordance with one or more embodiments. The example system (100) includes one or more components that function as content sources, a touch screen interface device, a multimedia device (e.g., a device that plays audio and/or video content), and/or a content management device. Each of these components is presented to clarify the functionality described herein and is not necessary to implement one or more embodiments.
Components not shown in fig. 1 may also be used to perform the functions described herein. The functions performed by one described element may alternatively be performed by another element.
The example system (100) may include one or more of the following: an input device (110), a multimedia device (140), and a data repository (150). One or more of the devices illustrated herein may be combined into a single device or further divided into multiple devices. For example, the input device (110) and the multimedia device (140) may be implemented on a single device. The multimedia device (140) may be configured to play audio and/or video content. The multimedia device (140) may also be configured to display one or more still images. In another example, the input device (110) may be used as a remote control for detecting gesture-based commands related to content being displayed on a separate multimedia device (140). The input device (110) may communicate directly with the multimedia device (140) or may communicate with an intermediary device (not shown). The intermediary device may, for example, serve as a content source or a media management device for the multimedia device (140). For clarity, a network bus (102) is shown connecting all of the components within the system (100). The network bus (102) may represent any local area network, intranet, internet, etc. The network bus (102) may include wired and/or wireless segments. Not all components shown as communicatively connected are necessarily communicatively connected to all other components within the system (100).
In one embodiment, the input device (110) may include a touchscreen interface (115) configured to detect one or more gestures described herein. The input device (110) may be configured to detect gestures, paths of gestures, speeds of gestures, accelerations of gestures, directions of gestures, and the like.
In one example, the input device (110) may include a resistive system in which an electrical current runs through two layers that make contact at the spot/area touched on the touch screen interface (115). The coordinates of the contact point or spot may be compared with gesture information stored in the data repository (150) to identify a gesture performed by the user on the touch screen interface (115). In another example, the input device (110) may include a capacitive system with a layer that stores electrical charge, a portion of which is transferred to the user where the user touches the touch screen interface (115). In another example, the input device (110) may include a surface acoustic wave system with two transducers, where an electrical signal is sent from one transducer to the other. Any interruption of the electrical signal (e.g., due to a user touch) may be used to detect a contact point on the touch screen interface (115). For example, the input device (110) may be configured to first detect an initial user touch directed to a visual representation of data displayed on the touch screen interface.
In one embodiment, the input device (110) may include hardware configured to receive data, transmit data, or otherwise communicate with other devices in the system (100). For example, the input device (110) may be configured to detect a gesture performed by a user and perform a video playback action associated with the gesture. In another embodiment, the input device (110) may include functionality to send information associated with the gesture (such information may be referred to herein interchangeably as "metadata"). For example, the input device (110) may be configured to transmit information containing a chronological sequence of contact points detected on the touch screen interface (115).
In one embodiment, the input device (110) may include one or more of the following: read-only memory (ROM), a central processing unit (CPU), random access memory (RAM), an infrared control unit, a key scan circuit, a keyboard, non-volatile memory (NVM), one or more microphones, a general purpose input/output (GPIO) interface, a speaker, key transmit indicators, a radio, an infrared transmitter (IR blaster), a display, a radio frequency (RF) antenna, a network card (also known as a network adapter, network interface controller (NIC), network interface card, LAN adapter, or Ethernet card), and/or any other component capable of receiving information over a network. In one embodiment, the input device (110) may be configured to communicate with one or more devices through wired and/or wireless segments. For example, the input device (110) may communicate wirelessly through one or more of: radio waves (e.g., Wi-Fi signals, Bluetooth signals), infrared waves, any other suitable frequency in the electromagnetic spectrum, a network connection (e.g., intranet, internet, World Wide Web, etc.), or any other suitable method.
In one embodiment, the input device (110) generally represents any device that may be configured to detect gestures as user input. A user, including any operator of the input device (110), may perform a gesture by touching the touchscreen interface (115) on the input device (110). For example, the user may perform the gesture by tapping the touch screen interface (115) with a finger or sliding a finger across the touch screen interface (115).
For clarity, the examples described herein may refer to a particular input tool (e.g., a user's finger) performing a gesture. However, according to one or more embodiments, any input tool, including but not limited to a stylus, a user's finger, a pen, etc., may be used to perform gestures.
Touching or contacting the touch screen interface (115), as referred to herein, may include hovering a finger (or other input tool) over the touch screen interface (115), without actually touching it, in such a way that the touch screen interface (115) detects the finger (e.g., due to a transfer of charge at a location on the touch screen interface (115)).
3.0 Gestures
In one embodiment, a tap gesture may be performed by touching a particular location on the touch screen interface (115) and then releasing contact with the touch screen interface (115). The tap gesture may be detected by detecting contact with the touch screen interface (115) at a particular location, followed by detecting that the contact is released.
A tap gesture may refer to a gesture performed using one or more fingers. For example, a two-finger tap may be performed by concurrently touching two locations on the touch screen interface (115) with two fingers and then releasing contact with the touch screen interface (115). A two-finger tap may be detected by concurrently detecting contact at two locations on the touch screen interface (115) and subsequently detecting release of both contacts.
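The two-finger tap detection described above can be sketched in code. This is a minimal illustrative sketch, not part of the original disclosure; it assumes a hypothetical event stream of (timestamp, event_type, point) tuples rather than any particular touch screen API:

```python
def is_two_finger_tap(events, max_duration=0.3):
    """Return True if the event stream describes a two-finger tap:
    two concurrent touch-down points, each followed by a release,
    all within a short tap window (seconds)."""
    downs = [e for e in events if e[1] == "down"]
    ups = [e for e in events if e[1] == "up"]
    if len(downs) != 2 or len(ups) != 2:
        return False
    # Both contacts must begin and end within the tap window.
    start = min(t for t, _, _ in downs)
    end = max(t for t, _, _ in ups)
    return (end - start) <= max_duration
```

A contact held longer than the tap window would instead be treated as the start of a hold or sliding gesture by a fuller recognizer.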
In one embodiment, a sliding gesture may include any action in which the user slides one or more fingers across the surface of the touch screen interface (115). Examples of sliding gestures include a flick gesture, a swipe gesture, or any gesture involving moving a finger along a path on the touch screen interface (115). The path may be a closed shape, such as a circle or square, with the same starting and ending points, or an open shape, such as a right angle, with different starting and ending points. Examples of paths include, but are not limited to, straight lines, curved lines, circles, squares, triangles, corners, and the like.
In one embodiment, a flick gesture may be performed by touching a particular location on the touch screen interface (115) of the input device (110) with a finger (or any other object, e.g., a stylus) and sliding the finger away from that location, maintaining contact with the touch screen interface (115) for only a portion of the sliding action and continuing the sliding action even after contact with the touch screen interface (115) has ended. In one embodiment, the touch screen interface (115) may be configured to detect the proximity of the finger after physical contact with the touch screen interface (115) has ended.
For example, the user may release contact with the touch screen interface (115) while still moving the finger in the direction of the sliding motion, even though additional surface area of the touch screen interface (115) is available in that direction on which the sliding motion could have continued while maintaining contact.
In another embodiment, the flick gesture may include a user touching a particular location on a touchscreen interface (115) of the input device (110) and then sliding a finger beyond an edge of the touchscreen interface (115) while maintaining contact with the touchscreen interface (115). Thus, the user may remain in contact with the touch screen interface (115) (e.g., with a finger) until the finger reaches the edge of the touch screen interface (115) and continues to act in the same direction past the edge of the touch screen interface (115).
A user performing a flick gesture may continue the sliding action after releasing contact with the touch screen interface (115). The input device (110) may detect that contact between the finger and the touch screen interface (115) was released while the finger was still moving, e.g., based on the brief duration of contact at the last contact point. A release detected while the finger is still moving may be determined to be a flick gesture.
In one embodiment, a swipe gesture may be performed by touching a particular location of a touch screen interface (115) of an input device (110) with a finger and sliding the finger off of the particular location while maintaining contact with the touch screen interface (115) during a sliding motion.
In another embodiment, the user may slide a finger along the touch screen interface (115) from a first position to a second position, and thereafter stop by maintaining contact with the second position for a threshold period of time (e.g., one second). The detected continuous contact with the second location may be used to determine that the user has completed the swipe gesture.
In one embodiment, a sliding motion (e.g., a swipe or a flick) may be detected before the sliding motion is completed. For example, a rightward slide gesture may be detected by detecting contact at a first location and then contact at a second location to the right (or generally to the right) of the first location. The user may continue the slide gesture to a third location to the right of the second location; however, the direction of the slide gesture may already be determined using the first and second locations.
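The early direction detection described above, classifying a slide from its first two contact points before the gesture completes, might be sketched as follows. The function name and the screen coordinate convention (x grows rightward, y grows downward) are illustrative assumptions:

```python
def swipe_direction(p1, p2):
    """Classify a slide's direction from the first two contact
    points, before the gesture is finished.  Coordinates follow the
    usual screen convention: x grows right, y grows down."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    # The dominant axis of motion decides the direction class.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

A fuller recognizer would also require a minimum travel distance before committing to a direction, so that jitter in a stationary touch is not misread as a slide.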
In one embodiment, a flick gesture and a swipe gesture (e.g., in the same direction) may be mapped to the same video playback command. Thus, the device may be configured to detect either a flick gesture or a swipe gesture and recognize the same video playback command in response to the detected gesture.
In one embodiment, a flick gesture and a swipe gesture (even in the same direction) may be mapped to different commands. For example, a leftward flick gesture may correspond to a twenty-second rewind command, and a leftward swipe gesture may correspond to a command to select a previously bookmarked scene in the video. Scenes may be marked by a user, for example, or hard-coded into the medium on which selectable scenes are recorded, such as a movie recorded on a Digital Video Disc (DVD).
In one embodiment, a swipe gesture may be performed with multiple input tools used concurrently. For example, the user may slide two fingers on the touch screen interface concurrently. Further, the user may slide both fingers concurrently in parallel (e.g., sliding both fingers in the same direction from left to right).
The term "concurrent" is herein meant to include substantial concurrency. For example, two fingers performing a parallel gesture concurrently may mean that two fingers of different lengths perform the same gesture at slightly different times. For example, one finger may lag in time relative to the other finger for starting and/or ending the gesture. Thus, two fingers may start and end gestures at different start and/or end times.
The term "parallel" is meant herein to include paths in substantially the same direction. The two fingers perform a parallel motion, which here means that the user drags both fingers in the same direction on the touch screen interface. Due to differences in finger length or due to the angle of the hand, two or more fingers performing parallel motions in the same general direction may differ in direction by a few degrees. In one embodiment, the paths along which two parallel gestures are performed may overlap. The term "parallel," as used herein, may refer to any combination of two or more gestures performed in the same general direction.
4.0 Gesture Areas
In one embodiment, the touch screen interface (115) includes a gesture area. The gesture area is at least a portion of a touch screen interface (115) configured to detect a gesture performed by a user. The gesture area may include the entire touch screen interface (115) or a portion of the touch screen interface (115). The gesture area may display a blank box or one or more entries. For example, the gesture area may display a video. In another example, the gesture area may display information about how to perform the gesture.
In one embodiment, gestures may be detected within a gesture area without requiring the user to interact with any visual objects that may be displayed within the gesture area. For example, a swipe gesture on the touch screen interface (115) of a cell phone may be detected in a gesture area that is an empty box on the touch screen interface. In another example, a swipe gesture associated with a rewind command may be detected without touching a progress indicator displayed in the gesture area.
In one embodiment, any visual objects displayed in the gesture area are not necessarily used to detect a gesture or to determine a command related to a gesture. In one embodiment, any visual objects displayed within the gesture area are not selected or dragged by the finger performing the gesture.
In one embodiment, the touch screen interface (115) may include multiple gesture areas. A gesture detected within one gesture area may be mapped to a different command than the same gesture performed in a different gesture area. The device may be configured to identify the area in which a gesture was performed and determine an action based on the gesture and the gesture area in which the gesture was performed.
In one embodiment, when a gesture is detected across multiple gesture areas, one of the multiple gesture areas may be selected by the device. The gesture area in which the gesture was initiated may be identified as the selected gesture area. For example, the user may begin a swipe gesture in a first gesture area and end the swipe gesture in a second gesture area. The swipe gesture is detected as having been initiated within the first gesture area, and in response, a command mapped to the gesture and the first gesture area may be selected. In another example, the gesture area in which the end of the swipe motion is detected may be identified as the intended gesture area. The selected gesture area or intended gesture area may be used to identify the command.
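The area-selection policy described above could be sketched as follows. This is an illustrative sketch only; the rectangular bounding boxes and the region names are hypothetical, not taken from the disclosure:

```python
def select_region(regions, path):
    """Pick the gesture area in which a gesture began.

    `regions` maps an area name to an (x0, y0, x1, y1) bounding box;
    `path` is the ordered list of contact points of the gesture.  The
    area containing the first contact point is treated as the
    selected gesture area."""
    x, y = path[0]
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None
```

Swapping `path[0]` for `path[-1]` would implement the alternative policy mentioned above, in which the area where the swipe ends is treated as the intended gesture area.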
5.0 Commands
In one embodiment, the gesture may be mapped (or associated) to a command. For example, the command mapped to the gesture may be a video playback (playback) command related to playback of a video. The command may relate to playback of video on the device or another device receiving the command.
In one embodiment, the command may specify a video playback speed and direction. For example, the command may specify rewinding at a particular rewind speed or fast-forwarding at a particular fast-forward speed. Examples of other video playback commands include, but are not limited to, pausing video play, continuing video play, replaying an already-played portion of a video, stopping video play, continuing video play at a particular play position, playing a video in slow motion, browsing a video in a frame-stepping manner, playing a video from the beginning, playing one or more videos from a next list, playing a video forward from a particular scene, marking a play position of a video, stopping play and continuing play at the marked position, or rating a video.
In one embodiment, the command may select a particular option from a list of options. For example, a list of available media content may be displayed on the screen, and the command may select particular ones of the available media content. In another example, a list of configuration settings may be displayed and the command may select a particular setting for change.
6.0 Detecting Gestures Within Gesture Regions
FIG. 2 illustrates a flow diagram for detecting a gesture within a gesture area. One or more of the steps described below may be omitted, repeated, and/or performed in a different order. Accordingly, the particular arrangement of steps shown in FIG. 2 should not be construed as limiting the scope of the invention.
In one or more embodiments, detecting the gesture may include detecting interface contact at an initial location as part of the detected gesture (step 202). The initial contact on the touch screen interface may be made with a user's finger, a stylus, or any other object that may be used to perform a gesture on the touch screen interface. The initial contact with the touch screen interface may include a quick touch at the initial location (e.g., a tap gesture) or a touch held at the initial location for any period of time (e.g., one millisecond, one second, two seconds, etc.). The initial contact with the touch screen interface may be brief, as when the finger is already moving in a particular direction. For example, a finger may move through the air without contact and thereafter make initial contact with a portion of the touch screen interface during the movement.
In one embodiment, initial contact, as referred to herein, may include a finger (or other object) coming close enough to the touch screen interface that the touch screen interface detects the finger. For example, in a device with a capacitive system having a layer that stores electrical charge, a portion of the charge may be transferred to the user both where the user touches the touch screen interface and where the user merely hovers close to the touch screen interface without touching it. Accordingly, initial contact or maintained contact, as referred to herein, may include a user hovering a finger or other object over the touch screen interface.
In one embodiment, the initial contact on the touch screen interface does not select any visual object displayed on the touch screen interface. The initial contact may be made when no visual object is displayed. The initial contact may also be made over the display of a visual object without selecting the visual object. For example, the initial contact may be made on a touch screen interface that is displaying a user-selected background image on a cell phone. In another example, the initial contact may be made on a blank screen. The initial contact may also be detected over a television program being played on a tablet computer.
In one or more embodiments, detecting a gesture may further include detecting interface contact at additional locations on the touch screen interface (step 204). For example, detecting a flick gesture or a swipe gesture may include chronologically detecting interface contacts at additional locations along a path from the initial contact location. For instance, interface contacts may be detected continuously along a path extending in a leftward direction from the initial contact location on the touch screen interface.
Contact along a path away from the location of the initial point of contact may be referred to herein as a swipe gesture. In one or more embodiments, a speed or direction of the swipe gesture may be determined. For example, contact at two or more locations of the interface (such as an initial contact point and a second contact point along a path of the swipe gesture) may be used to determine a direction and/or speed of the swipe gesture. The contact at multiple points may be used to calculate the acceleration of the swipe gesture.
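The speed and acceleration estimation described above could be sketched as follows, assuming timestamped contact samples. Using only the first, middle, and last samples is an illustrative simplification; a production recognizer would typically smooth over all samples:

```python
import math

def swipe_kinematics(samples):
    """Estimate speed (px/s) and acceleration (px/s^2) of a swipe
    from timestamped contact samples [(t, x, y), ...], using the
    first, middle, and last samples."""
    (t0, x0, y0) = samples[0]
    (t1, x1, y1) = samples[len(samples) // 2]
    (t2, x2, y2) = samples[-1]
    # Average speed over each half of the path.
    v1 = math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
    v2 = math.hypot(x2 - x1, y2 - y1) / (t2 - t1)
    # Overall speed, and acceleration from the change between halves.
    speed = math.hypot(x2 - x0, y2 - y0) / (t2 - t0)
    accel = (v2 - v1) / ((t2 - t0) / 2)
    return speed, accel
```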
In one or more embodiments, a gesture may be recognized based on detected contact at one or more locations on the touch screen interface (step 206). For example, detecting concurrent contact at three locations on a remote control interface, followed by release of contact at all three locations, may be recognized as a three-finger tap gesture. In one embodiment, detecting the gesture may include identifying the path that the contact detected on the touch screen follows. For example, a circular gesture may be recognized in response to detecting contact along a circular path on the touch screen interface. A flick gesture or a swipe gesture may be recognized based on the chronological sequence of contact points on the touch screen interface.
In one or more embodiments, recognizing the gesture may include determining a number of concurrent parallel gestures (step 208). For example, the initial contact may be detected concurrently at multiple locations on the touch screen interface. After the initial contact at each initial position, the contact along the path from the initial position may be detected. If the paths are determined to be parallel, the number of paths may be identified to determine the number of concurrent parallel gestures.
In one embodiment, the number of concurrent parallel gestures may be determined based on the number of paths that conform to a known configuration. For example, if a path has at least a first contact point and a subsequent second contact point within 10 degrees of a horizontal line to the right from the first contact point, the path may be determined to correspond to a swipe gesture to the right. The number of detected gestures, which correspond to paths that meet the same criteria over a particular time period, may be counted to determine the number of concurrent parallel gestures. In one embodiment, other methods not described herein may be used to determine the number of concurrent parallel gestures.
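The criterion described above (a second contact point within 10 degrees of a rightward horizontal line from the first) might be implemented along these lines. The function is a hypothetical sketch of one such criterion, not the patented implementation:

```python
import math

def count_parallel_right_swipes(paths, tolerance_deg=10):
    """Count paths that qualify as rightward swipes: the second
    contact point lies within `tolerance_deg` of horizontal-right
    from the first contact point."""
    count = 0
    for path in paths:
        (x0, y0), (x1, y1) = path[0], path[1]
        # Angle of the path relative to horizontal-right, in degrees.
        angle = math.degrees(math.atan2(y1 - y0, x1 - x0))
        if abs(angle) <= tolerance_deg:
            count += 1
    return count
```

The returned count is the number of concurrent parallel gestures, which, as described below, can then be mapped to a menu entry or playback speed.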
In one embodiment, a command is determined based on the recognized gesture (step 210). The command may be determined while the gesture is still being performed or after the gesture is completed.
In one embodiment, determining a command may include determining that a particular detected gesture is mapped to a command in a database. For example, a two-finger swipe to the right may be looked up in a command database to identify the command associated with the two-finger swipe. In another example, a two-finger flick toward the bottom of the gesture area may be associated with a command to select the second menu item from the items currently displayed in a menu.
In one embodiment, the number of parallel fingers in a gesture may be used to determine the playback speed at which multimedia content is played. For example, detecting two parallel gestures may be mapped to a command for a playback speed that is twice the normal playback speed.
In one embodiment, the direction of the gesture and the number of parallel fingers in the gesture may be combined to determine the playback speed. For example, a concurrent swipe of two fingers from the right side of the screen to the left side of the screen may be mapped to rewinding at twice the normal speed. In another example, a concurrent swipe of two fingers from the left side of the screen to the right side may be mapped to fast-forwarding at twice the normal playback speed.
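The combination of direction and finger count into a playback command could be sketched as a simple mapping. The command names and the finger-count-as-speed-multiplier rule are illustrative assumptions consistent with the examples above, not the disclosed implementation:

```python
def playback_command(direction, finger_count):
    """Map a swipe direction and the number of concurrent parallel
    fingers to a (playback action, speed multiplier) pair."""
    if direction == "left":
        return ("rewind", finger_count)
    if direction == "right":
        return ("fast_forward", finger_count)
    # Other directions fall back to normal play in this sketch.
    return ("play", 1)
```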
In one embodiment, the command may include resuming video play at a particular tag (e.g., a user-defined tag or a manufacturer-defined tag). The number of fingers performing the concurrent parallel gesture may be used to select the tag. For example, a two-finger downward flick may be detected, and in response, video play may continue at the second tag after the current play position.
In one embodiment, determining a command may include identifying a device corresponding to the command. For example, a device associated with a command may be identified based on the gesture and/or the gesture area in which the gesture was detected.
In one embodiment, the action corresponding to the command is performed (step 212). The action may be performed by the device that detected the command. For example, if a gesture for a fast forward command is detected on a handheld touchscreen telephone that is playing a video, the handheld touchscreen telephone plays the video in fast forward mode.
In one embodiment, the action corresponding to the command may include sending information related to the command to another device. For example, a gesture may be detected on a touch screen remote control. Information related to the gesture (e.g., information identifying the gesture or information identifying a command associated with the gesture) is then sent to a DVD player. The DVD player may then perform the corresponding action. If the command is to pause video play, the DVD player may pause video play on the display screen.
7.0 example gestures and commands
FIG. 3 illustrates a screenshot of an example screen for an input device configured to detect gestures. Gestures, commands, mappings between gestures and commands, gesture regions, visual objects, and any other items related to FIG. 3 are examples and should not be construed as limitations on scope. One or more of the items described with respect to fig. 3 need not be implemented, and other items described may be implemented in accordance with one or more embodiments.
FIG. 3 shows an example interface (300) with a circular gesture area (305) and a square gesture area (310). Any gesture detected in the circular gesture area (305) is mapped to a navigation command. For example, two finger taps detected in the circular gesture area (305) may be associated with such a command: a second item is selected on the currently displayed menu. If the second item is a folder, the items in the folder may be displayed in response to detecting the two finger tap.
In one embodiment, the square gesture area (310) may identify commands associated with one or more gestures detected within the square gesture area (310). For example, the square gesture area (310) may include a graphic that represents: a single finger swipe to the left gesture corresponds to a rewind command, a single finger tap gesture corresponds to a pause command, a single finger swipe to the right gesture corresponds to a fast forward command, a two finger swipe to the left gesture corresponds to a ten second rewind, a two finger tap gesture corresponds to a slow motion playback command, and a two finger swipe to the right corresponds to a jump to the next tag command.
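The gesture-to-command associations described for the square gesture area (310) amount to a lookup table. The sketch below mirrors the example mappings from the preceding paragraph; representing them as a dictionary keyed by gesture type, finger count, and direction is an implementation assumption, not a requirement of the specification.

```python
# Illustrative lookup table for the square gesture area (310).
# Keys are (gesture type, number of fingers, swipe direction or None);
# values are hypothetical command names.

SQUARE_AREA_COMMANDS = {
    ("swipe", 1, "left"):  "rewind",
    ("tap",   1, None):    "pause",
    ("swipe", 1, "right"): "fast_forward",
    ("swipe", 2, "left"):  "rewind_ten_seconds",
    ("tap",   2, None):    "slow_motion",
    ("swipe", 2, "right"): "jump_to_next_tag",
}

def command_for(gesture, fingers, direction=None):
    """Return the command mapped to a detected gesture, or None."""
    return SQUARE_AREA_COMMANDS.get((gesture, fingers, direction))

print(command_for("swipe", 2, "left"))  # rewind_ten_seconds
print(command_for("tap", 1))            # pause
```

A per-area table like this also accommodates the earlier point that the same gesture may map to different commands in different gesture areas: each area simply consults its own table.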
In one embodiment, the example interface (300) may include a progress indicator (315) that is separate from the circular gesture area (305) and the square gesture area (310). The progress indicator (315) may include a current play position, a tag, a current playback speed, etc. of the video. For example, the progress indicator (315) may include a symbol representing the current playback speed (e.g., play, fast forward at 1x, pause, rewind at 2x, etc.).
In one embodiment, the symbol may be displayed in response to a command. For example, in response to a 3x rewind command, a symbol indicating a 3x rewind may be displayed while the multimedia content is rewound at 3x, for example by displaying frames in reverse order at three times the normal playback speed. However, the progress indicator (315) need not necessarily be selected by a gesture associated with the video playback command. In one embodiment, no visual object in the example interface (300) is necessarily selected when a user performs a gesture within the example interface (300).
In one embodiment, the example interface (300) may also include tools (e.g., drop-down boxes) to select a particular media device to be controlled by the detected gesture. In one embodiment, the example interface (300) may include options to toggle between input mechanisms (e.g., input-based gestures, buttons, text boxes, radio boxes, etc.).
8.0 remote control use example
In one embodiment, a remote control device communicates with a media device (e.g., a digital video recorder, a digital video disc player, a media management device, a video recorder, a blu-ray player, etc.). The remote control device may communicate with the media device via wired and/or wireless communication segments. For example, the remote control device may communicate via a network (e.g., the internet, intranet, etc.), radio, bluetooth, infrared, etc.
In one embodiment, the remote controller displays a progress indicator (315) shown in the screenshot (300) of FIG. 3. The progress indicator (315) may indicate a play position of the multimedia content displayed on the separate multimedia device. The progress indicator (315) may display the exact play position or the approximate play position. For example, the progress indicator (315) can include a slider (320) displayed along the trick play bar (330) to indicate the play position. In one embodiment, a particular play position may be indicated by time (e.g., 8: 09). The time may indicate, for example, the actual streaming time at which the content is currently playing or may indicate an offset from the beginning of the content.
In one embodiment, information related to the playing location of the multimedia content may be obtained from a media device (e.g., a digital video recorder, a cable box, a computer, a media management device, a digital video disc player, a multimedia player, an audio player, etc.). For example, a remote control device communicatively connected with a media device may be configured to receive frame information related to a particular frame displayed (played) by the media device. In one embodiment, the media device may periodically send frame information to the remote control device. Alternatively, the remote control device may periodically request frame information from the media device. The remote device uses this information to position the slider (320) along the trick play bar (330). The remote control device may also receive information from the media device indicating the extent of a cache bar (325), which indicates how much of the multimedia content the media device has stored or recorded. If the media device is in the process of recording or caching the multimedia content, the cache bar (325) grows as more content is recorded or cached by the media device. If the media device is playing previously recorded multimedia content, the cache bar (325) extends the full length of the trick play bar (330).
Another example may include a remote control device configured to receive a timestamp closest to the frame being played. The remote control device may also be configured to use a step function, for example, using the next or previous frame relative to the timestamp if no frame exactly matches the timestamp. Another example may include a remote control device that continuously receives images of the progress indicator (e.g., bitmaps, display instructions, etc.) from the media device for display on the remote control device. In one embodiment, the remote control device may receive a particular starting location and a display rate used by the remote control device to determine the playback location of the multimedia content. For example, the digital video recorder may send the initial play position in the multimedia content along with a progress rate (e.g., the slider (320) change per unit time, a frame rate, etc.) to the remote control device. The remote control device may use this information to first display a progress indicator based on the initial play position and then calculate subsequent positions over time.
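The extrapolation scheme just described, where the media device reports an initial play position and a progress rate once and the remote control computes subsequent slider positions locally, can be sketched as follows. Function and parameter names, and the use of seconds as the unit, are assumptions for illustration.

```python
# Hedged sketch: the remote control extrapolates the current play
# position from (a) the initial position reported by the media device,
# (b) a progress rate (1.0 = normal play, 0.0 = paused, -2.0 = 2x
# rewind), and (c) wall-clock time elapsed since the report.

def extrapolate_position(initial_position_s, rate, elapsed_s):
    """Estimate the current play position in seconds, clamped at 0."""
    return max(0.0, initial_position_s + rate * elapsed_s)

# Playback reported at 8:09 (489 s) at normal speed; 30 s later the
# remote draws the slider at 519 s without querying the media device.
print(extrapolate_position(489.0, 1.0, 30.0))  # 519.0
```

Between trick-play events this avoids continuous polling; when a trick play function changes the rate or position (as discussed below), the media device would send a fresh (position, rate) pair and extrapolation restarts from there.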
In one embodiment, the slider (320) and the displayed video are out of synchronization when a trick play function is performed (e.g., when a ten second rewind is performed). In response to the trick play function, updated information regarding the new play position may be provided to the remote control device.
In one embodiment, the remote control device may further receive an update that selects a particular play position or indicates a change in the rate of progress. For example, the user may submit one or more commands to pause the playing of the multimedia content at the current play position, then jump back from the current play position for 10 seconds, and then continue playing. In this case, the media device may provide information to the remote control device to pause the slider (320), display a new play position corresponding to 10 seconds before the current play position by moving the slider (320), and then continue to periodically update the slider (320).
In one embodiment, the slider (320) may be updated when the remote control device is activated. For example, when a user picks up or touches the remote control device, the remote control device may request playback position information from the media device. For example, the remote control device may include an accelerometer configured to detect motion and/or a touch screen interface configured to detect touch. In response, the media device may provide playback position information to the remote control device. The remote control device may then display a slider (320) indicating a current play position of the multimedia content based on the play position information received from the media device.
In one embodiment, the remote control device may continuously receive information related to the play position of the multimedia content such that the remote control device continuously updates the slider (320). In another embodiment, information related to the play position of the multimedia content may be received periodically, and the remote control device may update the slider each time the information is received.
In one embodiment, the remote control device may send multimedia content to the multimedia device for display by the multimedia device. For example, the remote control device may obtain a video stream over the internet and send the video stream to the multimedia device for display on the multimedia device. In this example, the remote control device may determine the display position of the slider (320) based on play position information determined by the remote control device itself, e.g., the remote control device may calculate play position information based on frames sent from the remote control device to the multimedia device.
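When the remote control device is itself the source of the stream, as in the last example above, it can derive the play position from the number of frames it has already sent. The sketch below illustrates this; the 30 fps default and the function name are assumptions.

```python
# Minimal sketch: a remote control streaming video to a multimedia
# device computes the play position from the frames it has transmitted,
# rather than querying the multimedia device.

def position_from_frames(frames_sent, frames_per_second=30.0):
    """Play position in seconds implied by the frames already sent."""
    return frames_sent / frames_per_second

# After sending 900 frames at 30 fps, the slider (320) is drawn at 30 s.
print(position_from_frames(900))  # 30.0
```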
9.0 example embodiment
In one embodiment, a method includes detecting, at a particular region on a touch screen interface of a device, a swipe gesture from a first location of the particular region to a second location of the particular region; identifying a video playback command based at least on the swipe gesture; performing an action associated with the video playback command; wherein the method is performed by at least one device.
In one embodiment, a swipe gesture is detected without detecting selection of any video progress indicators displayed in a particular region. A swipe gesture in a particular region may be detected while at least a portion of the video is displayed in the particular region. A swipe gesture in a particular region may be detected while information on how to perform one or more gestures is displayed in the particular region.
In one embodiment, identifying the video playback command is further based on a particular one of the plurality of regions of the touch screen interface in which the swipe gesture was detected.
In one embodiment, performing the action includes: the first device sends information to the second device, the information based on the video playback command. Performing the action associated with the video may include performing the action on the same device that detected the swipe gesture. The video playback command may select a play speed and direction.
In one embodiment, the swipe gesture includes a swipe gesture from a first location to a second location. The swipe gesture may include a flick gesture starting from a first location.
In one embodiment, the video playback command is for one or more of: pausing the video playing; continuing to play the video; playing back the part of the video which is already played; stopping video playing; stopping video playing and continuing video playing at a specific playing position; playing the video in slow motion; playing the video from the head; playing one or more videos from the next playlist; playing the video from a specific scene forward; marking a playing position in the video; stopping playing and continuing playing at the marked position; or for video rating.
In one embodiment, a method comprises: concurrently detecting a plurality of parallel gestures on a touch screen interface of a device; determining a number of a plurality of parallel gestures; selecting a command from a plurality of commands based on a number of parallel gestures; an action associated with the command is performed.
In one embodiment, selecting the command includes selecting a menu option based on a number of parallel gestures. The plurality of parallel gestures may include a plurality of parallel swipe gestures performed in the same direction.
In one embodiment, determining the number of the plurality of parallel gestures includes determining a number of tap gestures concurrently performed on the touch screen interface.
Although specific components are described herein as performing the method steps, in other embodiments, means or mechanisms representing specific components may also perform the method steps. Further, while some aspects of the invention are discussed with respect to components in one system, the invention may be implemented with components distributed across multiple systems. Embodiments of the present invention also include any system comprising means for performing the method steps described herein. Embodiments of the present invention also include computer-readable media having instructions that, when executed, cause performance of the methods described herein.
10.0 hardware overview
According to one embodiment, the techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more Application Specific Integrated Circuits (ASICs) or Field Programmable Gate Arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination thereof. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, portable computer systems, handheld devices, network devices, or any other device that incorporates hard-wired and/or programmed logic to implement the techniques.
For example, FIG. 4 is a block diagram that illustrates a computer system 400 by which an embodiment of the invention may be implemented. Computer system 400 includes a bus 402 or other communication mechanism for communicating information, and a hardware processor 404 coupled with bus 402 for processing information. Hardware processor 404 may be, for example, a general purpose microprocessor.
Computer system 400 also includes a main memory 406, such as a Random Access Memory (RAM) or other dynamic storage device, coupled to bus 402 for storing information and instructions to be executed by processor 404. Main memory 406 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 404. Such instructions, when stored in a non-transitory storage medium accessible to processor 404, make computer system 400 a special-purpose machine that is customized to perform the operations specified in the instructions.
Computer system 400 also includes a Read Only Memory (ROM) 408 or other static storage device coupled to bus 402 for storing static information and instructions for processor 404. A storage device 410, such as a magnetic disk or optical disk, is provided and coupled to bus 402 for storing information and instructions.
Computer system 400 may be coupled via bus 402 to a display 412, such as a Cathode Ray Tube (CRT), for displaying information to a computer user. An input device 414, including alphanumeric and other keys, is coupled to bus 402 for communicating information and command selections to processor 404. Another type of user input device is a pointer controller 416, such as a mouse, a trackball, or pointer direction keys for communicating direction information and command selections to processor 404 and for controlling pointer movement on display 412. The input device typically has two degrees of freedom in two axes, a first axis (e.g., x-axis) and a second axis (e.g., y-axis), which enables the device to specify positions in a plane.
Computer system 400 may implement the techniques described herein using custom hardwired logic, one or more ASICs or FPGAs, firmware, and/or programming logic that, in combination with the computer system, causes or programs computer system 400 to become a special-purpose machine. According to one embodiment, computer system 400 performs the techniques described herein in response to processor 404 executing one or more sequences of one or more instructions contained in main memory 406. Such instructions may be read into main memory 406 from another storage medium, such as storage device 410. Execution of the sequences of instructions contained in main memory 406 causes processor 404 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
The term "storage medium" as used herein refers to any non-transitory medium that stores data and/or instructions that cause a machine to operate in a specific manner. Such storage media may include non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 410. Volatile media includes dynamic memory, such as main memory 406. Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid state disk, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
A storage medium is distinct from, but may be used in conjunction with, a transmission medium. The transmission medium participates in transmitting information between the storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 402. Transmission media can also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 404 for execution. For example, the instructions may initially be carried on a magnetic disk or a solid state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 400 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 402. Bus 402 carries the data to main memory 406, from which processor 404 retrieves and executes the instructions. The instructions received by main memory 406 may optionally be stored on storage device 410 either before or after execution by processor 404.
Computer system 400 also includes a communication interface 418 coupled to bus 402. Communication interface 418 provides a two-way data communication coupling to a network link 420, and network link 420 is connected to a local network 422. For example, communication interface 418 may be an Integrated Services Digital Network (ISDN) card, a cable modem, a satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. In another example, communication interface 418 may be a Local Area Network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, communication interface 418 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
Network link 420 typically provides data communication through one or more networks to other data devices. For example, network link 420 may provide a data connection through local network 422 to a host computer 424 or to data equipment operated by an Internet Service Provider (ISP) 426. ISP 426 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the "Internet" 428. Local network 422 and Internet 428 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 420 and through communication interface 418, which carry the digital data to and from computer system 400, are exemplary forms of transmission media.
Computer system 400 can send messages and receive data, including program code, through the network(s), network link 420 and communication interface 418. In the Internet example, a server 430 might transmit a requested code for an application program through Internet 428, ISP 426, local network 422 and communication interface 418.
The received code may be executed by processor 404 as it is received, and/or stored in storage device 410, or other non-volatile storage for later execution.
In one embodiment, an apparatus is a combination of one or more hardware and/or software components described herein. In one embodiment, a subsystem for performing a step is a combination of one or more hardware and/or software components that may be configured to perform the step.
11.0 extensions and substitutions
In the foregoing specification, embodiments of the invention have been described with reference to numerous specific details that may vary from implementation to implementation. Thus, the sole and exclusive indicator of what is the invention, and is intended by the applicant to be the invention, is the set of claims that issue from this application, in the specific form in which such claims issue, including any subsequent correction. Any definitions expressly set forth herein for terms contained in such claims shall govern the meaning of such terms as used in the claims. Hence, no limitation, element, property, feature, advantage or attribute that is not expressly recited in a claim should limit the scope of such claim in any way. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims (24)

1. A method, comprising:
detecting a sliding gesture from a first position of a specific area to a second position of the specific area in the specific area on a touch screen interface of a device;
based at least on the swipe gesture, identifying a video playback command for the video;
performing an action associated with the video playback command.
2. The method of claim 1, wherein the swipe gesture is detected without detecting selection of any video progress indicators displayed within the particular region.
3. The method of claim 1, wherein the swipe gesture is detected in the particular region and concurrently, at least a portion of the video is displayed in the particular region.
4. The method of claim 1, wherein the swipe gesture is detected in the particular region while information about how to perform one or more gestures is displayed in the particular region.
5. The method of claim 1, wherein identifying the video playback command comprises:
identifying the particular region from a plurality of regions of the touch screen interface in which the swipe gesture was detected;
wherein identifying the video playback command comprises selecting the video playback command from a plurality of video playback commands associated with the particular zone.
6. The method of claim 1, wherein performing the action comprises the first device sending information to the second device, the information based on the video playback command.
7. The method of claim 1, wherein performing the action associated with the video comprises performing the action on the same device as the device that detected the swipe gesture.
8. The method of claim 1, wherein the video playback command selects a play speed and direction.
9. The method of claim 1, wherein the swipe gesture comprises a swipe gesture from the first location to a second location.
10. The method of claim 1, wherein the swipe gesture comprises a flick gesture starting from the first location.
11. The method of claim 1, wherein the video playback command is for one or more of:
pausing the playing of the video;
continuing playing the video;
playing back the played part of the video;
stopping playing of the video;
stopping the playing of the video and continuing the playing of the video at a specific playing position;
playing the video in slow motion;
browsing videos by frame stepping;
playing the video from the head;
playing one or more videos from the next playlist;
playing the video from a specific scene forward;
marking a playing position in the video;
stopping playing and continuing playing at the marked position; or
The video is graded.
12. A method, comprising:
concurrently detecting a plurality of parallel gestures on a touch screen interface of a device;
determining a number of the plurality of parallel gestures;
selecting a command from a plurality of commands based on the number of the plurality of parallel gestures;
an action associated with the command is performed.
13. The method of claim 12, wherein selecting the command comprises selecting a menu option based on a number of the plurality of parallel gestures.
14. The method of claim 12, wherein the plurality of parallel gestures comprise a plurality of parallel swipe gestures performed in the same direction.
15. The method of claim 12, wherein determining the number of the plurality of parallel gestures comprises determining a number of tap gestures performed concurrently on the touch screen interface.
16. The method of claim 12, further comprising determining a playback speed for multimedia content play based on a number of the plurality of parallel gestures.
17. A method, comprising:
concurrently detecting, in a left-to-right direction, a plurality of parallel gestures on a touchscreen interface of a remote control device;
determining a number of the plurality of parallel gestures;
selecting a playback speed from a plurality of playback speeds based on the number of the plurality of parallel gestures;
the multimedia content on the multimedia device is played at the selected playback speed.
18. The method of claim 17, wherein the plurality of playback speeds comprises two or more fast forward speeds.
19. The method of claim 17, wherein the multimedia device comprises an audio device.
20. The method of claim 17, wherein the multimedia device comprises a video device.
21. A method, comprising:
concurrently detecting a plurality of parallel gestures on a touchscreen interface of a remote control device in a right-to-left direction;
determining a number of the plurality of parallel gestures;
selecting a rewind speed from a plurality of rewind speeds based on the number of the plurality of parallel gestures;
the multimedia content is played in a rewind mode on the multimedia device at the selected rewind speed.
22. The method of claim 21, wherein the plurality of rewind speeds comprises two or more rewind speeds.
23. A computer-readable storage medium comprising a sequence of instructions which, when executed by one or more processors, causes performance of the steps recited in any one of claims 1-22.
24. An apparatus, comprising:
one or more processors;
the device is configured to perform the steps of any of claims 1-22.