HK1193662B - Edge gesture
- Publication number
- HK1193662B
- Authority
- HK
- Hong Kong
- Prior art keywords
- gesture
- edge
- user interface
- display
- display edge
Description
Background
Conventional techniques for selecting a user interface that is not currently exposed on a display are often confusing, take up valuable display space, are not universally applicable across different devices, or provide a poor user experience.
Some conventional techniques enable selection of a user interface through on-screen controls in a taskbar, within a floating window, or on a window frame, for example. However, these on-screen controls take up valuable display real estate and can annoy the user by requiring the user to find and select the correct control.
Some other conventional techniques enable the user interface to be selected through hardware such as hot keys and buttons. These techniques require the user to remember which key, combination of keys, or hardware button to select. Even so, the user often accidentally selects a key or button. Furthermore, in many cases, hardware selection techniques cannot be universally applied because hardware on a computing device may vary by device model, generation, vendor, or manufacturer. In such cases, the techniques will not work, or will work differently on different computing devices. This exacerbates the problem of users having to remember the correct hardware, as many users own multiple devices and therefore may need to remember different hardware selections for different devices. Still further, for many computing devices, hardware selection forces the user to engage the computing device outside of the user's normal stream of interactions, such as when a touch-screen device requires the user to change his or her mental and physical orientation from display-based interactions to hardware-based interactions.
Disclosure of Invention
This document describes techniques and apparatuses that enable edge gestures. In some embodiments, these techniques and apparatuses enable selection of a user interface that is not currently exposed on the display through edge gestures that are easy to use and remember.
This summary is provided to introduce simplified concepts for enabling edge gestures, which are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining the scope of the claimed subject matter. Techniques and/or means to enable edge gestures may be referred to herein, individually or in combination, as "techniques" as the context permits.
Drawings
Embodiments enabling edge gestures are described with reference to the following figures. The same reference numbers are used throughout the drawings to reference like features and components:
FIG. 1 illustrates an example system in which techniques enabling edge gestures can be implemented.
FIG. 2 illustrates an example method for enabling edge gestures based on the gesture being approximately perpendicular to the edge at which it begins.
Fig. 3 illustrates an example tablet computing device with a touch-sensitive display presenting an immersive interface.
Fig. 4 illustrates the example immersive interface of fig. 3 along with showing example edges.
Fig. 5 illustrates the example immersive interface of figs. 3 and 4, along with an angular deviation line from a perpendicular line and a line from a starting point to a later point of the gesture.
Fig. 6 shows the edges of the immersive interface shown in fig. 4, along with two regions within the right edge.
FIG. 7 illustrates an application selection interface presented by the system interface module in response to an edge gesture and over the immersive interface and web page of FIG. 3.
FIG. 8 illustrates an example method for enabling an edge gesture that includes determining an interface to present based on some factor of the gesture.
FIG. 9 illustrates an example method for expanding a user interface presented in response to an edge gesture, ceasing its presentation, or presenting an additional user interface.
Fig. 10 shows a laptop computer with a touch-sensitive display having a window-based email interface and two immersive interfaces.
FIG. 11 illustrates the interface of FIG. 10, along with showing two gestures having a start point, a later point, and one or more successive points.
FIG. 12 illustrates the window-based email interface of FIGS. 10 and 11, along with an email processing interface that is presented in response to an edge gesture.
FIG. 13 illustrates the interface of FIG. 12, along with showing additional email option interfaces presented in response to a gesture determined to have successive points at a preset distance from the edge.
FIG. 14 illustrates an example device in which techniques to enable edge gestures can be implemented.
Detailed Description
SUMMARY
This document describes techniques and apparatuses that enable edge gestures. These techniques enable a user to quickly and easily select interfaces that are not currently exposed on the user's device, as well as other operations.
Consider the case where a user is watching a movie on a tablet computing device. Assume that the movie is playing on an immersive interface that occupies the entire display and that the user wants to check her social networking web pages without stopping the movie. The described techniques and apparatuses enable her to select other interfaces with a simple swipe gesture starting from an edge of her display. She may swipe and drag from one edge of her display to select her social networking site's user interface. Now assume instead that she wants to interact with the media application playing the movie in a manner not allowed by the immersive interface, such as displaying a menu that enables subtitles or a director's commentary. She can swipe and drag from another edge of her tablet's display to pull out a control menu for the immersive interface, and quickly and easily select items and/or commands from this menu.
In both cases, valuable real estate for playing the movie is not occupied by on-screen controls, nor does the user need to remember and find hardware buttons. Still further, in this example, the techniques reserve no gestures other than those starting from an edge, thus permitting the immersive interface to use nearly all commonly available gestures. Additionally, because the techniques consider edge gestures or portions thereof, they do not affect the performance of the gesture or touch input system: edge gestures may be processed before the entire gesture is completed, avoiding the latency otherwise incurred by processing an entire gesture that starts elsewhere.
These are just two examples of the many ways in which the techniques enable edge gestures to be implemented and used, with other examples described below.
Example System
FIG. 1 illustrates an example system 100 in which techniques enabling edge gestures can be embodied. The system 100 includes a computing device 102, the computing device 102 being shown in six examples: laptop 104, tablet 106, smart phone 108, set-top box 110, desktop computer 112, and gaming device 114, although other computing devices and systems, such as servers and netbooks, may be used as well.
The computing device 102 includes one or more computer processors 116 and computer-readable storage media 118 (media 118). The media 118 includes an operating system 120, a window-based mode module 122, an immersive mode module 124, a system interface module 126, a gesture handler 128, and one or more applications 130, each having one or more application user interfaces 132.
The computing device 102 also contains or has access to one or more displays 134 and input mechanisms 136. Four example displays are shown in fig. 1. The input mechanism 136 may include gesture-sensitive sensors and devices, such as touch-based sensors and motion tracking sensors (e.g., camera-based), as well as a mouse (either standalone or integrated with a keyboard), a track pad, and a microphone with accompanying voice recognition software, to name a few. The input mechanism 136 may be separate from or integrated with the display 134; an example of integration includes a gesture-sensitive display with integrated touch-sensitive or motion-sensitive sensors.
The window-based mode module 122 presents the application user interfaces 132 through windows having frames. These frames may provide controls through which to interact with the application and/or controls that enable a user to move and resize the window.
Immersive mode module 124 provides an environment with which a user can view and interact with one or more of applications 130 through application user interface 132. In some embodiments, the environment presents the content of the application and enables interaction with the application with little or no window frames and/or without requiring the user to manage the layout of the window frames or the top window relative to other windows (e.g., which window is active or in front) or to manually resize and position the application user interface 132.
The environment may be, but need not be, hosted and/or surfaced without the use of a window-based desktop environment. Thus, in some cases, immersive mode module 124 presents an immersive environment that is not a window (or at least one without a substantial frame) and precludes the use of desktop-like displays (e.g., a taskbar). Further, in some embodiments, the immersive environment is similar to an operating system in that it is not closeable and cannot be uninstalled. Although not required, in some cases the immersive environment enables an application to use all or nearly all of the pixels of the display. Examples of immersive environments are provided below as part of describing the techniques, but they are not exhaustive and are not intended to limit the techniques described herein.
The system interface module 126 provides one or more interfaces by which interaction with the operating system 120 is enabled, such as an application launch interface, a start menu, or a system tools or options menu, to name a few.
The operating system 120, modules 122, 124, and 126, and gesture handler 128 may be separate from one another or combined or integrated in any suitable form.
Example method
FIG. 2 depicts a method 200 for enabling edge gestures based on the gesture being approximately perpendicular to the edge at which it begins. In portions of the following discussion, reference may be made to the system 100 of fig. 1, such reference being made for example purposes only.
Block 202 receives a gesture. The gesture may be received at various portions of the display, such as over a window-based interface, an immersive interface, or no interface. Further, the gesture may be made and received in various ways, such as pointer motion tracked through a touchpad, mouse, or roller ball, or physical motion made with one or more arms, one or more fingers, or a stylus and received through a motion- or touch-sensitive mechanism. In some cases, the gesture is received, whether away from or near a physical edge of the display (e.g., as a finger or stylus encounters an edge of the display), by a touch digitizer, capacitive touch screen, or capacitive sensor, to name a few.
Consider, by way of example, fig. 3, which illustrates a tablet computing device 106. Tablet 106 contains a touch-sensitive display 302, which display 302 is shown displaying an immersive interface 304 containing a web page 306. As part of the ongoing example, at block 202, gesture handler 128 receives gesture 308 as shown in FIG. 3.
Block 204 determines whether the starting point of the gesture is at an edge. As noted above, the edge in question may be an edge of a user interface (whether immersive or window-based) and/or an edge of a display. In some cases, of course, the edge of the user interface is also the edge of the display. The size of the edge may vary based on various factors relating to the display or interface. A small display or interface may have a smaller edge, in absolute terms or in pixels, than a large display or interface. Highly sensitive input mechanisms also permit smaller edges. In some instances, the edge may extend beyond the edge of the display or screen when the input mechanism is capable of receiving a portion of the gesture beyond the display or screen. The example edges are rectangular, varying between one and twenty pixels in one dimension and extending the full length of the interface or display in the other, though other sizes and shapes, including convex and concave edges, may be used instead.
Continuing the ongoing example, consider fig. 4, which shows immersive interface 304 and gesture 308 of fig. 3, as well as left edge 402, top edge 404, right edge 406, and bottom edge 408. For purposes of visual clarity, web page 306 is not shown. In this example, the dimensions of the interface and display are of a medium size, between those of a smartphone and many laptop and desktop displays. Edges 402, 404, 406, and 408 have a small dimension of twenty pixels, or about 10-15 mm in absolute terms; the area of each edge is bounded by a dashed line twenty pixels from the display boundary, at edge boundaries 410, 412, 414, and 416, respectively.
The gesture handler 128 determines that the gesture 308 has a start point 418, and that this start point 418 is within the left edge 402. The gesture handler 128 in this case determines the starting point by receiving data indicating the [X, Y] coordinates of the pixel at which the gesture 308 begins and comparing the first of these coordinates to those pixels contained within each of the edges 402, 404, 406, and 408. The gesture handler 128 is generally able to determine the starting point, and whether it is in an edge, faster than the sample rate, resulting in little or no performance degradation relative to techniques that simply pass the gesture directly to the exposed interface on which the gesture is made.
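For concreteness, a minimal TypeScript sketch of this edge hit-test follows. The twenty-pixel width matches the example edges above, but the type and function names are illustrative assumptions rather than the patent's implementation, and corner points are resolved in favor of the left and right edges as an arbitrary design choice.

```typescript
type Point = { x: number; y: number };
type EdgeId = "left" | "top" | "right" | "bottom";

const EDGE_WIDTH_PX = 20; // hypothetical width; the text above uses 1-20 pixels

// Mirror block 204: compare the gesture's first [X, Y] sample against the
// pixels belonging to each edge region. Returns null when the start point
// is not within any edge (block 206: pass the gesture to the exposed UI).
function hitTestEdge(
  start: Point,
  displayW: number,
  displayH: number
): EdgeId | null {
  if (start.x < EDGE_WIDTH_PX) return "left";
  if (start.x >= displayW - EDGE_WIDTH_PX) return "right";
  if (start.y < EDGE_WIDTH_PX) return "top";
  if (start.y >= displayH - EDGE_WIDTH_PX) return "bottom";
  return null;
}
```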
Returning generally to method 200, if block 204 determines that the starting point of the gesture is not at an edge, method 200 proceeds along the NO path to block 206. Block 206 passes the gesture to an exposed user interface, such as the underlying interface on which the gesture was received. Changing the ongoing example, assume that gesture 308 is determined to have no starting point within the edge. In such a case, the gesture handler 128 passes the buffered data for the gesture 308 to the immersive user interface 304. After passing the gesture, the method 200 ends.
If block 204 determines that the starting point of the gesture is in an edge, the method 200 proceeds along the YES path to block 208. Optionally, block 204 may determine a length of the portion of the gesture before the method proceeds to block 208. In some cases, determining the length of the portion of the gesture allows the determination of the starting point to be made prior to completion of the gesture. Block 208 responds to an affirmative determination at block 204 by determining whether a line from the starting point to a later point of the gesture is approximately perpendicular to the edge.
In some embodiments, block 208 determines the later point used. For example, the gesture handler 128 can determine a later point of the gesture based on a later point received at a preset distance from the edge or the starting point, such as crossing the edge boundary 410 of the edge 402 or twenty pixels from the starting point 418, all of fig. 4. In some other embodiments, the gesture handler 128 determines the later point based on the later point being received a preset time after receipt of the starting point, such an amount of time being slightly greater than the time that the computing device 102 would normally use to determine that the gesture was a tap-and-hold or hover gesture.
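As a sketch only, block 208's later-point selection might combine the two criteria just described, a preset distance and a preset time; the function name, the Sample type, and the thresholds are assumptions, and Point reuses the type from the earlier sketch.

```typescript
interface Sample { p: Point; t: number } // t: timestamp in milliseconds

// Return the first sample at least minDistancePx from the start, provided it
// arrives within maxWaitMs of the start; otherwise return null, in which
// case the gesture is passed on to the exposed interface (block 206).
function pickLaterPoint(
  samples: Sample[], // oldest first; samples[0] is the starting point
  minDistancePx: number, // e.g., twenty pixels from the starting point
  maxWaitMs: number // slightly longer than the tap-and-hold threshold
): Point | null {
  const start = samples[0];
  for (const s of samples.slice(1)) {
    if (s.t - start.t > maxWaitMs) return null; // too slow to qualify
    const dx = s.p.x - start.p.x;
    const dy = s.p.y - start.p.y;
    if (Math.hypot(dx, dy) >= minDistancePx) return s.p;
  }
  return null; // gesture ended before a qualifying point arrived
}
```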
For the ongoing example, the gesture handler 128 uses a later-received point of the gesture 308 received outside the edge 402, as long as that point is received within a preset time. If no point outside the edge is received within the preset time, the gesture handler 128 proceeds to block 206 and passes the gesture 308 to the immersive interface 304.
Using the starting point, block 208 determines whether a line from the starting point to a later point of the gesture is approximately perpendicular to the edge. Various deviation angles may be used in such a determination by block 208, such as five, ten, twenty, or thirty degrees.
By way of example, consider a thirty-degree deviation angle from perpendicular. Fig. 5 illustrates this example deviation, showing immersive interface 304, gesture 308, left edge 402, left edge boundary 410, and start point 418 of figs. 3 and 4, along with deviation line 502 showing thirty degrees from perpendicular line 504. Thus, because line 506 from the starting point 418 to a later point 508 is approximately twenty degrees from perpendicular, the gesture handler 128 determines that it is approximately perpendicular, falling within the example thirty-degree deviation line 502.
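In code, the deviation test might look like the following sketch, reusing Point and EdgeId from the first example. For a left or right edge the perpendicular is horizontal, so deviation grows with travel along the edge; the thirty-degree bound follows the example above.

```typescript
const MAX_DEVIATION_DEG = 30; // five, ten, or twenty degrees also work

// Angle in degrees between the start-to-later line and the perpendicular to
// the given edge: atan2(travel along the edge, travel away from the edge).
function deviationFromPerpendicular(start: Point, later: Point, edge: EdgeId): number {
  const dx = Math.abs(later.x - start.x);
  const dy = Math.abs(later.y - start.y);
  const [away, along] = edge === "left" || edge === "right" ? [dx, dy] : [dy, dx];
  return (Math.atan2(along, away) * 180) / Math.PI;
}

function isApproximatelyPerpendicular(start: Point, later: Point, edge: EdgeId): boolean {
  return deviationFromPerpendicular(start, later, edge) <= MAX_DEVIATION_DEG;
}
```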
In general, if block 208 determines that the line is not approximately perpendicular to the edge (e.g., the finger followed a crooked path), method 200 proceeds along the NO path to block 206. As noted above, block 208 may also determine that a later point or other aspect of the gesture disqualifies the gesture. Examples include when a later point is within the edge, such as due to a hover, tap, press-and-hold, or up-and-down gesture (e.g., to scroll content in the user interface); when the gesture is set to be a single-input gesture and a second input is received (e.g., a first finger starts at the edge but a second finger then lands anywhere); or when a tap event occurs during or prior to the gesture (e.g., contact is received elsewhere during the gesture).
If block 208 determines that the line is approximately perpendicular based on a later point outside the edge, method 200 proceeds along the YES path to block 210.
Block 210 responds to the affirmative determination at block 208 by passing the gesture to an entity other than the exposed user interface. This entity is not the user interface on which the gesture was received, assuming the gesture was received entirely on that user interface. Block 210 may also determine which entity to pass the gesture to, such as based on the edge, or the region of the edge, at which the starting point of the gesture was received. Consider, for example, fig. 6, which shows the immersive interface 304 and edges 402, 404, 406, and 408 of fig. 4, but adds a top region 602 and a bottom region 604 to the right edge 406. A starting point received in the top region 602, compared to one received in the bottom region 604, can result in a different entity (or even the same entity providing a different user interface in response). Similarly, a starting point in the top edge 404 can result in a different entity or interface than the left edge 402 or the bottom edge 408.
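A sketch of this routing follows; the patent leaves the edge-to-entity mapping open, so the particular assignments below (and the halfway split between regions 602 and 604) are invented for illustration.

```typescript
type Entity = "systemInterfaceModule" | "currentApplication" | "otherApplication";

// Block 210 sketched: pick the receiving entity from the edge containing the
// start point and, for the right edge, from whether the point fell in the
// top region 602 or the bottom region 604 of FIG. 6.
function routeEdgeGesture(edge: EdgeId, start: Point, displayH: number): Entity {
  switch (edge) {
    case "left":
      return "systemInterfaceModule"; // e.g., the application selection interface
    case "top":
    case "bottom":
      return "currentApplication"; // e.g., the media player's control menu
    default: // right edge: regions 602 and 604 map to different entities
      return start.y < displayH / 2 ? "systemInterfaceModule" : "otherApplication";
  }
}
```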
In some cases, this entity is an application associated with the user interface. In such a case, passing the gesture to the entity may be effective to cause the application to present a second user interface that enables interaction with the application. In the movie example above, the entity may be the media player playing the movie rather than the immersive interface displaying it. The media player can then present a second user interface that enables selection of subtitles or a director's commentary, rather than the selections such as "pause", "play", and "stop" enabled by the interface displaying the movie. This capability is illustrated in fig. 1, where one of the applications 130 can contain or present more than one application user interface 132. Accordingly, block 210 can pass the gesture to the system interface module 126, to the one of the applications 130 currently presenting the user interface, or to another of the applications 130 (to name just three possibilities).
To conclude the ongoing example, at block 210 the gesture handler 128 passes the gesture 308 to the system interface module 126. The system interface module 126 receives the buffered portion of the gesture 308 and continues to receive the remainder of it as the user makes the gesture 308. Fig. 7 illustrates a possible response upon receiving gesture 308, showing an application selection interface 702 presented by the system interface module 126 over the immersive interface 304 and web page 306 of fig. 3. The application selection interface 702 enables selection of various other applications and their respective interfaces through selectable application tiles 704, 706, 708, and 710.
This example application selection interface 702 is an immersive user interface presented using immersive mode module 124, but this is not required. The presented interface, or a list thereof, may alternatively be window-based and presented using window-based mode module 122. Both of these modules are shown in fig. 1.
Block 210 may likewise or alternatively determine to pass the gesture to a different entity and/or interface based on other factors regarding the received gesture. Example factors are described in more detail in method 800 below.
It should be noted that method 200, as well as the other methods described below, may be performed in real time, such as while a gesture is being made and received. This permits, among other things, a user interface presented in response to a gesture to be presented prior to completion of the gesture. Further, the user interface may be presented progressively as the gesture is received. This permits a user experience of dragging the user interface out from the edge, with the user interface appearing to "stick" to the gesture (e.g., to the mouse pointer or the finger making the gesture).
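Progressive presentation can be sketched by driving the reveal directly from each gesture sample; the render callback is a placeholder for whatever actually draws the partially revealed interface, and all names here are assumptions.

```typescript
// Reveal the incoming interface in lockstep with the gesture so it appears
// "stuck" to the finger or pointer rather than popping in on completion.
function onGestureSample(
  sample: Point,
  edge: EdgeId,
  displayW: number,
  displayH: number,
  panelSizePx: number, // full size of the interface being dragged out
  render: (visiblePx: number) => void // placeholder presenter
): void {
  const pulled =
    edge === "left" ? sample.x
    : edge === "right" ? displayW - sample.x
    : edge === "top" ? sample.y
    : displayH - sample.y; // bottom edge
  render(Math.min(Math.max(pulled, 0), panelSizePx));
}
```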
FIG. 8 depicts a method 800 for enabling an edge gesture that includes determining an interface to present based on some factor of the gesture. In portions of the following discussion, reference is made to the system 100 of fig. 1, the reference of which is made for exemplary purposes only. The method 800 may function in whole or in part separately or in combination with other methods described herein.
Block 802 determines that a gesture made on a user interface has a starting point at an edge of the user interface and a later point that is not within the edge. Block 802 may operate similarly to, or use, aspects of method 200, such as in determining the later point used for block 802's determination. Block 802 may also function differently.
For example, in one instance, block 802 determines that the gesture is a single-finger swipe gesture that begins at an edge of the exposed immersive user interface and has a later point that is not at the edge, but that the determination is not based on the angle of the gesture. Based on this determination, block 802 proceeds to block 804 rather than passing the gesture to the exposed immersive user interface.
Block 804 determines which interface to present based on one or more factors of the gesture. Block 804 may do this based on the final or intermediate length of the gesture, whether the gesture is single or multi-point (e.g., single or multi-fingered), or may do so based on the speed of the gesture. In some cases, two or more factors of the gesture determine which interface to present, such as a drag-and-hold gesture having a drag length and a hold time or a drag-and-drop gesture having a drag length and a drop position. Thus, for example, block 804 may determine to present a start menu in response to a multi-finger gesture, present an application selection interface in response to a relatively short single-finger gesture, or present a system control interface that permits selection of a shutdown of computing device 102 in response to a relatively long single-finger gesture. To do so, the gesture handler 128 may determine the length, speed, or number of inputs (e.g., fingers) of the gesture.
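The mapping in this paragraph might be sketched as follows; the 150-pixel threshold separating a "relatively short" gesture from a "relatively long" one is a hypothetical value, as are the interface names.

```typescript
type TargetInterface = "startMenu" | "applicationSelector" | "systemControls";

// Block 804 sketched: multi-finger gestures open the start menu, short
// single-finger gestures the application selector, and long single-finger
// gestures a system-control interface (e.g., permitting shutdown).
function chooseInterface(gestureLengthPx: number, fingerCount: number): TargetInterface {
  if (fingerCount > 1) return "startMenu";
  return gestureLengthPx < 150 ? "applicationSelector" : "systemControls";
}
```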
In response, block 806 presents the determined user interface. The determined user interface may be any of the interfaces mentioned herein, as well as an entirely new screen, such as a new page of an electronic book, an additional screen (e.g., a toolbar or navigation bar), or a modified view of the current user interface (e.g., presenting the current user interface's text in a different font, color, or highlighting). In some cases, visual or non-visual effects may be presented, such as actions related to a video game or sound effects associated with the current or presented user interface.
By way of example, assume that gesture handler 128 determines to present a user interface that enables interaction with operating system 120 based on factors of the gesture. In response, the system interface module 126 presents the user interface. The presentation of the user interface may be presented in a manner similar to that described in other methods, such as in the progressive display of the application-selection user interface 702 of FIG. 7.
Following all or portions of method 200 and/or method 800, the techniques may proceed to perform method 900 of fig. 9. Method 900 enables expanding a user interface, presenting another interface, or ceasing presentation of a user interface presented in response to an edge gesture.
Block 902 receives a successive point of the gesture upon presentation of at least some portion of the second user interface. As noted in the above section, the methods 200 and/or 800 can present or cause presentation of a second user interface, such as a second user interface for the same application, a different application, or a system user interface associated with the current user interface.
By way of example, consider fig. 10, which shows a laptop 104 with a touch-sensitive display 1002, the display 1002 displaying a window-based email interface 1004 and two immersive interfaces 1006 and 1008. The window-based email interface 1004 is associated with an application that manages email, which may be remote or local to the laptop 104. FIG. 10 also shows two gestures 1010 and 1012. Gesture 1010 travels in a straight line and gesture 1012 reverses (shown with two arrows to show two directions).
FIG. 11 shows gesture 1010 with a starting point 1102, a later point 1104, and a successive point 1106, and gesture 1012 with the same starting point 1102, a later point 1108, and a first successive point 1110 and a second successive point 1112. Fig. 11 also shows a bottom edge 1114, a later-point region 1116, and an interface-addition region 1118.
Block 904 determines whether the gesture contains a reversal, an extension, or neither based on the successive points. Block 904 may determine a reversal in the direction of the gesture by determining that a successive point is at or closer to the edge than a previous point of the gesture. Block 904 may determine that the gesture extends based on the successive points at a preset distance from the edge or the later point. If neither is determined to be true, the method 900 may repeat blocks 902 and 904 to receive and analyze additional successive points until the gesture ends. If block 904 determines that there is a reversal, the method 900 proceeds along a "reverse" path to block 906. If block 904 determines that the gesture is extended, the method 900 proceeds along an "extend" path to block 908.
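Block 904's three-way decision can be sketched as below; the distance function and the extension threshold stand in for the preset distance from the edge, and both are assumptions.

```typescript
type Continuation = "reversal" | "extension" | "neither";

// A successive point at or closer to the edge than the previous point is a
// reversal; one at or beyond a preset distance from the edge is an
// extension; anything else means keep sampling (repeat blocks 902 and 904).
function classifyContinuation(
  distFromEdge: (p: Point) => number, // e.g., p => displayH - p.y for a bottom edge
  previous: Point,
  successive: Point,
  extensionThresholdPx: number // hypothetical preset distance
): Continuation {
  const d = distFromEdge(successive);
  if (d <= distFromEdge(previous)) return "reversal";
  if (d >= extensionThresholdPx) return "extension";
  return "neither";
}
```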
In the context of this example, assume that gesture handler 128 receives the first successive point 1110 of gesture 1012. The gesture handler 128 then determines that the first successive point 1110 is not at the edge 1114, is not closer to the edge 1114 than a prior point of the gesture (e.g., not closer than the later point 1108), and is not a preset distance from the edge or the later point, because it is not within the interface-addition region 1118. In such a case, the method 900 returns to block 902.
In a second iteration of block 902, assume that the gesture handler 128 receives the second successive point 1112. In such a case, the gesture handler 128 determines that the second successive point 1112 is closer to the edge 1114 than the first successive point 1110, and thus that the gesture 1012 includes a reversal. Gesture handler 128 then proceeds to block 906 to cease presenting the second user interface previously presented in response to the gesture. By way of example, consider fig. 12, which illustrates an email processing interface 1202. In this example case of block 906, gesture handler 128 causes the email application to cease presenting interface 1202 (its removal is not shown) in response to the reversal of gesture 1012.
Block 908, however, presents or causes presentation of a third user interface or an extension of the second user interface. In some cases, presenting the third user interface ceases presentation of the second user interface, whether by removing it or by hiding it (e.g., presenting the third user interface over the second). Continuing the ongoing example, consider FIG. 13, which shows an additional email options interface 1302 presented in response to gesture 1010, the gesture having been determined to have a successive point 1106 a preset distance from the edge 1114, in this case within the interface-addition region 1118 of FIG. 11. The region and the preset distance may be set based on the size of the user interface previously presented in response to the gesture. Thus, a user wishing to add additional controls may simply extend the gesture beyond the user interface presented in response to the earlier portion of the gesture.
Method 900 may be repeated to add additional user interfaces or to expand a presented user interface. For example, returning to the example interface 702 in fig. 7, as the gesture 308 extends across the interface 702, the gesture handler 128 can continue to add interfaces or controls to the interface 702, such as by presenting an additional set of selectable application tiles. If the gesture 308 extends across the additional tiles, the gesture handler 128 may cause the system interface module 126 to present additional interfaces adjacent to the tiles to enable the user to select controls such as pause, hibernate, switch modes (immersive to window-based, and vice versa), or close the computing device 102.
Although the example user interfaces described above as presented in response to edge gestures are opaque, they may also be partially transparent. This can be useful, as it avoids obscuring content. In the movie example described above, the presented user interface may be partially transparent, permitting the movie to be only partially obscured during use of the user interface. Similarly, in the example of figs. 12 and 13, the interfaces 1202 and 1302 may be partially transparent, enabling the user to see the text of the email while also being able to select a control in one of the interfaces.
The foregoing discussion describes methods in which the techniques may enable edge gestures to be implemented and used. These methods are illustrated as a collection of blocks that specify operations performed, but are not necessarily limited to the orders shown for performing the operations by the respective blocks.
Aspects of these methods may be implemented in hardware (e.g., fixed logic circuitry), firmware, system on a chip (SoC), software, manual processing, or any combination thereof. Software implementations represent program code, such as software, applications, routines, programs, objects, components, data structures, procedures, modules, functions, and so forth, that perform specified tasks when executed by a computer processor. The program code can be stored in one or more computer-readable storage devices, both local and/or remote to a computer processor. The method may also be practiced in a distributed computing environment by multiple computing devices.
Example apparatus
Fig. 14 illustrates different components of an example device 1400, which device 1400 may be implemented as any type of client, server, and/or computing device as described above with reference to fig. 1-13 to implement techniques to enable edge gestures. In embodiments, device 1400 may be implemented as one or a combination of a wired and/or wireless device, in the form of a television client device (e.g., a television set-top box, Digital Video Recorder (DVR), etc.), a consumer device, a computer device, a server device, a portable computer device, a user device, a communication device, a video processing and/or rendering device, an appliance device, a gaming device, an electronic device, and/or as another type of device. Device 1400 may likewise be associated with a user (e.g., a person) and/or an entity that operates the device such that a device describes logical devices that include users, software, firmware, and/or a combination of devices.
Device 1400 includes a communication device 1402 that enables wired and/or wireless communication of device data 1404 (e.g., received data, data that is being received, data scheduled for broadcast, data packets of the data, etc.). The device data 1404 or other device content can include configuration settings for the device, media content stored on the device, and/or information associated with a user of the device. Media content stored on device 1400 can include any type of audio, video, and/or image data. Device 1400 includes one or more data inputs 1406 via which any type of data, media content, and/or input can be received, such as user-selectable inputs, messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.
Device 1400 also includes communication interfaces 1408, which can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. The communication interface 1408 provides a connection and/or communication link between the device 1400 and a communication network by which other electronic, computing, and communication devices communicate data with the device 1400.
Device 1400 includes one or more processors 1410 (e.g., any of microprocessors, controllers, and the like) that process various computer-executable instructions to control the operation of device 1400 and enable the described techniques for enabling and/or using edge gestures. Alternatively or in addition, device 1400 can be implemented in any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 1412. Although not shown, device 1400 may contain a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that employs any of a variety of bus architectures.
Device 1400 also includes computer-readable storage media 1414, such as one or more storage devices that enable persistent and/or non-transitory data storage (i.e., as opposed to mere signal transmission), examples of which include Random Access Memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable Compact Disc (CD), any type of a Digital Versatile Disc (DVD), and the like. Device 1400 may also include a mass storage media device 1416.
Computer-readable storage media 1414 provides data storage mechanisms to store the device data 1404, as well as various device applications 1418 and any other types of information and/or data related to operational aspects of device 1400. For example, an operating system 1420 may be maintained as a computer application with the computer-readable storage media 1414 and may run on processors 1410. The device applications 1418 may include a device manager, such as any form of a control application, software application, signal processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, and so forth.
The device applications 1418 also include any system components or modules that implement techniques to use or enable edge gestures. In this example, the device applications 1418 may include the system interface module 126, the gesture handler 128, and one or more applications 130.
Conclusion
Although embodiments of techniques and apparatuses to enable edge gestures have been described in language specific to features and/or methods, it is to be understood that the subject of the appended claims is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as example embodiments enabling edge gestures to be implemented and/or used.
Claims (10)
1. A computer-implemented method, comprising:
determining that the gesture has a starting point at a display edge of a display presenting the exposed immersive user interface and has a later point that is not at the display edge;
in response to determining that the starting point is at the display edge and the later point is not at the display edge, determining whether a line from the starting point to the later point of the gesture is within a predetermined angular deviation of a perpendicular to the display edge; and
determining whether the display edge is a first display edge or a second display edge in response to determining that the line is within a predetermined angular deviation of a perpendicular to the display edge;
responsive to the display edge being a first display edge, communicating the gesture to a first application that is not associated with the exposed immersive user interface;
and
responsive to the display edge being a second display edge different from the first display edge, communicating the gesture to a second application that is not associated with the exposed immersive user interface.
2. The computer-implemented method of claim 1, wherein communicating the gesture to a first application that is not associated with the exposed immersive user interface causes the first application to present a second immersive user interface that enables interaction with the first application.
3. The computer-implemented method of claim 1, wherein the predetermined angular deviation of the perpendicular to the edge of the display is thirty degrees.
4. The computer-implemented method of claim 1, wherein the display edge corresponds to a top or bottom edge of the exposed immersive user interface.
5. The computer-implemented method of claim 1, wherein the display edge corresponds to a left or right edge of the exposed immersive user interface.
6. The computer-implemented method of claim 2, wherein presenting the second immersive user interface progressively presents the second immersive user interface as the gesture is received.
7. A computer-implemented method, comprising:
receiving a gesture made on the exposed user interface;
determining whether a starting point of the gesture is received at a display edge of a display presenting the exposed user interface;
responsive to determining that the starting point is not at the edge of the display, passing the gesture to the exposed user interface;
in response to determining that the starting point is at the display edge, determining whether a line from the starting point to a later point of the gesture is within a predetermined angular deviation of a perpendicular to the display edge, and
responsive to determining that the line is not within a predetermined angular deviation of a perpendicular line to the display edge, passing the gesture to the exposed user interface;
in response to determining that the display edge is a first display edge and that the line is within a predetermined angular deviation of a perpendicular line to the display edge, passing the gesture to a first application that is not associated with the exposed user interface;
in response to determining that the display edge is a second display edge different from the first display edge and that the line is within a predetermined angular deviation of a perpendicular to the display edge, passing the gesture to a second application that is not associated with the exposed user interface.
8. The computer-implemented method of claim 7, wherein communicating the gesture to the first application that is not associated with the exposed user interface presents a second user interface that enables interaction with the first application, the second user interface being at least partially transparent.
9. The computer-implemented method of claim 7, further comprising: determining the later point of the gesture based on the later point of the gesture being received at a preset distance from the display edge or the starting point, before determining whether a line from the starting point to the later point of the gesture is within a predetermined angular deviation of a perpendicular to the display edge.
10. The computer-implemented method of claim 7, further comprising: determining the later point of the gesture based on the later point of the gesture being received a preset time after receiving the starting point, before determining whether a line from the starting point to the later point of the gesture is within a predetermined angular deviation of a perpendicular to the display edge.
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/118,181 | 2011-05-27 | | |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| HK1193662A (en) | 2014-09-26 |
| HK1193662B (en) | 2018-03-09 |