HK1169729A - Scrubbing touch infotip
- Publication number
- HK1169729A (application HK12110448.3A)
- Authority
- HK
- Hong Kong
- Prior art keywords
- representation
- information
- item
- user
- processor
Description
Background
Users may provide input to a computer system by manipulating an on-screen cursor, for example, with a computer mouse. In such a scenario, a user manipulates a computer mouse to cause corresponding movement of an on-screen cursor. This can be viewed as a "three-state" system, in which the mouse cursor can be (1) disengaged from any user interface element (e.g., an icon or text link); (2) over a UI element with the mouse button pressed ("engaged"); or (3) over a UI element without the mouse button pressed (sometimes referred to as "mouse over" or "hover"). In response to the cursor hovering, the system may provide information to the user regarding the icon or text under the cursor. For example, in some web browsers, a user may hover the mouse over a hypertext link, and the Uniform Resource Locator (URL) of the link may be displayed in the status region of the web browser. These mouse-over events provide the user with a representation of information that he may not otherwise be able to obtain.
There are also ways for a user to provide input to a computer system where no on-screen cursor is present. A user may provide input by, for example, touching a touch-sensitive surface with his or her finger(s) or a stylus. This can be viewed as a "two-state" system, in which a user either (1) touches a portion of the touch input device, or (2) does not touch it. There is no third, hover-like state in this cursor-free situation. An example of such a touch-sensitive surface is the track pad found in many laptop computers, where the user moves his fingers along the surface and those finger movements are reflected on the display device as cursor or pointer movements. Another example is the touch screen found in many mobile phones, where the touch-sensitive surface is integrated into the display device itself; the user moves his fingers along the display device, and those finger movements are interpreted as input to the computer.
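The contrast between the cursor-driven three-state model and the cursor-free two-state model can be sketched as a minimal state enumeration (illustrative only; the enumeration names are not from the patent):

```python
from enum import Enum

class MouseState(Enum):
    """Three-state cursor model: out, engaged, or hovering."""
    OUT = 1      # cursor not over any UI element
    ENGAGED = 2  # over an element with the mouse button pressed
    HOVER = 3    # over an element, button up ("mouse over")

class TouchState(Enum):
    """Two-state touch model: contact or no contact."""
    NO_TOUCH = 1
    TOUCH = 2

# The hover state has no touch equivalent: a finger either
# touches the surface or it does not.
assert len(TouchState) == len(MouseState) - 1
```

The missing third state is exactly what the scrubbing infotip described later tries to compensate for.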
An example of such touch input is an address book application that displays the letters A through Z of the alphabet in a list. The user may "scrub" (drag his or her finger along the touch surface) the list of letters to move through the address book. For example, when he or she scrubs a finger against "M", the beginning of the "M" entries in the address book may be displayed. The user may also manipulate the address book entry list itself to scroll through the entries.
These known techniques for providing information to a user, where the user uses touch input to a computer system, have a number of problems, some of which are well known.
Disclosure of Invention
One problem that arises with touch input is that there is no cursor. Since there is no cursor, there is no way to hover over an icon or other portion of the user interface, and therefore mouse-over events cannot be used. A user may touch an icon or other user interface element in an attempt to approximate a mouse-over event, but it is then difficult for the system to distinguish between the user attempting to click on an icon and attempting to "mouse over" that icon. Even if the user has a mechanism for entering a hover-like input (as opposed to a click input) via touch, icons or items (e.g., a list of hypertext links) may still be grouped closely together, and it may be difficult for the user to select a particular item from among the plurality of grouped items.
Another problem that arises with touch input is that the input itself is somewhat imprecise. A cursor may occupy a single pixel on the display. In contrast, a human fingertip covers an area far larger than one pixel (and even a stylus, which typically presents a smaller contact area to the touch input device than a finger, still covers more than a pixel). This imprecision associated with touch input makes it challenging for a user to target or otherwise engage small user interface elements.
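One common mitigation for this imprecision, consistent with (though not prescribed by) the patent, is to map the touch centroid to the item whose center is nearest. The following sketch uses hypothetical item names and coordinates:

```python
def nearest_item(touch_x, touch_y, items):
    """Map an imprecise touch point to the closest grouped item.

    items: list of (item_id, (left, top, right, bottom)) rectangles.
    Returns the id of the item whose center is nearest the touch
    centroid -- one way to resolve a fingertip that overlaps
    several small, closely grouped targets.
    """
    def dist_sq(rect):
        left, top, right, bottom = rect
        cx, cy = (left + right) / 2, (top + bottom) / 2
        return (cx - touch_x) ** 2 + (cy - touch_y) ** 2

    return min(items, key=lambda it: dist_sq(it[1]))[0]

# Example: three 16-pixel-wide icons packed side by side,
# loosely modeled on icons 306-310 discussed later.
icons = [("wifi",    (0, 0, 16, 16)),
         ("sound",   (16, 0, 32, 16)),
         ("battery", (32, 0, 48, 16))]
assert nearest_item(20, 8, icons) == "sound"
```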
A problem with known techniques for receiving information using a scrubbing input is that they are limited in the information they present. For example, in the address book example above, scrubbing is only one of several ways to move to a particular entry in the address book. Furthermore, these known scrubbing techniques do not replicate mouse-over (hover) events.
It would therefore be an advancement to provide techniques for providing a representation of information for an item of a plurality of grouped items via touch input. In one embodiment of the invention, a computer system displays a user interface including a plurality of grouped icons. The computer system receives touch input from a user indicating a scrub. In response to this scrubbing touch input, the system determines the item of the plurality of grouped items to which the user input corresponds and, in response, displays a representation of information for that item.
Other embodiments of the invention exist for providing a representation of information for an item of a plurality of grouped items via touch input, and some examples of such are described with respect to the detailed description of the figures.
Drawings
Systems, methods, and computer-readable media for providing a representation of information for an item of a plurality of grouped items via touch input are further described with reference to the accompanying drawings in which:
FIG. 1 depicts an example general-purpose computing environment in which aspects of one embodiment of the invention may be implemented.
FIG. 2 depicts an example computer including a touch-sensitive surface in which an aspect of an embodiment of the present invention may be implemented.
FIG. 3 depicts, as an example, a grouped plurality of items for which aspects of one embodiment of the invention may be implemented.
FIG. 4 depicts the grouped plurality of items of FIG. 3 for which representations of information not otherwise available via user input are displayed in response to user touch input.
FIG. 5 depicts the grouped plurality of items of FIG. 4 for which representations of second information not otherwise available via user input are displayed in response to additional user touch input.
FIG. 6 depicts an example word processor window in which aspects of an embodiment of the present invention may be implemented.
FIG. 7 depicts an example web browser window in which aspects of one embodiment of the present invention may be implemented.
FIG. 8 depicts an exemplary text menu list in which aspects of one embodiment of the present invention may be implemented.
FIG. 9 depicts example operational procedures for practicing one embodiment of the present invention.
Detailed Description
Embodiments may execute on one or more computer systems. FIG. 1 and the following discussion are intended to provide a brief general description of a suitable computing environment in which the disclosed subject matter may be implemented.
The term processor as used throughout this specification may include hardware components (e.g., hardware interrupt controllers, network adapters, graphics processors, hardware-based video/audio codecs), and the firmware used to operate such hardware. The term processor can also include microprocessors, application specific integrated circuits, and/or one or more logical processors (e.g., one or more cores of a multi-core general processing unit configured by instructions read from firmware and/or software). The logical processor(s) may be configured by instructions loaded from memory (e.g., RAM, ROM, firmware, and/or mass storage) that embody logic operable to perform the function(s).
Referring now to FIG. 1, an exemplary general purpose computing system is depicted. The general purpose computing system can include, among other things, a conventional computer 20 including at least one processor or processing unit 21, a system memory 22, and a system bus 23 that communicatively couples various system components including the system memory to the processing unit 21 when the system is in an operating state. The system bus 23 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The system memory may include Read Only Memory (ROM) 24 and Random Access Memory (RAM) 25. A basic input/output system 26 (BIOS), containing the basic routines that help to transfer information between elements within the computer 20, such as during start-up, is stored in ROM 24. The computer 20 may also include a hard disk drive 27 for reading from and writing to a hard disk (not shown), a magnetic disk drive 28 for reading from or writing to a removable magnetic disk 29, and an optical disk drive 30 for reading from or writing to a removable optical disk 31 such as a CD ROM or other optical media. The hard disk drive 27, magnetic disk drive 28, and optical disk drive 30 are shown as connected to the system bus 23 by a hard disk drive interface 32, a magnetic disk drive interface 33, and an optical drive interface 34, respectively. The drives and their associated computer-readable media provide nonvolatile storage of computer-readable instructions, data structures, program modules and other data for the computer 20. 
Although the exemplary environment described herein employs a hard disk, a removable magnetic disk 29 and a removable optical disk 31, it should be appreciated by those skilled in the art that other types of computer readable media which can store data that is accessible by a computer, such as flash memory cards, digital video disks, Random Access Memories (RAMs), Read Only Memories (ROMs) and the like may also be used in the exemplary operating environment. In general, such computer-readable storage media may be used in some embodiments to store processor-executable instructions embodying aspects of the present disclosure.
A number of program modules comprising computer-readable instructions may be stored on a computer-readable medium such as the hard disk, magnetic disk 29, optical disk 31, ROM 24 or RAM 25, including an operating system 35, one or more application programs 36, other program modules 37 and program data 38. When executed by a processing unit, the computer readable instructions cause the actions described in more detail below to be performed or cause various program modules to be instantiated. A user may enter commands and information into the computer 20 through input devices such as a keyboard 40 and pointing device 42. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 21 through a serial port interface 46 that is coupled to the system bus, but may be connected by other interfaces, such as a parallel port, game port or a Universal Serial Bus (USB). A monitor 47, display, or other type of display device is also connected to the system bus 23 via an interface, such as a video adapter 48. In addition to the display 47, computers typically include other peripheral output devices (not shown), such as speakers and printers. The exemplary system of FIG. 1 also includes a host adapter 55, Small Computer System Interface (SCSI) bus 56, and an external storage device 62 connected to the SCSI bus 56.
The computer 20 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 49. The remote computer 49 may be another computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 20, although only a memory storage device 50 has been illustrated in fig. 1. The logical connections depicted in FIG. 1 can include a Local Area Network (LAN) 51 and a Wide Area Network (WAN) 52. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
When used in a LAN networking environment, the computer 20 can be connected to the LAN 51 through a network interface or adapter 53. When used in a WAN networking environment, the computer 20 typically includes a modem 54 or other means for establishing communications over the wide area network 52, such as the Internet. The modem 54, which may be internal or external, may be connected to the system bus 23 via the serial port interface 46. In a networked environment, program modules depicted relative to the computer 20, or portions thereof, may be stored in the remote memory storage device. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used. Additionally, while it is envisioned that numerous embodiments of the present disclosure are particularly well-suited for computerized systems, nothing in this document is intended to limit the disclosure to such embodiments.
The system memory 22 of the computer 20 may include instructions that, when executed by the computer 20, cause the computer 20 to implement the present invention (e.g., the operational procedure of fig. 9).
FIG. 2 depicts an example computer including a touch-sensitive surface in which aspects of one embodiment of the present invention may be implemented. The touch screen 200 of FIG. 2 may be implemented as the display 47 in the computing environment 100 of FIG. 1. Additionally, the memory 214 of the computer 200 may include instructions that, when executed by the computer 200, cause the computer 200 to implement the present invention (e.g., the operational procedures of FIG. 9, which may be used to implement aspects of the present invention depicted in FIGS. 3-8).
The interactive display device 200 (sometimes referred to as a touch screen or touch display) includes a projection display system having an image source 202, optionally one or more mirrors 204 for increasing the optical path length and image size of the projection display, and a horizontal display screen 206 onto which the image is projected. Although shown in the context of a projection display system, it will be understood that the interactive display device may comprise any other suitable image display system, including but not limited to Liquid Crystal Display (LCD) panel systems and other light valve systems. Additionally, although shown in the context of a horizontal display system, it will be understood that the disclosed embodiments may be used in displays of any orientation.
The display screen 206 includes a clear transparent portion 208 (e.g., a glass sheet) and a diffuser screen layer 210 disposed on the clear transparent portion 208. In some embodiments, an additional transparent layer (not shown) may be disposed on the diffuser screen layer 210 to provide a smooth look and feel to the display screen.
Continuing with FIG. 2, the interactive display device 200 further includes an electronic controller 212, the electronic controller 212 including a memory 214 and a processor 216. The controller 212 may also include a wireless transmitter and receiver 218 configured to communicate with other devices. Controller 212 may include computer-executable instructions or code (e.g., programs) stored in memory 214 or on other computer-readable storage media and executed by processor 216 that control various visual responses to detected touches described in more detail below. Generally, programs include routines, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The term "program" as used herein may mean a single program or multiple programs that work in concert and may be used to represent applications, services, or any other type or class of program.
To sense objects located on the display screen 206, the interactive display device 200 includes one or more image capture devices 220 configured to capture an image of the entire back of the display screen 206 and provide the image to the electronic controller 212 for detecting objects appearing in the image. The diffuser screen layer 210 helps to avoid imaging of objects that are not in contact with the display screen 206 or are not positioned within a few millimeters of the display screen 206, and thus helps to ensure that the image capture device 220 only detects objects that touch the display screen 206 (or in some cases only objects in close proximity to the display screen 206). Although the depicted embodiment includes a single image capture device 220, it will be understood that any suitable number of image capture devices may be used to image the back side of the display screen 206. Additionally, it will be understood that the term "touch" as used herein may include both physical (physical) touch and/or "proximity touch" of an object in close proximity to the display screen.
Image capture device 220 may include any suitable image sensing mechanism. Examples of suitable image sensing mechanisms include, but are not limited to, CCD (charge coupled device) and CMOS (complementary metal oxide semiconductor) image sensors. In addition, the image sensing mechanism may capture images of the display screen 206 at a sufficient frequency or frame rate to detect motion of an object (object) across the display screen 206 at a desired rate. In other embodiments, a scanning laser may be used in combination with a suitable photodetector to acquire an image of the display screen 206.
Image capture device 220 may be configured to detect reflected or emitted energy of any suitable wavelength, including but not limited to infrared and visible wavelengths. To assist in detecting objects placed on the display screen 206, the image capture device 220 may also include an additional light source 222, such as one or more Light Emitting Diodes (LEDs), configured to generate infrared or visible light. Light from light source 222 may be reflected by an object placed on the display screen 206 and then detected by image capture device 220. Using infrared LEDs as opposed to visible LEDs may help avoid washing out the appearance of the projected image on the display screen 206.
FIG. 2 also depicts a finger 226 of a user's hand touching the display screen. Although the embodiments herein are described in the context of a user's finger touching a touch-sensitive display, it will be understood that the concepts may be extended to detect touches on the display screen 206 by any other suitable physical object, including but not limited to, a stylus, a cell phone, a smart phone, a camera, a PDA, a media player, other portable electronic products, barcodes and other optically readable indicia, and the like. Additionally, although disclosed in the context of an optical touch sensing mechanism, it will be understood that the concepts disclosed herein may be used with any suitable touch sensing mechanism. The term "touch sensitive display" is used herein to describe not only the display screen 206, light source 222, and image capture device 220 of the described embodiments, but also any other suitable display screen and associated touch sensing mechanisms and systems (including, but not limited to, capacitive and resistive touch sensing mechanisms).
FIGS. 3-5 depict aspects of one embodiment of the invention in which a user interacts with a plurality of grouped icons over time. FIG. 3 depicts, as an example, a grouped plurality of items for which aspects of one embodiment of the invention may be implemented. Region 304 includes grouped items 306, 308, and 310. As shown, item 306 includes an icon for a wireless network connection of the computer, item 308 includes an icon for a system sound of the computer, and item 310 includes an icon for a battery of the computer. These icons 306-310 are grouped and displayed within the area 304. For example, in a version of the MICROSOFT WINDOWS operating system, region 304 may be the notification area of the WINDOWS taskbar, and icons 306-310 may be icons in the notification area that display system and program features.
Area 302 represents a border region of the grouped icons. It may serve as a boundary in the following sense: initial user touch input appearing within this area (e.g., within area 302 as displayed on the touch screen where input is received) is recognized as input that affects area 304 and the icons 306-310 contained therein. This initial user touch input is the first time the user touches the touch screen after a period in which the user has not touched the touch screen. There may also be embodiments that do not involve a border region (e.g., border region 302). For example, rather than determining what portion of the display is implicated by the initial user touch input, the system may periodically re-evaluate the current user touch input and determine from this which area the input affects.
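The border-region routing just described might be sketched as follows; this is an illustrative reading of the text, with hypothetical rectangles and identifiers:

```python
def contains(rect, x, y):
    """Point-in-rectangle test; rect is (left, top, right, bottom)."""
    left, top, right, bottom = rect
    return left <= x < right and top <= y < bottom

def route_initial_touch(x, y, border_rect, item_rects):
    """Decide whether an initial touch engages the grouped items.

    If the first touch after a period of no contact lands inside
    the border region (analogous to area 302), scrubbing is routed
    to the grouped items (analogous to area 304). Returns the id of
    the item currently under the touch, or None if the touch falls
    outside the border region or between items.
    """
    if not contains(border_rect, x, y):
        return None
    for item_id, rect in item_rects:
        if contains(rect, x, y):
            return item_id
    return None

# Hypothetical layout: border region enclosing three icons.
border = (0, 0, 48, 16)
items = [("wifi",    (0, 0, 16, 16)),
         ("sound",   (16, 0, 32, 16)),
         ("battery", (32, 0, 48, 16))]
assert route_initial_touch(20, 8, border, items) == "sound"
assert route_initial_touch(100, 100, border, items) is None
```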
FIG. 4 depicts the grouped plurality of items of FIG. 3 for which representations of information not otherwise available via user input are displayed in response to user touch input. As shown in fig. 4, the user has scrubbed with his or her finger 414 within the border region 302 and is now touching icon 308, the system sound icon. As a result, a representation of information not otherwise available through touch input is provided to the user. In this case, it is text 412 and an enlarged icon 408: the text 412 indicates the volume level ("system sound: 80%"), and the enlarged icon 408 provides a larger representation of icon 308. The representation of information not otherwise available via touch input may also include a small pop-up window identifying the purpose of the icon (e.g., that it is for system sound). In versions of the MICROSOFT WINDOWS operating system, such pop-up windows may be "infotips".
Also depicted in fig. 4 are icons 406 and 410, which combine with the enlarged icon 408 to produce a "cascading" effect centered on the enlarged icon 408 (the icon currently being manipulated by the user). Icons 406 and 410 are displayed enlarged as well, though not as large as the enlarged icon 408, and without corresponding text information such as the text 412 displayed with the enlarged icon 408. This may help the user recognize that, by scrubbing to nearby icons, he or she may obtain representations of information about them that are not otherwise available via touch input, similar to the representation currently received for icon 308.
FIG. 5 depicts the grouped plurality of items of FIG. 4 for which representations of second information not otherwise available via user input are displayed in response to additional user touch input. As shown in fig. 5, time has elapsed since the moment depicted in fig. 4, and the user has now scrubbed his or her finger 414 further to the right so that it touches icon 310. Thus, in FIG. 5, the system displays a representation of information about icon 310 that is not otherwise available via touch input, whereas in FIG. 4 the system displayed such a representation for icon 308. The representation of the information about icon 310 is text 512 (which reads "battery: 60%" and is similar to text 412 of fig. 4) and an enlarged icon 510, which shows an enlarged version of icon 310 (and is similar to the enlarged icon 408 of fig. 4).
Fig. 5 also depicts a cascading effect similar to that of fig. 4. The cascading effect of fig. 5 is centered on enlarged icon 510 and includes icon 508. No additional small icon is presented for icon 306 because, in this cascading effect, only the nearest neighbors to the left and right receive the effect. Similarly, no cascading effect is displayed to the right of the enlarged icon 510, because item 310 is the rightmost item, so there is no item to its right for which a cascading effect can be generated.
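The cascading effect, in which only the scrubbed icon is fully enlarged and only its immediate left and right neighbors (when they exist) receive a smaller enlargement, could be modeled as a per-item scale computation. This is a sketch; the scale factors are invented for illustration:

```python
def cascade_scales(num_items, focused_index,
                   focus_scale=2.0, neighbor_scale=1.4):
    """Return a display scale for each item in the cascade effect.

    The scrubbed item gets the full enlargement; only its nearest
    left and right neighbors, if they exist, get a smaller one.
    All other items stay at normal size, matching FIG. 5, where
    the rightmost item has no right neighbor to enlarge.
    """
    scales = [1.0] * num_items
    scales[focused_index] = focus_scale
    if focused_index > 0:
        scales[focused_index - 1] = neighbor_scale
    if focused_index < num_items - 1:
        scales[focused_index + 1] = neighbor_scale
    return scales

# Focus on the rightmost of three icons (like item 310 in FIG. 5):
assert cascade_scales(3, 2) == [1.0, 1.4, 2.0]
# Focus on the middle icon (like item 308 in FIG. 4):
assert cascade_scales(3, 1) == [1.4, 2.0, 1.4]
```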
Fig. 6 depicts, as an example, a word processor window in which aspects of one embodiment of the invention may be implemented, similar to how the invention may be implemented as depicted in figs. 3-5. Fig. 6 depicts a word processor window 602. The word processor window 602 includes a text area 608 in which text is typed and displayed (here displaying the text "res ipsa loquitur" 604) and a menu area 606 in which buttons for manipulating the word processor (e.g., print, save, or highlight-text buttons) are displayed. Menu area 606 includes a plurality of grouped items 610, which in turn is comprised of items 612, 614, and 616. Each of items 612-616 represents a text style that may be applied to text in the text area. For example, a style may set forth the font, the font size, the justification of the text, and whether the text is bold, underlined, and/or italicized.
Fig. 6 depicts a different treatment of the hover/click distinction than that presented in figs. 3-5. In figs. 3-5, clicking (or tapping a finger) on an item may open an application window for the item, while scrubbing over the item shows information about the item (as with enlarged icon 510 and text 512). Here in fig. 6, clicking/tapping on an item selects its style, which remains in effect until a new style is selected to replace it, while scrubbing over an item shows a preview of how the style will affect text 604 (the preview is removed once the finger is no longer over the item).
For example, in FIG. 6, item 612 corresponds to a style that includes bold and underlined text. The user has scrubbed his or her finger 414 until it is over item 612, so a preview of the style is shown on text 604, which appears both bold and underlined. If the user then scrubs his or her finger 414 further to the right, off of item 612, the preview is no longer shown; a preview of style 2 or style 3 may be shown if the user scrubs across item 614 or 616, respectively. This distinction between applying a style (via a tap) and previewing a style (via a scrub) illustrates the invention's aim of providing, via touch input, a representation of information for an item of a plurality of grouped items, where that representation is not otherwise readily available (accessible) via touch input.
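The tap-commits versus scrub-previews distinction described for fig. 6 might be captured in a small controller like the following; the class and style names are hypothetical:

```python
class StylePreviewControl:
    """Sketch of the tap-vs-scrub distinction of FIG. 6.

    A tap commits a style; a scrub only previews it, and the
    preview is cleared when the finger leaves the item.
    """
    def __init__(self, styles):
        self.styles = styles
        self.applied = None   # committed style (from a tap)
        self.preview = None   # transient style (shown during a scrub)

    def on_tap(self, index):
        """Tapping an item selects its style until replaced."""
        self.applied = self.styles[index]
        self.preview = None

    def on_scrub(self, index):
        """Scrubbing previews a style; index is None once the
        finger is no longer over any item."""
        self.preview = self.styles[index] if index is not None else None

    def effective_style(self):
        """The style the text should currently be rendered with."""
        return self.preview if self.preview is not None else self.applied

ctrl = StylePreviewControl(["bold+underline", "style 2", "style 3"])
ctrl.on_scrub(0)                              # preview bold+underline
assert ctrl.effective_style() == "bold+underline"
ctrl.on_scrub(None)                           # finger left: preview ends
assert ctrl.effective_style() is None
ctrl.on_tap(1)                                # tap commits style 2
assert ctrl.effective_style() == "style 2"
```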
FIG. 7 depicts an example web browser window in which aspects of one embodiment of the present invention may be implemented. Among other things, fig. 7 differs from fig. 6 in that, in fig. 7, the items (items 708, 710, and 712) are text, while in fig. 6 the items (items 612, 614, and 616) are icons. Web browser window 702 includes a status area 704. Within the body of web browser window 702 are a plurality of grouped items: hyperlink 708, hyperlink 710, and hyperlink 712. The three grouped items 708-712 are displayed in close proximity to one another within the page.
As shown in fig. 7, the user has scrubbed his or her finger 414 within the border region 714 and now touches hyperlink 2 (710). Because of this touch input, the system displaying web browser window 702 displays, in status area 704, a representation of information that would otherwise not be available via touch input: the URL 706 of hyperlink 710, "http://www.contoso.com". The information itself may additionally be available to the user in a different representation. For example, if the user clicks on the link, the web browser loads and displays the web page at http://www.contoso.com and displays "http://www.contoso.com" in its address bar. Although this may be the same information as that displayed in the status area, it is a different representation of the information, because it is located in the address bar rather than the status bar, and because it is information about the current page being viewed rather than about a page that would be viewed if the user followed the link.
FIG. 8 depicts an exemplary text menu list in which aspects of one embodiment of the present invention may be implemented. Fig. 8 differs from figs. 3-6 in that the plurality of grouped items in fig. 8 are all text items, while in figs. 3-6 they are icons. Fig. 8 differs from fig. 7 in that, although each depicts a plurality of grouped items as text, in fig. 7 the text is displayed within the page (items 708-712), while in fig. 8 the text (items 804, 806, 808, and 810) is displayed in a menu list 802 (e.g., a drop-down menu). In fig. 8, the user has engaged menu list 802 and scrubbed his or her finger to menu item 4 (810). Because of this user input, the system displaying menu list 802 displays a representation of information 812 about menu item 4 that is not otherwise accessible via touch input. For example, where selecting menu item 4 (810) causes a window associated with menu list 802 to be printed, the representation of information 812 about menu item 4 may be a pop-up window indicating which printer the window will be printed to.
FIG. 9 depicts example operational procedures for practicing one embodiment of the present invention. The present invention may be implemented by storing computer-readable instructions for performing the operations of fig. 9 in the memory 22 of the computer 20 of fig. 1. The operational procedure of fig. 9 may be used to implement aspects of the embodiments of the present invention depicted in figs. 2-8. The operational procedure of fig. 9 begins with operation 900 and leads to operation 902.
Operation 902 depicts displaying a plurality of grouped items in a user interface. The grouped items may be items 306-310 as depicted in figs. 3-5, items 612-616 as depicted in fig. 6, items 708-712 as depicted in fig. 7, or items 804-810 as depicted in fig. 8. The items may be icons (as shown in figs. 3-6) or text (as shown in figs. 7-8). The items may be considered grouped because scrubbing a finger, or otherwise providing touch input, within the item area (e.g., the border region 302 of FIG. 3) causes the system to provide a representation of information not otherwise readily available via touch input, based on which item of the plurality of grouped items is engaged.
Operation 904 depicts determining that the user input received at the touch input device indicates an input near the grouped items. This input near the grouped items may be, for example, an input within the border region 302 of figs. 3-5, the region 610 of fig. 6, the region 714 of fig. 7, or the region 802 of fig. 8. The user input may include a finger press at a touch input device (e.g., at interactive display 200 of fig. 2), a stylus press at the touch input device, or an input otherwise made using the touch input device. The user input may include a scrubbing motion, in which the user presses down on the touch input device at an initial point and then moves his or her finger in a certain direction while maintaining contact with the touch input device.
Operation 906 depicts displaying, in response to the user input, a representation of information for an item of the plurality of grouped items, the representation of information not otherwise readily available via touch input. Such a representation may be, for example, the enlarged icon 408 and explanatory text 412 of fig. 4, the enlarged icon 510 and explanatory text 512 of fig. 5, a preview of style 1 applied to the text 604 of fig. 6, an indication of the URL 706 of hyperlink 2 (710) displayed in the status region 704 of fig. 7, or the information 812 about menu item 4 of fig. 8.
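Operations 902-906 might be strung together in a simple event loop like the following sketch; the display interface and helper functions are hypothetical, not from the patent:

```python
def inside(rect, x, y):
    """Point-in-rectangle test; rect is (left, top, right, bottom)."""
    left, top, right, bottom = rect
    return left <= x < right and top <= y < bottom

def item_under(items, x, y):
    """Return the item dict whose rect contains (x, y), or None."""
    for item in items:
        if inside(item["rect"], x, y):
            return item
    return None

def run_scrub_infotip(display, touch_source, items, border_rect):
    """Sketch of operations 902-906 over a stream of touch points.

    902: display the grouped items; 904: detect touch input near
    the group (inside the border region); 906: show the infotip
    representation for the item under the scrubbing finger.
    """
    display.show_items(items)                 # operation 902
    for x, y in touch_source:                 # stream of touch samples
        if not inside(border_rect, x, y):     # operation 904
            display.hide_representation()
            continue
        item = item_under(items, x, y)
        if item is not None:                  # operation 906
            display.show_representation(item)
```

A display object with `show_items`, `hide_representation`, and `show_representation` methods is assumed; in practice these would map onto the windowing system's drawing calls.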
In one embodiment, operation 906 includes enlarging the item in the user interface. This is shown in the enlarged icons 408 and 510 of figs. 4 and 5, respectively. In one embodiment, operation 906 includes displaying an animation before the representation is displayed at its full size. For example, in fig. 4, the representation of information not otherwise accessible via touch input includes the enlarged icon 408. In this embodiment, the enlarged icon is initially rendered small and may be gradually enlarged via animation to its full size as depicted in FIG. 4.
In one embodiment, the representation includes textual or graphical information that informs the user of the purpose or status of the item. For example, the user is informed of the purpose and status of the item 308 via the explanatory text 412. The text 412 conveys the purpose of the item: the icon is for "system sound". The text 412 also conveys the status of the item: the system sound level is 80%.
Inputs accepted by a system implementing the operational procedure of fig. 9 may include both touch inputs and mouse inputs (including an on-screen pointer). In such a scenario, it is possible that this representation of information is readily available via mouse input, where the user performs a mouse-over with the on-screen pointer. The representation is nonetheless not readily available via other touch inputs, even though it is readily available via non-touch input.
Likewise, the information itself may be otherwise readily available via touch input, while the present representation of the information is not otherwise readily available via other touch inputs. Taking fig. 4 as an example, the representation of information that is not otherwise readily available via other touch inputs includes the explanatory text 412, which reads "system sound: 80%". It may be possible to determine by other means that the system sound level is 80%. For example, the user may tap his or her finger 414 on the system sound icon 308, which causes a separate window for system sound settings to be presented, and that settings window may show the current system sound level to be 80%. In this sense, the information itself is otherwise readily available via other touch inputs, but it is represented in a separate window, in a different manner than the present explanatory text 412 shown directly in the display area of the icon 308.
In addition, the representation may be otherwise obtainable via touch input in the sense that another touch gesture of the same type causes it to be presented. For example, where the triggering gesture includes a rightward scrub until the touch corresponds to the item, a scrub starting from the right side of the item and moving leftward until the touch corresponds to the item may also cause the representation to be presented. However, other types of touch gestures or inputs may not cause the representation to be presented. For example, a gesture of tapping the item, or of pinching fingers together or spreading them apart on the item (commonly referred to as "pinch" and "reverse pinch" gestures), may not cause the representation to be presented.
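A sketch of the distinction drawn above: scrubbing gestures of either direction reveal the representation, while taps and pinch/spread gestures do not. The gesture names are hypothetical labels, assumed to come from an upstream gesture recognizer:

```python
# Gesture types that cause the representation to be presented,
# versus types that do not (tap, pinch, reverse pinch).
REVEALING_GESTURES = {"scrub_right", "scrub_left"}
NON_REVEALING_GESTURES = {"tap", "pinch", "reverse_pinch"}


def should_present_representation(gesture: str) -> bool:
    """Return True only for scrub-type gestures toward the item."""
    return gesture in REVEALING_GESTURES
```

Under this sketch, both scrub directions trigger the infotip, while a tap on the same item instead performs the item's default action (e.g., opening the settings window of the fig. 4 example).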
A counterpoint to this concept of a representation not otherwise accessible via touch input can be seen in some address book applications. For example, where scrubbing across a list of letters to the letter "M" causes address book entries beginning with that letter to be displayed in the display area, the user may also scroll (e.g., by a "flick" gesture) across the display area itself to reach the point at which entries beginning with "M" are displayed. In such a scenario, the representation of the information is otherwise readily available via touch input.
Operation 908 depicts determining that a second user input received at the touch input device is indicative of an input navigating away from the plurality of grouped icons, and ceasing to display the representation of the information of the item. There is no need to persistently display a representation of information that is not otherwise readily available via other touch inputs. Where the user scrubs toward an item, causing a representation of information not otherwise available via other touch inputs to be displayed, he or she may later scrub away from the item. In such a case, the representation is not displayed persistently, but only while the user interacts with the item. Thus, when the user navigates away, the representation is no longer displayed.
Operation 910 depicts determining that a second user input received at the touch input device is indicative of navigating toward a second icon of the plurality of grouped icons; ceasing to display the representation of the information of the item; and displaying a representation of information of a second item of the plurality of grouped items, the representation of information not readily available via other touch inputs. Operation 910 can be seen in the difference between figs. 4 and 5. In fig. 4, the user interacts with a first item, item 308, and a representation of the information for that item is displayed (via enlarged icon 408 and explanatory text 412). Fig. 5 depicts a later point in time than fig. 4, at which the user has continued to scrub right until interacting with a second item, item 310, of the plurality of grouped items. Now, in fig. 5, a representation of the information of the second item (item 310) is displayed (via enlarged icon 510 and explanatory text 512).
Operation 912 depicts determining that no user input is received at the touch input device, and ceasing to display the representation of the information of the item. Similar to operation 908, in which the display of the representation terminates when the user's input indicates that he or she is no longer interacting with the item, the display of the representation may terminate when the user lifts his or her finger or other input device (e.g., a stylus) from the touch input area. In response, the display of the representation is terminated at operation 912.
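Taken together, operations 906 through 912 describe a small state machine: the representation is shown while the scrub rests on an item, replaced when the scrub reaches a second item, and dismissed when the scrub leaves the group or the finger lifts. A sketch, with hypothetical event-handler names:

```python
class ScrubInfotip:
    """Tracks which item's representation, if any, is currently displayed."""

    def __init__(self) -> None:
        self.displayed_item: str | None = None

    def on_scrub_to_item(self, item: str) -> None:
        # Operations 906/910: display the representation for the item
        # under the scrub, replacing any previously displayed one.
        self.displayed_item = item

    def on_scrub_away(self) -> None:
        # Operation 908: navigating away from the grouped items
        # ceases display of the representation.
        self.displayed_item = None

    def on_touch_lifted(self) -> None:
        # Operation 912: no input at the touch device ends the display.
        self.displayed_item = None
```

Scrubbing from item 308 to item 310, as between figs. 4 and 5, would replace the first representation with the second; lifting the finger would then dismiss it.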
The operational procedure of fig. 9 ends at operation 914. It will be appreciated that embodiments of the invention may be implemented with a subset of the operations of fig. 9, or with a rearrangement of these operations. For example, one embodiment of the invention may function where it implements only operations 900, 902, 904, 906, and 914. Likewise, one embodiment of the invention may function where operation 910 is performed prior to operation 908.
Conclusion
While the present invention has been described in connection with the preferred aspects as illustrated in the various figures, it is to be understood that other similar aspects may be used or modifications and additions may be made to the described aspects for performing the same function of the present invention without deviating therefrom. Accordingly, the present invention should not be limited to any single aspect, but rather construed in breadth and scope in accordance with the appended claims. For example, the various processes described herein may be implemented in hardware or software or a combination of both. Thus, the methods and apparatus of the disclosed embodiments, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium. When the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus configured to implement the disclosed embodiments. In addition to the specific embodiments explicitly set forth herein, other aspects and embodiments will be apparent to those skilled in the art from consideration of the specification disclosed herein. It is intended that the specification and illustrated embodiments be considered as examples only.
Claims (15)
1. A method for providing a user interface in a touch input environment, comprising:
displaying a plurality of grouped items in the user interface (902);
determining that user input received at the touch input device indicates an input in proximity to the grouped items (904); and is
In response to the user input, displaying a representation of information for an item of the plurality of grouped items, the representation of information not readily available via other touch inputs (906).
2. The method of claim 1, further comprising:
determining that a second user input received at the touch input device is indicative of navigating toward a second icon of the plurality of grouped icons;
ceasing to display the representation of the information of the item; and is
Displaying a representation of information of a second item of the plurality of grouped items, the representation of information not readily available via other touch inputs.
3. The method of claim 1, wherein displaying a representation of information of an item comprises:
enlarging the item in the user interface.
4. The method of claim 1, further comprising:
determining that a second user input received at the touch input device is indicative of an input navigating away from the plurality of grouped icons; and is
Ceasing to display the representation of the information of the item.
5. The method of claim 1, wherein displaying a representation of information of an item comprises:
displaying an animation prior to displaying the representation.
6. The method of claim 1, further comprising:
determining that user input is not received at the touch input device; and is
Ceasing to display the representation of the information of the item.
7. The method of claim 1, wherein the representing comprises:
text or image information informing a user of the use or status of the item.
8. The method of claim 1, wherein the user input comprises:
a scrubbing motion.
9. The method of claim 1, wherein the user input comprises a finger press at the touch input device.
10. The method of claim 1, wherein the user input comprises a stylus press at the touch input device.
11. A system for providing a user interface in a touch input environment, comprising:
a processor (22); and
a memory (21) communicatively coupled to the processor when the system is in operation, the memory carrying processor-executable instructions that, when executed by the processor, cause the processor to perform operations comprising:
displaying a plurality of grouped items in the user interface (902);
determining that user input received at the touch input device indicates an input in proximity to the grouped items (904); and is
In response to the user input, displaying a representation of information for an item of the plurality of grouped items, the representation of information not readily available via other touch inputs (906).
12. The system of claim 11, wherein the memory further bears processor-executable instructions that, when executed by the processor, cause the processor to perform operations comprising:
determining that a second user input received at the touch input device is indicative of navigating toward a second icon of the plurality of grouped icons;
ceasing to display the representation of the information of the item; and is
Displaying a representation of information of a second item of the plurality of grouped items, the representation of information not readily available via other touch inputs.
13. The system of claim 11, wherein the memory further bears processor-executable instructions that, when executed by the processor, cause the processor to perform operations comprising:
enlarging the item in the user interface.
14. The system of claim 11, wherein the memory further bears processor-executable instructions that, when executed by the processor, cause the processor to perform operations comprising:
determining that a second user input received at the touch input device is indicative of an input navigating away from the plurality of grouped icons; and is
Ceasing to display the representation of the information of the item.
15. The system of claim 11, wherein the memory further bears processor-executable instructions that, when executed by the processor, cause the processor to perform operations comprising:
displaying an animation prior to displaying the representation.
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US12/907893 | 2010-10-19 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| HK1169729A true HK1169729A (en) | 2013-02-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11550471B2 (en) | Touch input cursor manipulation | |
| US12131005B2 (en) | Systems, methods, and user interfaces for interacting with multiple application windows | |
| AU2022200212B2 (en) | Touch input cursor manipulation | |
| US9207806B2 (en) | Creating a virtual mouse input device | |
| US20220100368A1 (en) | User interfaces for improving single-handed operation of devices | |
| US12277308B2 (en) | Interactions between an input device and an electronic device | |
| EP2815299B1 (en) | Thumbnail-image selection of applications | |
| US10162452B2 (en) | Devices and methods for processing touch inputs based on their intensities | |
| US8386963B2 (en) | Virtual inking using gesture recognition | |
| US20120092381A1 (en) | Snapping User Interface Elements Based On Touch Input | |
| US20120233545A1 (en) | Detection of a held touch on a touch-sensitive display | |
| CA2847177A1 (en) | Semantic zoom gestures | |
| US10067653B2 (en) | Devices and methods for processing touch inputs based on their intensities | |
| AU2011318454B2 (en) | Scrubbing touch infotip | |
| WO2014034369A1 (en) | Display control device, thin-client system, display control method, and recording medium | |
| US20170228128A1 (en) | Device comprising touchscreen and camera | |
| HK1169729A (en) | Scrubbing touch infotip |