
HK1179020B - Dynamic scaling of touch sensor - Google Patents

Dynamic scaling of touch sensor

Info

Publication number
HK1179020B
HK1179020B (application HK13106565.7A)
Authority
HK
Hong Kong
Prior art keywords
user interface
user
display screen
touch sensor
mapping
Prior art date
Application number
HK13106565.7A
Other languages
Chinese (zh)
Other versions
HK1179020A (en)
Inventor
M.C.米勒
M.施维辛格
H.根茨科
B.阿什利
J.哈里斯
R.汉克斯
A.J.格兰特
R.萨林
Original Assignee
Microsoft Technology Licensing, LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Technology Licensing, LLC
Publication of HK1179020A
Publication of HK1179020B

Description

Dynamic scaling of touch sensors
Technical Field
The present invention relates to mapping between touch sensitive devices and display devices, and more particularly to dynamic scaling of touch sensors.
Background
Many computing devices utilize touch sensors as user input devices. Input made via the touch sensor can be translated into actions on the graphical user interface in a variety of ways. For example, in some cases, a touch sensor may be used solely to track changes in finger position on a surface, for example, to control movement of a cursor. Thus, the particular location of the touch on the touch sensor does not affect the particular location of the cursor on the graphical user interface. Such interpretation of touch input may be useful, for example, for a touchpad of a laptop computer in which the touch sensor is not located directly above the display device.
In other cases, the location on the touch sensor may be mapped to a corresponding location on the graphical user interface. In this case, a touch made to the touch sensor may affect the user interface element at the particular display screen location that is mapped to that touch sensor location. Such direct mapping may be useful, for example, where a transparent touch sensor is positioned over a display.
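The two interpretations can be contrasted in a brief sketch. The coordinate values, sensor dimensions, and gain factor below are illustrative assumptions, not taken from this disclosure:

```python
def relative_update(cursor, prev_touch, touch, gain=2.0):
    """Relative (touchpad-style) tracking: only deltas in touch position
    matter, so the absolute sensor location does not affect the cursor."""
    dx, dy = touch[0] - prev_touch[0], touch[1] - prev_touch[1]
    return (cursor[0] + gain * dx, cursor[1] + gain * dy)

def absolute_update(touch, sensor, screen):
    """Absolute (direct) mapping: the sensor position itself determines
    the screen position, as with a transparent sensor over a display."""
    return (touch[0] / sensor[0] * screen[0],
            touch[1] / sensor[1] * screen[1])

# Relative: moving the finger 10 right and 5 down shifts the cursor by the
# scaled delta, wherever the finger happens to be on the pad.
print(relative_update((500, 300), (10, 10), (20, 15)))   # (520.0, 310.0)

# Absolute: the center of a 300x200 sensor lands at the center of a
# 1920x1080 screen.
print(absolute_update((150, 100), (300, 200), (1920, 1080)))  # (960.0, 540.0)
```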
Disclosure of Invention
Embodiments are disclosed that relate to dynamically scaling a mapping between a touch sensor and a display screen. For example, one disclosed embodiment provides a method that includes setting a first user interface mapping that maps an area of a touch sensor to a first area of a display screen, receiving a user input from a user input device that changes a user interaction context of the user interface, and setting a second user interface mapping that maps the area of the touch sensor to a second area of the display screen in response to the user input. The method also includes providing output to the display device representing the user input as a user interface image at a location based on the second user interface mapping.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Drawings
FIG. 1 illustrates an example embodiment of a use environment for a touch-sensitive input device.
FIG. 2 shows a flow diagram depicting an embodiment of a method of dynamically scaling a mapping of touch sensors to a display screen.
FIG. 3 illustrates an embodiment of a touch sensitive user input device including a touch sensor, and also illustrates an example first mapping of touch sensors to a display screen.
FIG. 4 illustrates a second example mapping of the embodiment of FIG. 3 based on changes in user interface context.
FIG. 5 shows another example mapping of sub-regions of a touch sensor that are mapped to corresponding sub-regions of a user interface at different screen aspect ratios.
FIG. 6 shows a block diagram of an example embodiment of a dedicated remote control user input device.
FIG. 7 shows an example of user interaction with the embodiment of FIG. 6.
FIG. 8 shows an example of another user interaction with the embodiment of FIG. 6.
FIG. 9 shows a flow diagram depicting an embodiment of a method of operating a user input device.
FIG. 10 illustrates a block diagram of an embodiment of a computing device.
Detailed Description
As described above, the touch sensor may be mapped to the graphical user interface such that a particular location on the touch sensor corresponds to a particular location on the graphical user interface. In the case where such a touch sensor is located directly above a graphical user interface, as in the case of a smartphone or notebook computer, selecting an appropriate location to make a desired touch input simply involves directly touching a surface located above the desired user interface element.
However, finding the correct location on the touch sensor to make a touch input may be difficult in situations where the touch sensor is not located directly above the graphical user interface. FIG. 1 illustrates an example embodiment of a use environment 100 in which a user 102 is utilizing a touch-sensitive device 104 to remotely interact with a user interface displayed on a separate display system, such as a display device 106 (e.g., a television or monitor) connected to a media presentation device 107, such as a video game system, personal media computer, set-top box, or other suitable computing device. Examples of touch sensitive devices that may be used as remote control devices in use environment 100 include, but are not limited to, smart phones, portable media players, notebook computers, laptop computers, and dedicated remote control devices.
In such a use environment, it may be desirable not to display an image of the user interface on the remote control device during use, to avoid a potentially disruptive user experience in which the user must look back and forth between the display screen and the remote control device. However, when the touch sensor is not in the user's direct field of view while the user views a relatively distant display screen, the user may have some difficulty quickly selecting user interface elements. To help overcome these difficulties, current touch sensitive devices may allow a user to zoom in on a portion of the user interface for greater accuracy. However, this may occlude other areas of the user interface and may also increase the complexity of interacting with the user interface.
Accordingly, embodiments disclosed herein relate to facilitating use of a touch sensitive user input device by dynamically scaling a mapping of a touch sensor to active portions of a user interface. Referring again to FIG. 1, a user 102 is shown interacting with a text input user interface 110, the text input user interface 110 including active areas (e.g., areas with user-selectable controls) in the form of a layout of letter input controls 112 and a text display and editing field 114. The active area of the user interface 110 occupies only a portion of the display screen 116 of the display device 106. Thus, if the entire touch sensor 118 of the touch sensitive device 104 is mapped to the entire display screen 116, only a portion of the touch sensor 118 will be usable to interact with the active area of the user interface 110, while other portions of the touch sensor 118 will go unused.
Thus, according to the disclosed embodiments, as the user 102 navigates to the text input user interface 110, the mapping of the touch sensor 118 to the display screen 116 may be dynamically adjusted such that a relatively large area of the touch sensor 118 is mapped to an area of the display device 106 that corresponds to the active area of the user interface 110. This may allow the user to have more precise control over the user input.
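One way to realize such a remapping is a linear scale from sensor coordinates into the currently active display region. The sketch below is a minimal illustration of the idea, not the patent's implementation; the sensor size, screen size, and region coordinates are assumed values:

```python
def make_mapping(sensor_w, sensor_h, region):
    """Return a function mapping sensor coordinates to a display region.

    `region` is (x, y, width, height) in display-screen pixels; the whole
    sensor surface is scaled linearly onto that region.
    """
    rx, ry, rw, rh = region

    def to_display(sx, sy):
        # Normalize the sensor coordinate to [0, 1], then scale into the region.
        return (rx + (sx / sensor_w) * rw,
                ry + (sy / sensor_h) * rh)

    return to_display

# First mapping: whole 300x200 sensor -> whole 1920x1080 screen.
full = make_mapping(300, 200, (0, 0, 1920, 1080))
# Second mapping (after a context change): whole sensor -> the active
# keyboard area only, so the same finger travel yields finer control.
keyboard = make_mapping(300, 200, (480, 600, 960, 400))

print(full(150, 100))      # (960.0, 540.0): center of sensor -> center of screen
print(keyboard(150, 100))  # (960.0, 800.0): same touch -> center of keyboard area
```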
In some embodiments, different areas of the touch sensor may be dynamically scaled to different degrees relative to the user interface. For example, this may allow more commonly used user interface controls to be assigned a relatively larger area on the touch sensor than less commonly used controls of similar size on the user interface. This may allow a user to select a more common control with less precise touch input than a less common control. Likewise, user interface controls whose incorrect selection has more serious consequences may be assigned a relatively smaller area on the touch sensor than similarly sized controls whose incorrect selection has less serious consequences. This may require the user to select a high-consequence action more deliberately. As a more specific example, the mapping of the touch sensor may be scaled differently for the "pause" control and the "stop" control on a media playback user interface to make the "pause" control easier to select, as accidentally selecting the "pause" control may not be as serious as accidentally selecting the "stop" control.
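A minimal sketch of this kind of differential scaling, for a single row of controls; the control names, weights, and sensor width are hypothetical values chosen for illustration:

```python
def allocate_sensor_spans(controls, sensor_w):
    """Divide a sensor row among controls in proportion to a per-control weight.

    `controls` is a list of (name, weight); a higher weight yields a larger
    share of the sensor, making that control easier to hit.
    """
    total = sum(w for _, w in controls)
    spans, x = [], 0.0
    for name, weight in controls:
        width = sensor_w * weight / total
        spans.append((name, x, x + width))
        x += width
    return spans

def hit_test(spans, sx):
    """Return the control whose sensor span contains the x coordinate."""
    for name, lo, hi in spans:
        if lo <= sx < hi:
            return name
    return spans[-1][0]

# "pause" gets triple the sensor area of "stop", even if both are the same
# size on screen, because mis-hitting "stop" is more disruptive.
spans = allocate_sensor_spans([("pause", 3.0), ("stop", 1.0)], 300)
print(hit_test(spans, 200))  # "pause": still inside the enlarged pause span
print(hit_test(spans, 250))  # "stop": only the last quarter maps to stop
```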
FIG. 2 shows a flow diagram depicting an embodiment of a method 200 of dynamically scaling a mapping of a touch sensor to a display screen of a display device. It will be understood that the method 200 may be performed by any suitable device, including but not limited to the remote control device and the media presentation device of FIG. 1. The method 200 includes, at 202, setting a first user interface mapping that maps an area of a touch sensor of a remote control device to a first area of a display device screen. The method 200 also includes receiving a first user input from a touch-sensitive user input device at 204, and providing an output to a display device representing the first user input as a first user interface image at a location based on the first user interface mapping at 206. FIG. 3 illustrates an example embodiment of a touch input and a user interface image. In the example of FIG. 3, the entire area of the touch sensor 118 is mapped to the entire area of the display screen 116 at a single aspect ratio. In this figure, it can be seen that movement of the touch input 300 between selected locations on the touch sensor 118 results in movement of a cursor 302 to an appropriate location on a user interface displayed on the display screen 116.
Continuing with FIG. 2, method 200 then includes receiving a second touch input at 208 that changes the context of the user interaction with the user interface. As used herein, "change context" or the like may refer to any change in an interactive aspect of a user interface, such as a change in a selection of a displayed control, a change in a position of a control, or the like. In FIG. 2, an example touch input is depicted as selection of the search bar shown in FIG. 3. In response to the second touch input, method 200 includes, at 210, setting a second user interface mapping that maps an area of the touch sensor to a second area of the display screen that is different from the first area of the display screen. The second region of the display screen may have a different size than the first region, as indicated at 212, may have a different location than the first region, as indicated at 214, and/or may differ from the first region in any other suitable manner. In addition, the second region of the display screen may also have a different aspect ratio than in the first mapping. Method 200 also includes, at 218, providing output representing a second user input as a second user interface image at a location based on the second user interface mapping. The second user interface image may include any other suitable information, such as a plurality of user interface controls configured to be displayed within the second region of the display screen.
FIG. 4 illustrates an example embodiment of a second mapping of areas of a touch sensor to a display screen. Instead of mapping the entire sensor area to the entire display screen at a single aspect ratio, FIG. 4 illustrates mapping the entire area of the touch sensor to the area of the display screen occupied by the active letter input control 112 and the text display and editing field 114 at a single aspect ratio, while excluding other areas of the display screen not occupied by these elements. Thus, in the depicted embodiment, the second area of the display screen is smaller than the first area of the display screen. Such mapping may allow display space for other elements, such as search results, to be included on the display screen while facilitating the entry of touch inputs by providing more touch sensor area by which to make such touch inputs. Although the change in touch sensor mapping is described herein in the context of a text input user interface, it will be understood that dynamic touch sensor mapping changes may be used in any other suitable user interface context in which additional touch input precision may be required.
As described above, in some embodiments, different areas of the touch sensor may be dynamically scaled to different degrees relative to the user interface, such that different user interface spaces may be easier or less easily located. For example, this may allow more commonly used user interface elements to be assigned a relatively larger area on the touch sensor than less commonly used controls of similar size on the user interface.
FIG. 5 illustrates an embodiment of touch sensor mapping in which a first sub-region of a display screen and a second sub-region of the display screen are mapped to the touch sensor at different aspect ratios based on their differing usage patterns. More specifically, because user interaction with letter input controls on a text entry user interface is more common than with text display and editing fields, the mapping of the touch sensor to the user interface of FIG. 5 is configured to facilitate selection of the letter input controls and to encourage more deliberate user input when selecting the text display and editing field. The first sub-region 500 of the display screen is depicted as including the letter input controls 112, while the second sub-region 502 is depicted as including the text display and editing field 114. As shown, the first sub-region 500 is mapped to a sub-region 504 of the touch sensor 118 that occupies a greater relative amount of touch sensor area than the display screen area occupied by the letter input controls 112. Likewise, the second sub-region 502 of the display screen is mapped to a sub-region 506 of the touch sensor 118 that occupies a smaller relative amount of touch sensor area than the display screen area occupied by the text display and editing field 114. In this manner, the touch sensor mapping shown in FIG. 5 may facilitate selection of the letter input controls 112 while helping to avoid inadvertent selection of the text display and editing field 114.
In some embodiments, the user interface mapping may be configured to exhibit some hysteresis as a touch input moves between sub-regions. For example, after a user's finger has crossed the border from a first sub-region of the touch sensor/user interface mapping into a second sub-region containing a user interface control, the user interface element in the second sub-region that currently has focus due to the touch input may remain in focus, even after the finger crosses the border back into the first sub-region, until the cursor has moved past the border by a threshold distance. This may require more deliberate user input to move between user interface controls and thus may help avoid inadvertent input. In other embodiments, a single border location may be used to identify switching between touch sensor sub-regions in any direction of movement. It will be appreciated that the degree of hysteresis may vary between sub-regions, similarly to the mapping of the sub-regions. For example, a greater amount of hysteresis may be applied when moving into an area where an unintended selection has more severe consequences than when moving into an area where the consequences are less severe.
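The hysteresis behavior described above might be sketched as follows; the border position, threshold distance, and region names are illustrative assumptions:

```python
class FocusTracker:
    """Track which sub-region has focus, with hysteresis at the border.

    The border sits at y = `border`; once focus is in a region, the cursor
    must cross the border by more than `threshold` before focus switches.
    """
    def __init__(self, border, threshold):
        self.border = border
        self.threshold = threshold
        self.focus = "letters"   # start focused on the letter controls

    def update(self, y):
        if self.focus == "letters" and y > self.border + self.threshold:
            self.focus = "textbox"
        elif self.focus == "textbox" and y < self.border - self.threshold:
            self.focus = "letters"
        return self.focus

t = FocusTracker(border=100, threshold=15)
print(t.update(105))  # "letters": past the border but within the threshold
print(t.update(120))  # "textbox": beyond border + threshold, focus switches
print(t.update(95))   # "textbox": back across the border, still within threshold
```

A larger `threshold` on the textbox side would implement the suggestion that higher-consequence regions demand more deliberate entry.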
As described above, dynamic scaling of a touch sensor to a user interface may be used with any suitable touch-sensitive input device, including, but not limited to, smart phones, portable media players, notebook computers, laptop computers, and dedicated remote control devices. FIG. 6 illustrates a block diagram of an embodiment of a dedicated touch-sensitive remote control device 600 configured to facilitate text entry compared to conventional touch-sensitive devices, while FIG. 7 illustrates an example use environment for the remote control device 600. The remote control device 600 includes a touch sensor 602 having at least a first touch area 604 and a second touch area 606. Further, a first actuator 608 is associated with the first touch area 604, and a second actuator 610 is associated with the second touch area 606. The first actuator 608 is configured to actuate via a press in the first touch area 604, while the second actuator 610 is configured to actuate via a press in the second touch area 606. The user may select a letter to be input by moving a cursor over the desired letter via a touch input, followed by pressing the touch area to trigger the corresponding actuator. FIG. 7 shows a first cursor 700 for the first touch area 604 and a second cursor 702 for the second touch area 606, each indicating a location of a touch input as mapped onto the display screen. In other embodiments, the dedicated remote control device may include a single actuator that is triggered via pressure on the touch-sensitive surface, or no actuator at all. In such embodiments, various heuristics may be used to detect a click-type user intent. It will also be understood that the two touch areas may be part of a single physical touch surface with no physical delineation between them, and in various applications may be mapped such that the two touch areas are treated as a single touch area.
The use of two touch areas and two actuators allows the user to independently manipulate a separate cursor with each hand, as depicted in FIG. 7, and thus may help to improve the efficiency of text entry. Furthermore, in some embodiments, the remote control device 600 may lack a display screen or other such features beyond the touch sensor. This may help prevent the user from being distracted from the display screen of the display device being controlled, and thus help focus the user's attention on the display device.
Remote control device 600 also includes a logic subsystem 612 and a data-holding subsystem 614, data-holding subsystem 614 including instructions stored thereon that are executable by logic subsystem 612 to perform various tasks, such as receiving user input and communicating user input to a media presentation system, a display system, and the like. Examples of these components are discussed in more detail below.
The use of separate first and second touch areas, each with independently operated actuators, may allow a user to quickly enter text with two thumbs or other fingers without having to lift the fingers off the surface between letter inputs. Furthermore, since the remote control device 600 may lack a display screen, the user is not distracted by looking down at the remote control device 600 during use, but instead the user may focus the full attention on the display device. These features may provide various advantages over other methods of entering text in a use environment where the touch sensor may be located a distance from the display screen and outside of a direct field of view when the user is looking at the display screen. For example, some remote control devices utilize a directional pad (e.g., controlled with up, down, left, and right commands) to move a cursor over a displayed alphanumeric keyboard layout. However, such text entry can be slow and error prone. Other remote control devices may include a hard keyboard. A hard keyboard may improve the efficiency of text entry compared to using a directional pad, but may also increase the size, complexity, and cost of the input device. The inclusion of a hard keyboard may also force the user to distract between looking down at the device and looking up at the display screen. In contrast, in the embodiment of FIG. 6, the inclusion of two actuators instead of an actuator for each button of the hard keyboard may help reduce the cost of the device. It will be appreciated that the touch sensor 602 of the remote control device 600 may be dynamically mapped to the display screen as described above, which may further facilitate text selection.
The first actuator 608 and the second actuator 610 may utilize any suitable actuation mechanism. In some embodiments, the actuators 608, 610 may comprise physical buttons that provide tactile feedback when text is selected. In other embodiments, the actuators 608, 610 may utilize pressure sensors or other actuation mechanisms. Where pressure sensors or similar actuation mechanisms are used, remote control device 600 may include a haptic feedback system 616, such as a vibration mechanism, to provide feedback to the user regarding the registered input.
In the embodiment of FIG. 7, the cursors 700, 702 indicate the location of fingers on the touch sensor 602, while other highlighting is used as a focus indicator to indicate which user interface controls are currently in focus. In the particular example of FIG. 7, the left cursor 700 is positioned to provide focus on the letter "e", and the right cursor 702 is positioned to provide focus on the letter "j". In other embodiments, the touch location and focus of the touch input may be indicated via a single user interface element.
It will be appreciated that the number of cursors displayed and the mapping of the touch sensor 602 to the display screen may depend on the number of fingers touching the touch sensor 602. For example, as depicted in FIG. 7, two cursors 700, 702 may be displayed when two fingers are touching the touch sensor 602. In this case, the first touch area 604 and the second touch area 606 of the touch sensor 602 may be mapped to corresponding first and second areas of the display screen. Likewise, a single cursor 800 may be displayed on the display screen when a single finger is touching the touch sensor 602, for example, when the remote control device 600 is held in a portrait orientation (as shown in FIG. 8). In this case, one touch area (e.g., the first touch area 604) of the touch sensor 602 may be mapped to the entire active area of the display screen.
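The touch-count-dependent behavior described above can be sketched as follows; the function name, sensor dimensions, and region coordinates are illustrative assumptions rather than the patent's exact algorithm:

```python
def choose_mappings(touch_points, sensor_w, sensor_h, active_region):
    """Pick per-cursor mappings based on the number of fingers on the sensor.

    With two or more touches, the left and right halves of the sensor each
    map to the matching half of the active display region (two cursors);
    with one touch, the whole sensor maps to the whole region (one cursor).
    Each entry pairs a sensor area with its display area, both (x, y, w, h).
    """
    rx, ry, rw, rh = active_region
    if len(touch_points) >= 2:
        half = sensor_w / 2
        return [((0, 0, half, sensor_h), (rx, ry, rw / 2, rh)),
                ((half, 0, half, sensor_h), (rx + rw / 2, ry, rw / 2, rh))]
    return [((0, 0, sensor_w, sensor_h), active_region)]

keyboard_area = (0, 600, 1920, 400)
# Two fingers -> two cursors, each confined to its half of the keyboard.
print(len(choose_mappings([(40, 50), (260, 50)], 300, 200, keyboard_area)))  # 2
# One finger (e.g., portrait orientation) -> a single cursor over the whole area.
print(len(choose_mappings([(150, 100)], 300, 200, keyboard_area)))           # 1
```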
FIG. 9 illustrates an embodiment of a method 900 of operating a remote control device, such as remote control device 600. Method 900 includes, at 902, independently detecting and tracking movement of first and second touch inputs occurring respectively in first and second areas of a touch sensor, such as first touch area 604 and second touch area 606 of touch sensor 602. Method 900 then includes, at 904, independently tracking actuation of a first actuator corresponding to the first touch area and a second actuator corresponding to the second touch area. The method 900 further includes, at 906, communicating information about the detected touch inputs and actuations to a remote computing device. When an actuation is performed by the user, the remote computing device may then perform an action corresponding to the user interface element selected based on the location of the touch input.
As mentioned above, the display systems and touch-sensitive input devices described herein, including but not limited to touch-sensitive device 104, display device 106, media presentation device 107, and remote control device 600, may each take the form of a computing system. FIG. 10 schematically illustrates a non-limiting example computing system 1000 that can perform one or more of the above-described methods and processes. Computing system 1000 is shown in simplified form. It should be understood that substantially any computer architecture may be used without departing from the scope of this disclosure. In different embodiments, the computing system 1000 may take the form of a mainframe computer, server computer, desktop computer, laptop computer, tablet computer, home entertainment computer, network computing device, mobile communication device, gaming device, or the like.
Computing system 1000 includes a logic subsystem 1002 and a data-holding subsystem 1004. Computing system 1000 may optionally include a display subsystem 1006, or may omit a display system (as described with reference to the remote control device of FIG. 6). Computing system 1000 may also include a communication subsystem 1008 for communicating with other computing devices and a sensor subsystem 1009 including touch sensors configured to detect touch inputs. Computing system 1000 may also include other input and/or output devices not described herein.
Logic subsystem 1002 may include one or more physical devices configured to execute one or more instructions. For example, logic subsystem 1002 may be configured to execute one or more instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more devices, or otherwise arrive at a desired result.
Logic subsystem 1002 may include one or more processors configured to execute software instructions. Additionally or alternatively, logic subsystem 1002 may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. The processors of logic subsystem 1002 may be single core or multicore, and the programs executed thereon may be configured for parallel or distributed processing. Logic subsystem 1002 may optionally include individual components that are distributed throughout two or more devices, which may be remotely located and/or configured for coordinated processing. One or more aspects of logic subsystem 1002 may be virtualized and executed by remotely accessible networked computing devices configured in a cloud computing configuration.
Data-holding subsystem 1004 may include one or more physical, non-transitory devices including computer-readable media configured to store data and/or instructions executable by the logic subsystem to implement the herein described methods and processes. In implementing such methods and processes, the state of data-holding subsystem 1004 may be transformed (e.g., to hold different data).
Data-holding subsystem 1004 may include removable media and/or built-in devices. Data-holding subsystem 1004 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-ray disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.) and/or magnetic memory devices (e.g., hard disk drive, floppy disk drive, tape drive, MRAM, etc.), among others. Data-holding subsystem 1004 may include devices with one or more of the following characteristics: volatile, nonvolatile, dynamic, static, read/write, read-only, random access, sequential access, location addressable, file addressable, and content addressable. In some embodiments, logic subsystem 1002 and data-holding subsystem 1004 may be integrated into one or more common devices, such as an application specific integrated circuit or a system on a chip.
FIG. 10 also illustrates an aspect of the data-holding subsystem in the form of removable computer-readable storage media 1010, which may be used to store and/or transfer data and/or instructions executable to implement the herein described methods and processes. Removable computer-readable storage medium 1010 may take the form of, inter alia, a CD, DVD, HD-DVD, blu-ray disc, EEPROM, and/or floppy disk.
It will be appreciated that data-holding subsystem 1004 includes one or more physical, non-transitory devices. In contrast, in some embodiments, aspects of the instructions described herein may propagate in a transitory fashion through a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for at least a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may propagate through a pure signal.
When included, display subsystem 1006 may be used to present a visual representation of data held by data-holding subsystem 1004. As the herein described methods and processes change the data held by the data-holding subsystem, and thus transform the state of the data-holding subsystem, the state of display subsystem 1006 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 1006 may include one or more display devices using virtually any type of technology. Such display devices may be combined in a shared enclosure with logic subsystem 1002 and/or data-holding subsystem 1004, or such display devices may be peripheral display devices.
Communication subsystem 1008 may be configured to communicatively couple computing system 1000 with one or more other computing devices. Communication subsystem 1008 may include wired and/or wireless communication devices compatible with one or more different communication protocols. By way of non-limiting example, the communication subsystem may be configured to communicate via a wireless telephony network, a wireless local area network, a wired local area network, a wireless wide area network, a wired wide area network, or the like. In some embodiments, the communication subsystem may allow computing system 1000 to send and/or receive messages to and/or from other devices via a network such as the internet.
It will be appreciated that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Also, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims (9)

1. In a computing device configured to receive input from a user input device comprising a touch sensor and to output a user interface image to a display device separate from the touch sensor, a method comprising:
setting a first user interface mapping for a first user interaction context of a user interface, the first user interface mapping a first area of a display screen of the display device including a plurality of user interface controls to a first corresponding area of the touch sensor at a first relative proportion and mapping a second area of the display screen to a second corresponding area of the touch sensor at the first relative proportion;
receiving, from the user input device, a user input to change a first user interaction context of the user interface to a second user interaction context;
setting a second user interface mapping for the second user interaction context, the second user interface mapping the first area of the display screen and each of the plurality of user interface controls to a larger corresponding area of the touch sensor than in the first user interaction context at a second relative scale that is different from the first relative scale and mapping the second area of the display screen to a smaller corresponding area of the touch sensor than in the first user interaction context at the second relative scale; and
providing output to the display device representing the user input as a user interface image at a location based on the second user interface mapping.
2. The method of claim 1, wherein the second area of the display screen is smaller than the first area of the display screen.
3. The method of claim 2, wherein the user interface image comprises a plurality of user interface controls configured to be displayed in the second area of the display screen.
4. The method of claim 3, wherein the plurality of user interface controls comprise a text entry keyboard.
5. The method of claim 1, wherein the second corresponding region of the display screen comprises a different location than the first corresponding region of the display screen.
5. The method of claim 1, wherein the second area of the display screen comprises a different location than the first area of the display screen.
7. The method of claim 6, further comprising receiving touch input data corresponding to a cursor moving onto a boundary between the first sub-region and the second sub-region, and not changing a focus of the user input until the cursor moves beyond the boundary by a threshold distance.
8. A computing device, comprising:
means for setting a first user interface mapping for a first user interaction context of a user interface, the first user interface mapping a first area of a display screen of a display device comprising a plurality of user interface controls to a first corresponding area of a touch sensor of a remote control device at a first relative scale and mapping a second area of the display screen to a second corresponding area of the touch sensor at the first relative scale;
means for receiving a first user input;
means for providing, in response to the first user input, an output to the display device representing the first user input as a first user interface image at a location based on the first user interface mapping;
means for receiving a second user input from the user input device changing the first user interaction context to a second user interaction context;
means for setting a second user interface mapping for the second user interaction context, the second user interface mapping the first area of the display screen and each of the plurality of user interface controls to a larger corresponding area of the touch sensor than in the first user interaction context at a second relative scale and mapping the second area of the display screen to a smaller corresponding area of the touch sensor than in the first user interaction context at the second relative scale; and
means for providing an output to the display device representing the second user input as a second user interface image at a location based on the second user interface mapping;
wherein the second user interface mapping comprises a first sub-region of the display screen and a second sub-region of the display screen mapped to the touch sensor at different aspect ratios.
9. The computing device of claim 8, wherein the second user interface image comprises a plurality of user interface controls configured to be displayed in the second area of the display screen.
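The claimed behavior, a touch sensor whose regions are re-mapped to display-screen regions at context-dependent relative scales (claim 1), plus a boundary threshold before focus changes (claim 7), can be illustrated with a short sketch. This Python sketch is not part of the claims or the patent's disclosed implementation; all names, the 25% control strip, and the specific sensor shares are assumptions chosen only to make the idea concrete.

```python
# Illustrative sketch of dynamic touch-sensor scaling. A context that needs
# finer control (e.g. text entry) gives the on-screen control area a larger
# share of the sensor, even though its display region stays the same size.
from dataclasses import dataclass


@dataclass(frozen=True)
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h


class TouchMapper:
    """Maps normalized touch coordinates (u, v) in [0, 1]^2 to display pixels.

    Each context pairs sensor sub-rectangles with display sub-rectangles;
    switching context changes the relative share of the sensor that each
    display region receives (the first vs. second user interface mapping).
    """

    def __init__(self, display_w: int, display_h: int):
        self.dw, self.dh = display_w, display_h
        self.regions: list[tuple[Rect, Rect]] = []  # (sensor, display) pairs

    def set_context(self, control_share: float) -> None:
        """control_share: fraction of sensor height given to the control strip.
        A text-entry context would raise this so each on-screen control maps
        to a larger sensor area (assumed values for illustration)."""
        strip_h = 0.25 * self.dh  # display strip holding the UI controls
        self.regions = [
            # content area: top of display <- top (1 - control_share) of sensor
            (Rect(0, 0, 1, 1 - control_share),
             Rect(0, 0, self.dw, self.dh - strip_h)),
            # control area: bottom display strip <- bottom control_share of sensor
            (Rect(0, 1 - control_share, 1, control_share),
             Rect(0, self.dh - strip_h, self.dw, strip_h)),
        ]

    def to_display(self, u: float, v: float) -> tuple[float, float]:
        for sensor, disp in self.regions:
            if sensor.contains(u, v):
                return (disp.x + (u - sensor.x) / sensor.w * disp.w,
                        disp.y + (v - sensor.y) / sensor.h * disp.h)
        raise ValueError("touch outside sensor")


def update_focus(current: str, cursor_y: float,
                 boundary_y: float, threshold: float) -> str:
    """Claim 7-style hysteresis: focus flips only once the cursor passes the
    boundary between sub-regions by more than `threshold` pixels."""
    if current == "top" and cursor_y > boundary_y + threshold:
        return "bottom"
    if current == "bottom" and cursor_y < boundary_y - threshold:
        return "top"
    return current
```

Note the design point the claims turn on: after `set_context(0.8)` the control strip still occupies the same display pixels, but 80% of the sensor now maps to it, so each control is easier to hit; the content area absorbs the corresponding loss of sensor area.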
HK13106565.7A 2011-11-23 2013-06-04 Dynamic scaling of touch sensor HK1179020B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/304,093 2011-11-23

Publications (2)

Publication Number Publication Date
HK1179020A HK1179020A (en) 2013-09-19
HK1179020B true HK1179020B (en) 2018-01-12
