
HK1180058A - User interface for editing a value in place - Google Patents


Info

Publication number
HK1180058A
HK1180058A (application HK13107051.6A)
Authority
HK
Hong Kong
Prior art keywords
value
display
user interface
swipe gesture
interface element
Prior art date
Application number
HK13107051.6A
Other languages
Chinese (zh)
Inventor
B.E. Rampson
K.X. Cheng
Original Assignee
Microsoft Technology Licensing, LLC
Priority date
Filing date
Publication date
Application filed by Microsoft Technology Licensing, LLC
Publication of HK1180058A

Description

User interface for in-place editing of values
Technical Field
The present invention relates to graphical user interfaces and, more particularly, to user interface techniques for editing values in place.
Background
When working on many mobile computing devices (e.g., smart phones, tablet devices), the available screen real estate and input devices are often limited, which can make editing displayed content challenging for many users. For example, in addition to having a limited-size display, many devices use a soft input panel (SIP) instead of a physical keyboard. Displaying the SIP may consume a large amount of the limited screen space, leaving little room for showing the information associated with applications on the computing device.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
User interface elements for in-place editing of a value within a document are displayed. For example, in response to selection of a value, a user interface that receives a swipe gesture for adjusting the value in place may be displayed near the value. The user interface may be configured to select a different portion of the value in response to a change in the level of the swipe gesture. For example, a user may move the swipe gesture from a level for adjusting a day value to a level representing a year value. The displayed user interface and the method for adjusting the value may be based on the type of the value and on the structure and content of the document.
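By way of example and not limitation, the following TypeScript sketch models the idea described above: a value is split into portions (e.g., day/month/year), each portion is mapped to one level of the user interface, and the swipe gesture selects a level and adjusts the portion on that level. The names (ValuePortion, InPlaceEditor, moveToLevel, adjustActivePortion) and the clamping behavior are assumptions made for the sketch and are not taken from this disclosure.

```typescript
// Minimal sketch: a value split into independently adjustable portions,
// one portion per UI level. Names and behavior are illustrative assumptions.
interface ValuePortion {
  name: string;      // e.g. "day", "month", "year"
  current: number;   // currently selected value for this portion
  min: number;
  max: number;
}

interface InPlaceEditor {
  portions: ValuePortion[];  // one UI level per portion
  activeLevel: number;       // level the swipe gesture is currently on
}

// Moving the swipe gesture vertically changes the active level.
function moveToLevel(editor: InPlaceEditor, level: number): void {
  editor.activeLevel = Math.max(0, Math.min(level, editor.portions.length - 1));
}

// Moving the swipe gesture horizontally adjusts the portion on the active level.
function adjustActivePortion(editor: InPlaceEditor, delta: number): void {
  const p = editor.portions[editor.activeLevel];
  p.current = Math.max(p.min, Math.min(p.max, p.current + delta));
}
```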
Drawings
FIG. 1 illustrates an exemplary computing environment;
FIG. 2 illustrates a system that includes in-place editing of values using a user interface;
FIG. 3 illustrates a process for in-place selection and adjustment of values using user interface elements;
FIG. 4 illustrates a process for selecting and adjusting different portions of a value;
FIG. 5 illustrates a display for in-place adjustment of values within a spreadsheet;
FIG. 6 illustrates a display for in-place adjustment of values within a spreadsheet;
FIG. 7 illustrates a display for in-place adjustment of values within a document; and
FIG. 8 illustrates a display for in-place adjustment of values within a document using a slide UI element.
Detailed Description
Embodiments will now be described with reference to the drawings, wherein like reference numerals represent like elements. In particular, FIG. 1 and the corresponding discussion are intended to provide a brief, general description of a suitable computing environment in which embodiments may be implemented.
Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Other computer system configurations may also be used, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. Distributed computing environments may also be used where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Referring now to FIG. 1, an illustrative computing environment for a computer 100 utilized in the various embodiments will be described. The computing environment shown in FIG. 1 includes computing devices that may each be configured as a mobile computing device (e.g., a phone, a tablet, a netbook, a laptop), a server, a desktop computer, or some other type of computing device, and that each include a central processing unit 5 ("CPU"), a system memory 7 including a random access memory 9 ("RAM") and a read only memory ("ROM") 10, and a system bus 12 that couples the memory to the CPU 5.
A basic input/output system containing the basic routines that help to transfer information between elements within the computer, such as during startup, is stored in the ROM 10. The computer 100 also includes a mass storage device 14, the mass storage device 14 for storing an operating system 16, applications 24 (e.g., productivity applications, web browsers, etc.), and a user interface manager 26, which will be described in greater detail below.
The mass storage device 14 is connected to the CPU 5 through a mass storage controller (not shown) connected to the bus 12. The mass storage device 14 and its associated computer-readable media provide non-volatile storage for the computer 100. Although the description of computer-readable media contained herein refers to a mass storage device, such as a hard disk or CD-ROM drive, the computer-readable media can be any available media that can be accessed by the computer 100.
By way of example, and not limitation, computer-readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data. Computer storage media includes, but is not limited to, RAM, ROM, erasable programmable read-only memory ("EPROM"), electrically erasable programmable read-only memory ("EEPROM"), flash memory or other solid state memory technology, CD-ROM, digital versatile disks ("DVD") or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer 100.
The computer 100 operates in a networked environment using logical connections to remote computers through a network 18, such as the internet. The computer 100 may connect to the network 18 through a network interface unit 20 connected to the bus 12. The network connection may be wireless and/or wired. The network interface unit 20 may also be used to connect to other types of networks and remote computer systems. The computer 100 may also include an input/output controller 22 for receiving and processing input from a number of other devices, including a keyboard, mouse, touch input device, or electronic stylus (not shown in FIG. 1). Similarly, an input/output controller 22 may provide input/output for a display screen 23, a printer, or other type of output device.
The touch input device may utilize any technology that allows for the recognition of single/multi-touch inputs (touch/no-touch). For example, these techniques may include, but are not limited to: heat, finger pressure, high capture rate cameras, infrared light, optical capture, tuned electromagnetic induction, ultrasonic receivers, sensing microphones, laser rangefinders, shadow capture, and the like. According to one embodiment, the touch input device may be configured to detect a proximity touch (i.e., within a certain distance from the touch input device, but without physical contact with the touch input device). The touch input device may also act as a display. An input/output controller 22 may also provide output to one or more display screens 23, a printer, or other type of input/output device.
The camera and/or some other sensing device may be operable to record one or more users and capture motions and/or gestures made by a user of the computing device. The sensing device may also be operable to capture words, such as words dictated by way of a microphone, and/or to capture other input from the user, such as by a keyboard and/or mouse (not depicted). The sensing device may comprise any motion detection device capable of detecting movement of a user. For example, the camera may include a Microsoft Windows® motion capture device comprising a plurality of cameras and a plurality of microphones.
Embodiments of the invention may be practiced with a system on a chip (SOC) in which each or many of the components/processes shown in the figures may be integrated onto a single integrated circuit. Such SOC devices may include one or more processing units, graphics units, communication units, system virtualization units, and various application functions, all integrated (or "burned") onto a chip substrate as a single integrated circuit. When operating via an SOC, all or a portion of the functionality described herein with respect to unified communications may be operated via application specific logic integrated with other components of computing device/system 100 on a single integrated circuit (chip).
As mentioned briefly above, a number of program modules and data files may be stored in the mass storage device 14 and the RAM 9 of the computer 100, including an operating system 16 suitable for controlling the operation of the computer, such as a WINDOWS PHONE® or other WINDOWS® operating system from MICROSOFT CORPORATION of Redmond, Washington. The mass storage device 14 and RAM 9 may also store one or more program modules. In particular, the mass storage device 14 and the RAM 9 may store one or more application programs, such as a spreadsheet application, a word processing application, and/or other applications. According to one embodiment, the MICROSOFT OFFICE suite of applications is included. The applications may be client-based and/or web-based. For example, web services 27 may be used, such as MICROSOFT WINDOWS LIVE, MICROSOFT OFFICE 365, or some other network-based service.
User interface manager 26 is configured to present user interface elements (e.g., UI 28) to edit/adjust values in place. User interface manager 26 may be external to an application (e.g., a spreadsheet application or some other application) as shown, or may be part of an application. Further, all/some of the functionality provided by user interface manager 26 may be located internal/external to the application in which the user interface elements are used to edit values in place. More details regarding the user interface manager are disclosed below.
FIG. 2 illustrates a system that includes in-place editing of values using a user interface. As shown, system 200 includes an application 210, a User Interface (UI) manager 26, and a touch screen input device/display 215.
To facilitate communication with UI manager 26, one or more callback routines may be implemented. According to one embodiment, the application 210 is a commercial productivity application configured to receive input from the touch-sensitive input device 215 and/or keyboard input (e.g., physical keyboard and/or SIP). For example, UI manager 26 may provide information to application 210 in response to selection of a value by a user gesture (i.e., a finger on hand 230) and a swipe gesture on user interface element 216 to adjust the selected value. The term "swipe gesture" may include a swipe action and/or a drag action.
The illustrated system 200 includes a touch screen input device/display 215 that detects when a touch input (e.g., a finger touch or near touch to the touch screen) is received. Any type of touch screen that detects a touch input by a user may be utilized. For example, a touch screen may include one or more layers of capacitive material that detect touch inputs. Other sensors may be used in addition to or in place of capacitive materials. For example, an Infrared (IR) sensor may be used. According to an embodiment, the touch screen is configured to detect an object in contact with or above the touchable surface. Although the term "above" is used in this specification, it should be understood that the orientation of the touch panel system is irrelevant. The term "above" is intended to be applicable to all such orientations. The touch screen may be configured to determine a location (e.g., a start point, an intermediate point, and an end point) at which the touch input is received. The actual contact between the touchable surface and the object may be detected by any suitable means, including, for example, a vibration sensor or microphone coupled to the touch panel. A non-exhaustive list of examples of sensors for detecting contact includes: pressure-based mechanisms, micromechanical accelerometers, piezoelectric devices, capacitive sensors, resistive sensors, inductive sensors, laser vibrometers, and LED vibrometers.
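A minimal sketch of tracking a swipe gesture as a start point, intermediate points, and an end point is shown below, assuming a browser environment and the standard Pointer Events API; the function and callback names are hypothetical.

```typescript
// Illustrative only: report the start, intermediate, and end points of a swipe.
interface Point { x: number; y: number; }

function trackSwipe(
  element: HTMLElement,
  onMove: (start: Point, current: Point) => void,
  onEnd: (start: Point, end: Point) => void,
): void {
  let start: Point | null = null;

  element.addEventListener("pointerdown", (e) => {
    start = { x: e.clientX, y: e.clientY };
    element.setPointerCapture(e.pointerId); // keep receiving events during the drag
  });

  element.addEventListener("pointermove", (e) => {
    if (start) onMove(start, { x: e.clientX, y: e.clientY });
  });

  element.addEventListener("pointerup", (e) => {
    if (start) onEnd(start, { x: e.clientX, y: e.clientY });
    start = null;
  });
}
```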
UI manager 26 is configured to display UI elements for editing a selected value in place and to process the received input. A user interface element 216 for in-place editing of the value is displayed in response to the value being selected. For example, in response to selecting the spreadsheet cell 232 that includes a date value, the UI element 216 is displayed near the value, and the UI element 216 receives a swipe gesture for adjusting the value in place. As shown, the selected cell is distinguished from other cells by changing the fill color of the selected cell and by highlighting the portion of the value that is currently being edited/adjusted. Other methods may be used to distinguish the selected cell (e.g., a border around the cell, a different fill pattern, a display of the changing value, etc.). In the current example, in addition to performing the swipe gesture, the user may tap the "+" or "-" indicator to change the value. The user interface displays a different level corresponding to each different portion of the date value. As shown, the user has performed a swipe gesture to a first level, which currently shows a change from "15" to "16". When the user terminates the swipe gesture, the value within the cell is set to the currently selected value. The value in the cell is also updated in real time as the swipe gesture is performed. To change the year or month portion of the date value, the user moves the swipe gesture from the day level to the desired level. The displayed user interface and the method for adjusting the value may be based on the type of the value and on the structure and content of the document. For example, the contents of cells in the vicinity of the selected cell may be used to determine the type of value, predicted values, and so on.
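The date-cell behavior described above — a live preview while the swipe is in progress and a commit of the value when the gesture ends — could be sketched as follows. DateCellEditor, its callbacks, and the render function are assumptions made for the example, not elements of this disclosure.

```typescript
// Hedged sketch of the described behavior: live preview during the swipe,
// commit on release. Names are illustrative.
type DatePart = "day" | "month" | "year";

class DateCellEditor {
  private preview: Date;

  constructor(private committed: Date,
              private render: (d: Date, editing: boolean) => void) {
    this.preview = new Date(committed);
  }

  // Called repeatedly while the swipe gesture is in progress.
  onSwipe(part: DatePart, delta: number): void {
    const d = new Date(this.preview);
    if (part === "day") d.setDate(d.getDate() + delta);
    if (part === "month") d.setMonth(d.getMonth() + delta);
    if (part === "year") d.setFullYear(d.getFullYear() + delta);
    this.preview = d;
    this.render(this.preview, true);   // update the cell in real time
  }

  // Called when the gesture ends: the previewed value becomes the cell value.
  onRelease(): void {
    this.committed = new Date(this.preview);
    this.render(this.committed, false);
  }
}
```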
FIGS. 3 and 4 show illustrative processes for displaying and interacting with user interface elements for in-place editing of values. When reading the discussion of the routines provided herein, it should be appreciated that the logical operations of various embodiments are implemented (1) as a sequence of computer implemented acts or program modules running on a computing system and/or (2) as interconnected machine logic circuits or circuit modules within the computing system. The implementation is a matter of choice dependent on the performance requirements of the computing system implementing the invention. Accordingly, the logical operations illustrated and making up the embodiments described herein are referred to variously as operations, structural devices, acts or modules. These operations, structural devices, acts and modules may be implemented in software, in firmware, in special purpose digital logic, and any combination thereof.
FIG. 3 illustrates a process for in-place selection and adjustment of values using user interface elements.
After a start block, process 300 moves to operation 310 where the values are displayed within a graphical window. The value may be associated with one or more applications. For example, the application may be an office productivity application (e.g., spreadsheet, word processing, presentation, etc.) or some other type of application. The value may be displayed within the document and/or within a user interface for setting the value. According to one embodiment, the values to be adjusted are displayed in a document such as a spreadsheet, word processing document, table, or the like.
Proceeding to operation 320, a determination is made as to when to select a value for editing. Various methods may be used to select the value. For example, the user may select a cell by: tapping on a value/cell/user interface option, moving a pointer (e.g., mouse, pen) over a value/cell/user interface option, and moving a pointer (e.g., mouse, pen) over a value/cell/user interface option and clicking, etc.
Moving to operation 330, a determination is made as to what type of value is selected. The value may comprise a single portion or more than one portion (e.g., a date, a social security number, a complex number, and so on). The type of value may be a numerical value or a value selected from a group of values. For example, the value may be a day, a month, a type of item, a value selected from a row/column, and the like. The type of value may be determined from the types of values allowed by the element/location/field within the document/table, and/or may be determined from the content surrounding the selected value. For example, one or more cells within the spreadsheet may be examined to determine the type of the values in cells near the selected cell (e.g., it may be determined that the selected cell, even if it is currently empty, should contain a date because it is near a cell containing the current year "2011"). In general, the type of value may be determined from any values displayed in the vicinity of the selected value. For example, the values may be from the same row/column as the selected value.
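One possible way to infer the type of a selected cell from its neighbors, as described above, is sketched below in TypeScript; the heuristics (check the cell's own text first, then fall back to the most common type among neighboring cells) are assumptions, not the method required by this disclosure.

```typescript
// Illustrative heuristic only: classify a cell, falling back to its neighbors.
type ValueType = "number" | "date" | "choice";

function classify(text: string): ValueType | null {
  if (text.trim() === "") return null;
  if (!isNaN(Number(text))) return "number";
  if (!isNaN(Date.parse(text))) return "date";
  return "choice";                      // e.g. an item picked from a fixed set
}

function inferType(selected: string, neighbours: string[]): ValueType {
  const own = classify(selected);
  if (own !== null) return own;
  // Empty cell: use the most common type among its neighbours
  // (e.g. a cell in a column of dates is likely to hold a date).
  const counts = new Map<ValueType, number>();
  for (const n of neighbours) {
    const t = classify(n);
    if (t !== null) counts.set(t, (counts.get(t) ?? 0) + 1);
  }
  let best: ValueType = "choice";
  let bestCount = 0;
  for (const [t, c] of counts) {
    if (c > bestCount) { best = t; bestCount = c; }
  }
  return best;
}
```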
Transitioning to operation 340, a user interface element for editing the value in place is displayed. The UI element may be displayed in response to a selection, in response to an action indicating that the cell/value/field is to be edited (e.g., a general editing action), in response to an action launching the UI element through some other UI, and so forth. The UI element is displayed proximate to the display of the value such that input to adjust the value is received near the display of the value. According to an embodiment, the UI element includes a display of a line extending outward from each side of the value (see FIG. 8). According to another embodiment, the UI appears adjacent to the display of the value and includes a level for each portion of the value displayed (see FIGS. 4-7). According to an embodiment, the UI element is alpha-blended such that the content displayed underneath the UI element remains visible.
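A minimal sketch of displaying the UI element next to the value with alpha blending, assuming a DOM/CSS environment; the 4-pixel offset and the 0.6 opacity are arbitrary choices for the example.

```typescript
// Illustrative only: place the editing element just below the value and make
// its background semi-transparent so the content underneath stays visible.
function showEditorNear(valueEl: HTMLElement, editorEl: HTMLElement): void {
  const rect = valueEl.getBoundingClientRect();
  editorEl.style.position = "fixed";
  editorEl.style.left = `${rect.left}px`;
  editorEl.style.top = `${rect.bottom + 4}px`;              // just below the value
  editorEl.style.background = "rgba(255, 255, 255, 0.6)";   // alpha-blended
  document.body.appendChild(editorEl);
}
```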
Proceeding to operation 350, a swipe gesture is received to adjust the value. The swipe gesture may be a touch input and/or an input received using another input device (e.g., a mouse). According to an embodiment, the swipe gesture is a touch input that detects a swipe of a finger (e.g., a horizontal/vertical movement from a start point to an end point). The swipe gesture can adjust the value at different speeds based on the position of the swipe relative to the selected value. For example, if the swipe is near the value, the value may be incremented by a single step, while when the swipe is farther away, the value may be incremented by a multiple (e.g., two, four, ten) depending on the distance from the value. The range of possible values may also be used to determine the speed of adjustment. For example, when the range of values is small (e.g., one to ten), the adjustment of the value may be slow. When the range of values is large (e.g., one thousand to one hundred thousand), the adjustment of the value may be faster and/or the multiples may be larger. The same swipe gesture may be used to set one or more portions of the value. For example, a user may place their finger down on the UI element, move to a first level and move left/right to set the value within the first level, then move up to another level without lifting their finger, and then move left/right to set the value for that level. According to an embodiment, as long as the same swipe gesture is detected (e.g., the fingertip remains down on the display), the user may continue to adjust different portions of the value by moving to a different level and selecting the value for that level.
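The speed behavior described above — larger increments when the swipe is farther from the value and when the range of possible values is larger — might be computed as in the following sketch; the distance thresholds and the range factor are assumptions.

```typescript
// Illustrative only: increment applied per unit of horizontal movement.
function incrementStep(distanceFromValuePx: number, rangeSize: number): number {
  let step = 1;
  if (distanceFromValuePx > 100) step = 10;      // far from the value: bigger jumps
  else if (distanceFromValuePx > 50) step = 4;
  else if (distanceFromValuePx > 25) step = 2;

  // Large ranges (e.g. 1,000 to 100,000) adjust faster than small ones (1 to 10).
  const rangeFactor = Math.max(1, Math.floor(Math.log10(Math.max(rangeSize, 1))));
  return step * rangeFactor;
}

// Example: incrementStep(120, 100000) === 10 * 5 === 50
```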
Moving to operation 360, the display is updated with the adjusted value. The display of the value may be updated during receipt of the swipe gesture and/or after completion of the swipe gesture. For example, the value may be shown as changing when the user is performing a swipe gesture. According to an embodiment, the display of the UI element is updated to reflect the currently adjusted value, without updating the display of the selected value.
Transitioning to operation 370, the adjusted value is set at the end of the swipe gesture (e.g., when the user removes their finger from the display, or when the user deselects the value using some other input method).
The process then flows to an end block and returns to processing other actions.
Fig. 4 shows a process for selecting and adjusting different parts of a value.
After a start block, process 400 moves to operation 410, where the different portions of the value are determined. The value may comprise one or more portions. For example, an integer value contains one portion, while a date value and a social security number each contain three different portions. In general, a portion of a value may be adjusted independently of the other portions of the value. Values may also be divided into portions based on digit significance. Other divisions of a value into portions may also be determined. For example, the first portion of the value may be the ones column, the second portion may be the tens column, the third portion may be the hundreds column, and so on.
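For illustration, the decomposition described in operation 410 could look like the following sketch, which splits ISO-style dates and social-security-style values into three parts and treats everything else as a single portion; the formats handled are assumptions made for the example.

```typescript
// Illustrative only: split a value into independently adjustable portions.
function splitValue(value: string): string[] {
  if (/^\d{4}-\d{2}-\d{2}$/.test(value)) {   // ISO-style date, e.g. "2011-09-15"
    return value.split("-");                 // [year, month, day]
  }
  if (/^\d{3}-\d{2}-\d{4}$/.test(value)) {   // social-security-style value
    return value.split("-");                 // three parts
  }
  return [value];                            // single-portion value (e.g. an integer)
}

// Example: splitValue("2011-09-15") returns ["2011", "09", "15"]
```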
Proceeding to decision operation 420, a determination is made as to whether the value has more than one portion.
When the value does have more than one portion, the process proceeds to operation 430, where a UI element is displayed that shows each portion of the value at a different level of the UI element.
When the value does not have more than one portion, the process proceeds to 440, where the UI element is displayed with a single level display.
Moving from operation 430 or operation 440 to operation 450, a swipe gesture is received.
Transitioning to operation 460, a determination is made as to what level the swipe is at. For example, a swipe at the first level adjusts the portion of the value that is associated with that level. As discussed above, more than one portion of the value may be set using the same swipe gesture.
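Assuming the levels are stacked vertically with a fixed height (an assumption made only for this sketch), the level of the swipe could be derived from the vertical offset of the gesture as follows:

```typescript
// Illustrative only: map the vertical position of the swipe to a level index.
const LEVEL_HEIGHT_PX = 40;   // assumed height of one level

function levelAt(yOffsetFromValuePx: number, levelCount: number): number {
  const level = Math.floor(Math.abs(yOffsetFromValuePx) / LEVEL_HEIGHT_PX);
  return Math.min(level, levelCount - 1);
}

// Example: with three levels (day/month/year), levelAt(95, 3) === 2,
// so a swipe 95 pixels away from the value adjusts the third portion.
```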
Proceeding to operation 470, the display of values is updated in response to the swipe gesture.
The process then flows to an end block and returns to processing other actions.
5-8 illustrate exemplary windows for adjusting values in-place using a UI that is displayed in response to selecting a value. Fig. 5-8 are for exemplary purposes, and are not limiting.
FIG. 5 illustrates a display for adjusting values in place within a spreadsheet. As shown, window 510 and window 520 each display a spreadsheet 512 showing a Name column, a GPA column, and an Exam Date column, in which the user has selected cell 520 to adjust the current value of "2.6" in place using the UI element. More or fewer columns/regions including values may be included within windows 510 and 520. The window may be a window associated with a desktop application, a mobile application, and/or a web-based application (e.g., displayed via a browser). The window may be displayed on a limited-display device (e.g., a smartphone, a tablet device) or on a larger screen device.
As shown, the selected cell 520 is displayed differently from the other cells of the spreadsheet to indicate to the user that the cell is currently selected. Although the cell 520 is shown highlighted, other display options may also be used to indicate that the cell is selected (e.g., a border around the cell, hashing, a color change, a font change, etc.).
In response to determining that the cell 520 is to be edited (e.g., a selection, an edit action, selection of another UI element), the UI element 514 is displayed. In the current example, two levels are displayed within the UI element 514, since the GPA value includes two portions. According to one embodiment, a default portion is selected for adjusting the value. For example, the second portion of the GPA value is selected by default as the portion of the value that is displayed on the first level. Turning to the UI element 514, the first level shows the values 4, 5, 6, 7, and 8, while the second level shows the values 1, 2, and 3. More or fewer possible values may be displayed at each level. For example, the second level of GPA values may show all possible values (0-4). In the current example, the user has selected cell 520 by tapping it and has then dragged their finger to the right to select "7" as the adjusted value. In response to the swipe gesture, the currently adjusted value is indicated within the UI element 514 using a graphical indicator. In this example, the current value based on the current swipe gesture is shown larger. Other methods of indicating the current value may also be used (e.g., changing the font, drawing a border around the value, changing the color of the value, etc.). When the user ends the swipe gesture (e.g., removes their finger from the display, releases the mouse button, etc.), the value is adjusted in the cell. If the user were to release at the current point in this example, 2.7 would replace 2.6. The user may move farther left/right in the UI element 514 to select a value not originally shown. For example, as the user moves past the value "7" or some other determined value on the first level, additional values may be shown within the UI element 514 (e.g., 9, 10, etc.). Similarly, when the user moves to the left, lower values may be shown in the UI element 514 (e.g., 3, 2, 1).
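The behavior of revealing additional candidate values as the swipe moves past the end of the initially displayed ones could be sketched as a sliding window over the allowed range; the window size of five is an assumption made for the example.

```typescript
// Illustrative only: candidates shown at one level, centered on the current value.
function visibleCandidates(center: number, min: number, max: number,
                           windowSize = 5): number[] {
  const half = Math.floor(windowSize / 2);
  const start = Math.max(min, Math.min(center - half, max - windowSize + 1));
  const out: number[] = [];
  for (let v = start; v <= Math.min(max, start + windowSize - 1); v++) {
    out.push(v);
  }
  return out;
}

// Example: visibleCandidates(7, 0, 9) returns [5, 6, 7, 8, 9]; as the center
// moves, the window slides so that adjacent values come into view.
```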
Window 520 shows the user adjusting the second portion of the GPA value. To do so, the user has moved the swipe gesture to the second level and then to the right, selecting the value 3 for the GPA.
According to an embodiment, the contents of the cell and of the surrounding cells (in this case the GPA column) are used to help determine the possible values that may be included within the cell. This information may be used in determining how many levels and how many potential values to display within the UI element. As shown, the UI element 514 is displayed with alpha blending such that the portion of the content below the UI element remains visible. Different methods (e.g., no alpha blending, different colors, etc.) may also be used to display the UI element 514. The UI element 514 may be displayed in a different location. For example, the UI element 514 may be displayed to the left of, to the right of, and/or above the display of the selected cell.
FIG. 6 illustrates a display for adjusting values in place within a spreadsheet. As shown, window 610 and window 620 each include a spreadsheet that currently shows a Grade column, a Sex column, and a Siblings column. In the current example, the Grade column may include a value selected from freshman (Fr), sophomore (So), junior (Jr), and senior (Sr). The Sex column may include a male (M) value or a female (F) value. The Siblings column may include values starting at 0 and increasing.
Window 610 shows the user selecting cell D6, where the selection of the cell is indicated by displaying a thicker border around cell D6. In response to the selection, a UI element 612 is displayed, showing four different values that may be selected for the grade value. Fewer possible values (e.g., one, two, three) may be shown in the UI element 612. In the current example, the user has performed a swipe gesture to select a value for the blank cell D6; the currently selected value is shown larger in the UI element 612. As can be seen in window 620, in response to ending the swipe gesture, cell D6 is updated to "Sr" (senior). Prior to the selection, cell D6 may or may not contain a value.
Window 620 shows the user selecting cell F6, where the selection of the cell is indicated by displaying a fill within cell F6. In response to the selection, a UI element 622 is displayed, showing the possible values that may be selected for the Siblings value. In the current example, the user has performed a swipe gesture to currently select the value 3 (displayed within a border) for cell F6. As discussed above, if the user moves within the UI element past or near the end of the initially displayed values, more values may be displayed. For example, the UI element 624 may be displayed when the user moves near or past the end of the initially displayed values. As shown, when the user moves to the value "8" within the UI element 622, the values 9, 10, and 11 are displayed.
FIG. 7 illustrates a display for adjusting values in place within a document. As shown, window 710, window 720, window 730, and window 740 each include a display of a document containing values that can be selected to change the values in place.
Window 710 shows the user selecting a social security number within the document. According to an embodiment, any different value within the document may be selected.
The window 720 shows a UI element 722 that is displayed in response to selection of the social security number. As shown, the UI element 722 includes a level displaying possible values for each portion of the social security number. The value may be divided into different portions. For example, a level may be displayed for each digit of the number, or for different chunks of the value (e.g., the "2211" portion of the displayed value may be shown as two different levels, each level having two digits). In the current example, the third portion of the social security number is highlighted to indicate that it is the currently selected portion of the value for receiving the adjustment. The value 2210 is selected by the user moving the swipe gesture to the left. Moving farther to the left causes the UI element 722 to adjust the display of possible values (e.g., 2209, 2208, 2207, etc.). The user may select a different level by moving the swipe gesture to the desired level.
Window 730 shows the user selecting the integer value 2 in the document. In response to selecting the integer value, the UI element 732 is displayed with the possible values for selection.
Window 740 shows the user selecting a brand value of B1. For example, the value may be a car type that includes a limited number of possible values.
FIG. 8 illustrates a display for in-place adjustment of values within a document using a slide UI element.
Displays 808, 810, 811, 812, 814, and 816 show the user adjusting values in place using a slide UI element.
Display 808 shows the value 2 before being selected for in-place editing.
Display 810 shows an initial display of the UI element that is displayed in response to selecting the value of 2. As shown, the lines are placed to the left and right of the value for editing in place. Different methods may be used to display the line. For example, a portion of the line to the left of the number may be displayed using a first color, while a portion of the line to the right of the number may be displayed using a different color. A frame may be displayed (e.g., display 811) to show a slider, etc.
Display 812 shows the user sliding the value 2 to the right so that the current value is 6. According to one embodiment, the value itself may be moved along the line. According to an embodiment, the initial value may remain at the initial position, while the currently edited value is shown along the slider line. According to a further embodiment, the value is updated at the initially displayed position in response to the swipe gesture. For example, display 817 shows the user sliding their finger to the right of the number, and the value is updated in response to the slide. The value may be updated based on the distance of the gesture from the number and/or the speed of the movement away from the number. For example, the farther and/or faster the gesture moves, the faster the number changes. The value may stop changing in response to different actions. For example, the user may end the gesture by moving their finger away from the display or by moving their finger back to the position of the original display, as shown in display 818.
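The "keep adjusting while the finger is held to the side" behavior shown in displays 817 and 818 could be approximated with a periodic update whose step grows with the distance of the finger from the number, as in this sketch; the tick rate, dead zone, and scaling are assumptions made for the example.

```typescript
// Illustrative only: periodically adjust the value while the gesture is held
// to one side of the number; farther away means a larger step per tick.
function startAutoAdjust(
  getOffsetPx: () => number,          // finger x minus the number's x position
  apply: (delta: number) => void,     // applies one adjustment step
): () => void {
  const timer = setInterval(() => {
    const offset = getOffsetPx();
    if (Math.abs(offset) < 5) return; // finger back over the number: no change
    const step = Math.sign(offset) * Math.ceil(Math.abs(offset) / 50);
    apply(step);
  }, 100);                            // assumed tick rate of 100 ms
  return () => clearInterval(timer);  // call this when the gesture ends
}
```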
Display 814 shows the user sliding the value back to the left to the current value of 1.
In response to the user ending the swipe gesture, the display 816 shows the value 1 as the set value. Ending the swipe gesture causes the last edited value to appear to "snap" into place (i.e., the value returns to the original display position).
Window 820, window 830, and window 840 each show a user interface configured to receive selections for setting options for searching for hotels. The user interface may be configured for other applications and to receive other values. As shown, window 820 shows options for selecting a hotel, setting a check-in date, setting a check-out date, setting a number of guests, setting a number of rooms, and searching. In the current example, each option that sets a value may be set using a UI element as described and illustrated herein.
For example, the user may select the guests value to adjust the number of guests using a slide UI element. The UI elements shown in FIGS. 4-7 may also be used, and combinations of UI elements may be used together.
Window 830 shows the user sliding the value to 4. In response to releasing the value and ending the swipe gesture, the value 4 is set to the adjusted value.
The above specification, examples and data provide a complete description of the manufacture and use of the composition of the invention. Since many embodiments of the invention can be made without departing from the spirit and scope of the invention, the invention resides in the claims hereinafter appended.

Claims (10)

1. A method for receiving input to adjust a value, comprising:
displaying a value within a graphical display;
determining when a value is selected;
in response to determining to edit the value, displaying a user interface element near the value, the user interface element displaying an indicator that is used with the swipe gesture to adjust the value to a new value determined from the different values;
receiving a swipe gesture to adjust the value;
adjusting display of the user interface element and display of the value in situ in response to the swipe gesture when the swipe gesture is received; and
the value is set to the adjusted value.
2. The method of claim 1, wherein the value appears to slide along a line in response to the swipe gesture.
3. The method of claim 1, wherein displaying the user interface element comprises: possible values are displayed near the display of the value, including the possible values for each settable portion of the value, with each portion at a different level.
4. The method of claim 1, wherein receiving the swipe gesture comprises: a level of the swipe is determined, and a portion of the value corresponding to the level of the swipe is adjusted.
5. The method of claim 1, wherein displaying the possible values comprises: displaying the possible values as an alpha blend such that the portion of the display that lies below the display of the possible values remains visible.
6. The method of claim 1, wherein adjusting the display of the user interface element and the display of the value in response to the swipe gesture comprises: a distance of the current swipe position from the value is determined, and the speed of change of the value is adjusted as the distance increases.
7. A computer-readable medium storing computer-executable instructions for in-place editing of a value, comprising:
displaying a value within a graphical display;
determining when a value is selected;
in response to determining that the value is selected, graphically indicating selection of the value, displaying a user interface element in proximity to the value, the user interface element displaying a possible value in proximity to the selected value and configured to receive a swipe gesture to adjust the value to a new value determined from the different values;
receiving a touch input as a swipe gesture to adjust the value;
adjusting display of the user interface element and display of the value in situ in response to the swipe gesture when the swipe gesture is received; and
the value is set to the adjusted value.
8. An apparatus for in-place editing of a value, comprising:
a display configured to receive touch input;
a processor and a memory;
an operating environment to be executed using the processor;
an application that includes a value that can be changed; and
a user interface manager operating in conjunction with the application, the user interface manager configured to perform actions comprising:
displaying a value on the display;
determining when a value on the display is selected;
in response to determining that the value is selected, graphically indicating selection of the value, displaying a user interface element in proximity to the value, the user interface element displaying a level of possible values for each different portion of the value, and the user interface element being configured to receive a touch input for adjusting the value to a new value determined from the different values;
receiving a touch input to adjust the value;
upon receiving the swipe gesture, adjusting the display of the user interface element to show a currently selected possible value; and
the value is set to the adjusted value.
9. The apparatus of claim 8, wherein displaying the user interface element comprises: the user interface element is displayed as alpha blending such that a portion of the display underlying the display of the user interface element remains visible.
10. The device of claim 8, wherein adjusting the display of the user interface element and the display of the value in response to the swipe gesture comprises: a distance of the current swipe position from the value is determined, and the speed of change of the value is adjusted as the distance increases.
HK13107051.6A (priority date 2011-09-22; filed 2013-06-17) — User interface for editing a value in place — HK1180058A (en)

Applications Claiming Priority (1)

Application Number: US 13/240,547 — Priority Date: 2011-09-22

Publications (1)

Publication Number: HK1180058A — Publication Date: 2013-10-11
