WO2002008881A2 - Human-computer interface - Google Patents
Human-computer interface
- Publication number
- WO2002008881A2 (PCT/GB2001/003153)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- software
- library
- gestures
- computer interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0381—Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- This invention relates to a multimedia human-computer interface, and also to a method of quickly inputting information into a computer. More specifically, it relates to a means of entering commands and data into a computer system using gestures applied to a panel. The invention also allows the operator to use speech to enter commands into the computer.
- The invention provides an interface between the operator and software running on the computer system, the software generally being concerned with image analysis or manipulation. However, any program that requires selecting and manipulating data displayed on a screen may benefit from the invention. Text selection and manipulation may also be performed.
- Modern computer systems usually have attached, as a means for inputting information and performing this manipulation, a keyboard and a mouse. These tools are fine for many tasks, and work well, but for some tasks other methods of interacting with the computer may be preferable.
- For tasks such as image manipulation, however, the keyboard can be very awkward.
- The mouse solves many of the problems associated with image manipulation, but introduces problems of its own, which only come to light when performing a great number of image manipulation tasks. For example, moving an object from point A on the screen to point B with the mouse involves a surprising number of steps when the operation is broken down.
- One method of solving this problem is to use speech, as discussed in US patent 5600765.
- In that system, the user can designate features on a display using touch and gesture and, by speaking commands and indicating on the display, can enlarge, move, etc. the item selected.
- However, speech alone for the inputting of commands can become wearisome, and in high-noise environments the speech recognition system can be confused by the speech input.
- This system uses gesture input for selecting and designating areas only, and speech only for the issuing of commands. This can be restrictive.
- The current invention solves these problems and alleviates the stress to the operator caused by the mouse operations discussed above. It aims to reduce the number of small interactions the operator must perform when manipulating images and compiling reports on them, and to provide a quicker, more intuitive interface that is easier to learn and faster to operate.
- Accordingly, the invention provides a multimedia human-computer interface, usable on a computer on which software is installed enabling an operator to directly manipulate images, comprising at least a position sensitive panel and a display, characterised in that one or more gestures applied anywhere on the panel are interpreted by gesture processing software according to the gesture made and are used to control operating modes of the image manipulation software.
- The current invention allows the operator to select areas on the screen, and to process those areas, with only minimal movement of his gaze away from the area of the screen on which he is currently working.
- The invention also provides the operator with a choice of media through which to interface with the computer. The operator is thus able to concentrate for longer periods on the main purpose of the task, rather than wasting time and effort on secondary tasks such as those mentioned above.
- Two types of gesture are used. The first is known as a mode change gesture, which is used to switch the operating mode of the software from one mode to another. For example, if the display software is currently in zoom mode, being used to control the magnification of a portion of the image, and is then switched into pan mode, this is called a mode change; a gesture used to do this is a mode change gesture.
- The second type is the mode operation gesture. When in a particular mode, any gesture used to operate on a parameter of that mode is called a mode operation gesture. A gesture that, for example, selects the zoom factor, altering the magnification of the image, is a mode operation gesture.
- A mode change gesture, in the context of this specification, can consist of placing one or more pointers, which can be the operator's fingers, in contact with the panel, and then moving some or all of the pointers. The movement can either substantially maintain contact with the panel, or it can end by lifting off from the panel, the latter case signifying that whatever action is being performed is now complete.
- This type of gesture can be thought of as having an equivalent effect to the menu items of more traditional human-computer interfaces, such as the Windows operating system from Microsoft®, but is of course chosen in a completely different manner.
- A mode operation gesture can differ from a mode change gesture in that it does not necessarily finish once the pointer or pointers are removed from the touch panel.
- For example, during a move operation, a gesture incorporating multiple pointers that is used to simulate picking up an area of the image, moving it, and putting it down elsewhere can involve the pointers losing contact with the screen during the move phase of the operation. See below for more information on the use of multiple pointers.
- Mode change gestures may be pre-set, or they may be changed by the user at will. Also, the gesture used to activate a particular function may be assigned by the user according to his preference. Different gestures can be programmed into the computer system by means of a training function, creating a library of currently programmed mode change gestures that holds samples of each gesture. Here, a user would make several examples of the gesture currently being programmed. All would vary slightly from each other, due to an inability to exactly reproduce a previous tracing on the touch panel, but the gestures would be substantially alike. These gesture inputs would then be stored as templates, and any gesture input after this training had been completed would go to a gesture comparison engine comprising a pattern matching algorithm.
- If the input gesture were substantially similar to a template, the engine would recognise it as that trained gesture, and the function currently assigned to that gesture could then be activated.
- Spatial information, i.e. the pattern of the reference gestures, is stored in the library.
- Temporal information, relating to the time taken to draw the gesture, can also be stored. This allows the software to differentiate between a mode change gesture input and a mode operation gesture (in cases where the context of the input does not define the interpretation of the gesture), so long as the latter does not share the same, or substantially similar, temporal parameters.
- The only limitation on a mode change gesture is that it must be capable of being drawn on the touch panel without removing the pointer, or all pointers if more than one is used, from the screen.
- Prior art interfaces have menu options on the screen in set places. Even though the user may be able to alter where these places are, by moving menu bars around etc., he still has to locate the place where he has put the menu item each time he wishes to use it.
- The current invention allows the user to draw mode change gestures anywhere on the touch panel. This provides the benefit that he does not need to look away from the particular area of the image he is currently working on to locate a menu option.
- A threshold can be set for the certainty of the match, in percentage terms; if two gestures are compared and have a certainty value over this threshold, they will be regarded as a match.
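- The training and matching scheme described above can be sketched in code. The following is an illustrative Python sketch, not the patent's implementation: the arc-length resampling, the 32-point count, the error-to-percentage scoring formula and the 80% threshold are all assumptions chosen for the example.

```python
import math

def resample(points, n=32):
    """Resample a gesture trace to n evenly spaced points along its path."""
    # Cumulative arc length at each recorded point.
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1]
    out = []
    for i in range(n):
        target = total * i / (n - 1)
        # Find the segment containing the target arc length and interpolate.
        j = 1
        while j < len(dists) - 1 and dists[j] < target:
            j += 1
        seg = dists[j] - dists[j - 1] or 1.0
        t = (target - dists[j - 1]) / seg
        (x0, y0), (x1, y1) = points[j - 1], points[j]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out

def match_certainty(trace, template):
    """Return a 0-100 certainty score comparing two gesture traces."""
    a, b = resample(trace), resample(template)
    err = sum(math.hypot(ax - bx, ay - by)
              for (ax, ay), (bx, by) in zip(a, b)) / len(a)
    # Map mean point error to a percentage: 100 px mean error scores 0%.
    return max(0.0, 100.0 * (1.0 - err / 100.0))

def recognise(trace, library, threshold=80.0):
    """Return the best-matching library gesture name, or None if below threshold."""
    best_name, best_score = None, 0.0
    for name, template in library.items():
        score = match_certainty(trace, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None
```

With a library such as `{"hline": [(0.0, 0.0), (100.0, 0.0)]}`, a slightly wobbly horizontal trace would score above the threshold and be recognised, while a vertical stroke would fall below it and return `None`.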
- In prior art systems, the operator was able to select areas on the screen using a touch sensitive display, but when he wanted to, say, move a selected area, he had to somehow select the operating mode that allowed items to be moved. This normally involved removing his gaze from the selected area to look for the button or control that activated the move operating mode, or speaking the command into the computer's microphone.
- The present invention enables the move function to be chosen, once the area has been selected, by the operator making a gesture on the screen that represents selecting the move function. Similarly, if the operator wants to copy the selected area, he makes a gesture that activates the copy function of the software.
- The number of functions that can be so represented is limited only by the number of unique, easily drawn gestures that the operator can devise. Functions that may advantageously be controlled by the present invention include, but are not limited to:
- Image annotation functions, such as adding labels to areas, or adding shapes such as rectangles or circles.
- Labels can be placed onto areas and defined as markers that act like a bookmark in a book, such that the areas can be accessed by reference to the "bookmark" label.
- The starting point used when drawing a gesture can be used to make that gesture unique. For example, if a gesture consists of drawing a horizontal line on the panel, the computer system can detect whether that line is drawn from left to right or from right to left, and depending on the direction detected, one of two separate functions can be chosen. This can be achieved by examining the temporal characteristics of the gesture: the start point will clearly have an earlier timestamp than the end point. It can also be achieved merely by looking at the co-ordinates of the points in the array data structure used to store the gesture information: the first co-ordinate in the array is the starting point, and the last is the end point.
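- The co-ordinate-order method can be shown in a few lines. This is a hypothetical sketch: the function name and the two bound actions are invented for illustration.

```python
def line_direction(points):
    """Classify a roughly horizontal stroke by its stored point order.

    The first array entry is the start of the gesture and the last entry
    is the end, so no timestamps are needed to recover the direction.
    """
    (x_start, _), (x_end, _) = points[0], points[-1]
    return "left_to_right" if x_end > x_start else "right_to_left"

# Two separate functions can then be bound to the same line shape
# (the action names here are hypothetical):
ACTIONS = {
    "left_to_right": "next_image",
    "right_to_left": "previous_image",
}
```

For example, a stroke recorded as `[(10, 50), (200, 52)]` resolves to the "left_to_right" action.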
- Gestures can be made with more than one pointer at the same time. For example, assuming an area of the image has been selected and the move command chosen, two pointers can be placed on opposite sides of the area, which is then moved by moving the two pointers, in a manner similar to picking up a normal object between one's fingers and moving it. This method can also be used for copying, rotating, and other actions that may beneficially employ multiple pointers.
- The invention may be made even more flexible and easy to use if a speech recognition system is integrated with the computer system. Using this, commands may be input by speech as well as by gesture. Another benefit of a speech recognition system is that it may be used for the generation of report information spoken by the operator: whilst analysing the image, the operator can speak his findings into a microphone coupled to the speech recognition system, and his verbal report can be transcribed into a computer file.
- Speech recognition brings extra functionality to the system. Some functions have more than one adjustable parameter, and speech can be used to select the parameter. For example, if it is required to filter an area to highlight the edges of shapes on the image, the area could first be selected using a mode operation gesture (assuming the system is in "Select" mode), and the "Filter" option then chosen using a mode change gesture while simultaneously saying the word "Edge" into the microphone. In this way, the computer system knows that the edge detection filter is required rather than any other filter that may be applied.
- The operator can say the words "lat" or "latitude", followed by a reference number, and do the same for longitude; the speech recognition software converts this data into digital form, which is then sent to the image manipulation software, which displays the correct part of the image.
- The user, after selecting pan mode, could say the word "bookmark" followed by the name of a previously stored bookmark, as described above, and the image would then pan to the area assigned to that bookmark.
- The ViaVoice system from IBM comprises a speech recognition program that takes its input from the sound interface card found in most modern computer systems. Dragon Systems also produce speech recognition systems that can be used to implement the speech interface of the current invention.
- The speech recognition system is also used to generate written reports on the image displayed by the computer.
- Having selected an area with a mode operation gesture as described above, the user draws a mode change gesture pre-programmed to invoke the report generator.
- The report generator will then know that the report is to cover just the selected area; it will note the co-ordinates of the area and store these alongside the report itself.
- Once the report generator has been invoked, it will record all speech input in the report until a gesture is given indicating that the report is complete. If an area had not been selected before the report generator was invoked, the software would know that the report applies to the whole image.
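- The report generator behaviour described above could be structured as follows. This is a speculative sketch of one possible design, not the patent's code; the class and method names are invented for illustration.

```python
class ReportGenerator:
    """Collect transcribed speech into a report tied to either a selected
    image area or the whole image (an assumed structure, for illustration).
    """

    def __init__(self):
        self.active = False
        self.area = None    # (x, y, w, h) of the selected area, or None
        self.lines = []

    def invoke(self, selected_area=None):
        """Called when the pre-programmed mode change gesture is drawn."""
        self.active = True
        self.area = selected_area  # None means the report covers the whole image
        self.lines = []

    def on_speech(self, text):
        """Called with each transcribed utterance while the report is open."""
        if self.active:
            self.lines.append(text)

    def complete(self):
        """Called when the 'report complete' gesture is recognised."""
        self.active = False
        return {"area": self.area, "text": " ".join(self.lines)}
```

The area co-ordinates are stored alongside the transcribed text, matching the behaviour described above; passing no area marks the report as applying to the whole image.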
- Alternatively, the mode change operation can be performed by touching menu options shown on the display.
- Menu options such as move, copy, zoom etc. are shown as buttons on the display.
- When a function is selected, the buttons are removed from the display and replaced by buttons representing any sub-options for that function, plus an option to go up the hierarchy of the menu structure.
- If a sub-option is selected, these menu options are in turn replaced by any further options lower down in the structure. In this manner, superfluous options are not left on the display taking up space.
- The invention also provides a computer program product for use with a computer system having a display, a position sensitive panel, and a sound digitiser, the computer program product comprising: a gesture recogniser and processor to detect any gestures made on the position sensitive panel and match them against pre-stored gestures held in a gesture library; a command handler that takes in commands from the gesture recogniser; and image manipulation software that displays an image on the display and has manipulation functionality; whereby the gesture processor, on detecting a gesture that matches one in the library, sends a command associated with the library gesture to the command handler, which in turn sends it to the image manipulation software.
- Figure 1 shows, in block diagrammatic form, an example of a computer system upon which the current invention can be implemented.
- Figure 2 shows three examples of mode change gestures that may be used to control program functionality.
- Figure 3 shows one of the gestures of Figure 2, annotated with timestamp information.
- Figure 4 shows both a single, and multiple fingers being used to move an object.
- Figure 5 shows multiple fingers being used to rotate an object.
- Figure 6 shows the hierarchical menu structure as incorporated in the current embodiment of the invention.
- Figure 7 shows a high level block diagram indicating the connectivity of the software modules used to implement the invention.
- Figure 8 shows, in more detail, the software components that make up the Command Handler shown in Figure 7.
- Figure 9 shows a user drawing on the touch panel, using his finger as the pointer.
- Figure 1 shows the hardware items upon which the invention is currently implemented.
- A display device (iiyama ProLite) 1 is connected to a processing unit 3, which additionally contains a sound processing card 4, a microphone 5, and software implementing speech recognition (IBM ViaVoice 98) on the words spoken into the microphone 5.
- The display device 1 is overlaid with a thin touch sensitive panel, such that when it is touched by a pointer 2 of some type, it transmits the position on the display device 1 that was contacted back to the gesture recogniser software. This is in communication with the software controlling the displayed image, and hence the display software knows what portion of the displayed image was nearest to the point being touched.
- Figure 2 provides some examples of gesture traces that may be used as mode change gestures. They can all be drawn without removing the pointer from the touch panel. Note that in practice there would be no need for the computer to actually display the trace of the gesture on the screen, except during the gesture training phase. As drawn in the figure, there is no information present that identifies the start and end points of the gestures. However, the computer stores the set of x-y reference points that make up the gesture in an array structure, a functionally contiguous set of memory locations: the first location holds the first point recorded, the next holds the second, and so on. By this means, it is possible to see where the gesture started and finished.
- Figure 3a shows a gesture similar to that in Figure 2b, annotated with simulated timestamp information.
- The timestamps shown in Figure 3a mark the point on the gesture reached at each indicated time interval.
- The spatial distance between timestamps is shorter in the curved area of the gesture, indicating that the gesture was drawn more slowly there.
- This information can be used to distinguish between this gesture and a similar looking one drawn in a different manner. For example, a gesture similar in shape to that of Figure 3a, but where all points are half the distance apart, will be regarded as a different gesture, as it would take twice as long to draw.
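- A minimal way to apply this temporal test, assuming each stored gesture keeps a timestamp per recorded point, might be the following sketch. The 25% tolerance is an illustrative assumption, not a value from the patent.

```python
def same_tempo(stamps_a, stamps_b, tolerance=0.25):
    """Compare the drawing tempo of two gestures from their timestamps.

    Two traces with the same shape but sufficiently different total draw
    times are treated as different gestures.  The relative tolerance of
    25% is an invented illustrative value.
    """
    dur_a = stamps_a[-1] - stamps_a[0]
    dur_b = stamps_b[-1] - stamps_b[0]
    longer = max(dur_a, dur_b)
    return abs(dur_a - dur_b) / longer <= tolerance
```

A gesture drawn in 1.0 s matches a template drawn in 1.1 s, but not one drawn in 2.0 s, so the half-spacing example above would be rejected as a different gesture.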
- Figure 4 shows a user operating the system using both single and multiple pointer mode operation gestures.
- Figure 4a shows a user moving an area 6 on the display 1 using a single pointer gesture. Once the area 6 is defined, and the system is in "move" mode, the user places the pointer 2 - his finger in this case - on the area 6, and moves it to where he wants the image in that area to be. The defined area 6 follows the position of his finger. The position of the area 6 after the move is shown as 6'. The movement is indicated by the arrow 7.
- Figure 4b shows the same operation taking place using multiple pointer mode operation gestures. These are beneficial in some instances.
- Figure 4b shows someone using their thumb 2a and forefinger 2b to "grasp" onto a selected area 6. They have put their fingers at opposite sides of the selected area, and the larger pair of arrows 8 indicate the next motion is to move them in towards each other in a pinching action, to simulate the grasping of an object.
- This pinching action is detected by the gesture recogniser as a multi-pointer gesture, and it knows, when the pointers 2a, 2b are lifted from the touch panel 1, that the action is not yet complete.
- The gesture recogniser waits for the two pointers 2a, 2b to be placed back down on the touch panel 1. When it detects this, it knows that the position touched is the destination of the move or copy action.
- The thick black arrow 9 shows the movement of the area.
- The new position of the area 6 is indicated as 6'.
- The small pair of arrows 10 indicates the movement of the fingers away from the area 6', signalling the completion of the move.
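- The multi-pointer move described above, in which lifting the pointers does not end the operation, can be modelled as a small state machine. This is an illustrative sketch; the state names and method interface are assumptions.

```python
class PinchMove:
    """State machine for the multi-pointer 'pick up and move' gesture:
    a pinch grasps the selected area, lifting off does NOT complete the
    operation, and the next touch-down gives the destination."""

    IDLE, GRASPED, IN_FLIGHT = "idle", "grasped", "in_flight"

    def __init__(self, area_centre):
        self.state = self.IDLE
        self.area_centre = area_centre

    def on_pinch(self, p1, p2):
        # Two pointers closing in on opposite sides of the selected area.
        self.state = self.GRASPED

    def on_lift(self):
        # Unlike a mode change gesture, lifting off leaves the move pending.
        if self.state == self.GRASPED:
            self.state = self.IN_FLIGHT

    def on_touch_down(self, p1, p2):
        # The touched position becomes the destination of the move.
        if self.state == self.IN_FLIGHT:
            self.area_centre = ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)
            self.state = self.IDLE
        return self.area_centre
```

The key design point, taken from the description above, is that `on_lift` transitions to an in-flight state rather than back to idle, so the recogniser keeps waiting for the destination touch.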
- Rotation of areas of the image also lends itself to the use of multi-pointer gestures. Grasping a selected area 6 as described above, followed by twisting the points on the touch panel as indicated by the arrows 11 will result in the selected area 6 rotating with the pointers.
- Figure 5a shows this function before the rotation, and Figure 5b shows it completed, with the area 6 having been rotated by some angle.
- Figure 6 shows the hierarchical menu system, with the Main Menu box showing all top level menu selections available. Some of these options (File, Annotate, Region and Adjust) have their own subcommands, which sit a layer down in the hierarchy. When one of these is chosen, using voice, gesture or the on-screen menu options, the Main Menu on-screen options are replaced by the sub-commands of the chosen option. Note that each sub-menu has a command to go back to the menu one level up the hierarchy.
- Figure 7 is a representation of the major software modules that make up the system as currently implemented.
- The heart of the system is the Command Handler. This takes the output of the gesture recogniser module, the speech recogniser, and the menu items shown on the display, interprets this information, and issues commands to the image display and manipulation (IDM) module.
- The IDM module then processes the displayed image according to the command received.
- The gesture recogniser is shown as part of the IDM module because certain gestures drawn on the touch sensitive panel are interpreted directly by the IDM module and are not passed to the Command Handler. These gestures perform functions that interface directly with the operating system, such as opening, closing and repositioning windows on the display: functions not directly connected with the image manipulation software. Other gestures are passed to the Command Handler.
- Gestures drawn within the image displayed by the module are sent to the Command Handler.
- Normally, the image displayed by the IDM module will be maximised to fill the entire display area, in which case all gestures are sent to the Command Handler.
- The Command Handler also communicates with the speech recogniser and the menu display modules. This is because, depending on the state the IDM module is in, some commands may not be relevant, as described above, and the menu system is redrawn on the display to reflect this. Also, depending on the IDM module state, certain speech commands may not be relevant, so the speech recognition process can be made more reliable by cutting down the vocabulary active at any one time. Each word or phrase input to the speech recogniser then needs to be compared against fewer reference words, leading to an improved recognition rate.
- Figure 8 shows in greater detail the elements that make up the Command Handler: five local handler units and a command demultiplexer. It is irrelevant to the local handler units how a particular command is input to the system.
- Each incoming command is examined to determine whether, judged from the current state of the system, it is appropriate. For example, say the last command received was File Menu: the next valid command can only be one of Up, Open or Exit, as seen from Figure 6. If any other command is received, whether by voice or gesture, the Menu local handler unit will ignore it.
- Each local handler knows what commands are valid at any one time. If a valid command is received, it will be sent to the IDM module to be processed.
- Gesture and speech recogniser outputs go via the command demultiplexer, whereas the menu inputs do not. The gesture and speech recognisers send all validly received commands to the demultiplexer, which examines each one and distributes it to the appropriate local handler unit.
- Each local handler unit is responsible for deciding which menus are to be displayed, and so only displays items that are to be handled by it.
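- The demultiplexer and local handler arrangement might be sketched as follows, using the File menu example from above (Up, Open, Exit). The class names and routing scheme are assumptions for illustration, not the patent's implementation.

```python
class LocalHandler:
    """A local handler accepts only commands valid in its current state."""

    def __init__(self, name, valid_commands):
        self.name = name
        self.valid = set(valid_commands)
        self.sent = []  # commands forwarded on to the IDM module

    def handle(self, command):
        if command in self.valid:
            self.sent.append(command)   # forward to the IDM module
            return True
        return False                    # invalid in this state: ignored

class CommandDemultiplexer:
    """Routes recognised voice/gesture commands to the right local handler.

    It is irrelevant to the handlers whether a command arrived by voice
    or by gesture; both recognisers feed the same dispatch path.
    """

    def __init__(self):
        self.routes = {}   # command name -> local handler

    def register(self, handler):
        for command in handler.valid:
            self.routes[command] = handler

    def dispatch(self, command):
        handler = self.routes.get(command)
        return handler.handle(command) if handler else False

# After the File menu has been selected, only Up, Open or Exit are valid.
menu_handler = LocalHandler("menu", {"up", "open", "exit"})
demux = CommandDemultiplexer()
demux.register(menu_handler)
```

Dispatching "open" succeeds and is forwarded, while an out-of-context command such as "zoom" is simply ignored, mirroring the behaviour described for the Menu local handler.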
- Figure 9 shows a user manipulating an image using a computer equipped to run the current invention.
- A typical session might go as follows. Once the computer hardware and software are fully set up and working, the user calls up an image that he wishes to analyse, which is shown on the screen. He may then wish to zoom in on a particular area to examine it more closely. To do this, he needs to switch the operating mode to "zoom". Without taking his eye from the area of interest, he draws the mode change gesture on any part of the screen to select the zoom function. The operator may then place his finger on the screen and, maintaining contact, slide it up or down: moving down decreases magnification, and moving up increases it. The operator may also speak the zoom factor required, if he knows the exact ratio needed. Thus, whilst in zoom mode, he may say "200%", and the system will expand the current visible image to 200% of its original size.
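- The two zoom interactions in this session, sliding a finger vertically or speaking a percentage, could map to a zoom factor roughly as follows. The pixels-per-doubling rate is an invented constant; the patent does not specify one.

```python
def zoom_from_slide(dy_pixels, current=1.0, pixels_per_doubling=200.0):
    """Map a vertical finger slide to a magnification change.

    Sliding up (negative dy in screen coordinates) increases magnification,
    sliding down decreases it.  The 200-pixels-per-doubling rate is an
    illustrative assumption.
    """
    return current * 2.0 ** (-dy_pixels / pixels_per_doubling)

def zoom_from_speech(utterance):
    """Interpret a spoken zoom factor such as '200%'."""
    return float(utterance.rstrip("%")) / 100.0
```

Saying "200%" yields a factor of 2.0, as does sliding the finger 200 pixels upward under the assumed rate.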
- The currently implemented system is set up so that once a command has been executed, such as the zoom operation described above, the image manipulation software switches back to pan mode.
- This can be regarded as a "standby" mode. This is done because the application to which the invention is currently applied is most often used in pan mode. Thus it is more convenient for the user if this mode is entered after each command.
- Other applications of the invention may have another mode that is entered as the "standby" mode, or they may simply remain in a given mode until the user changes the mode to another.
- pan mode will be entered automatically when the zoom function is done.
- The image may then be panned by placing the finger on the screen and moving it.
- The software detects this movement and moves the image to try to keep the same image point underneath the operator's finger.
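Keeping the same image point under the finger amounts to adding the finger's displacement to the image offset on each update. The sketch below assumes a simple 2-D pixel-offset convention, which the patent does not prescribe.

```python
# Minimal sketch of the pan behaviour: the image offset is updated so the
# image point under the finger at touch-down stays under the finger as it
# moves. Coordinate conventions here are assumptions.

def pan_update(image_offset, finger_down, finger_now):
    """Return a new image offset that keeps the touched point under the finger."""
    dx = finger_now[0] - finger_down[0]
    dy = finger_now[1] - finger_down[1]
    return (image_offset[0] + dx, image_offset[1] + dy)

offset = pan_update((0, 0), finger_down=(100, 100), finger_now=(130, 80))
print(offset)  # (30, -20): the image follows the finger
```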
- To switch out of pan mode, the user touches one of the command options shown on the display. If the menu of command options is not currently displayed, speaking the word "menu" will show them.
- An alternative method of switching out of pan mode is to use a mode change gesture.
- Typically, the mode change gesture is drawn quickly and decisively, and the operator's finger is lifted from the screen as soon as the gesture is complete.
- A stroke that coincides with the shape of a mode change gesture will still be interpreted as a pan action if the operator draws it in a different manner to the mode change gesture, for example much more slowly, or pausing halfway through.
- Analysis of the gesture's temporal parameters provides the information needed to make this decision.
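The temporal analysis described above — a shape match alone is not enough; the stroke must also be drawn quickly and without pauses to count as a mode change — can be sketched as a simple classifier over timestamped touch samples. The thresholds are illustrative assumptions, not values taken from the patent.

```python
# Hedged sketch of using a stroke's temporal parameters to decide whether a
# shape that matches a mode-change gesture was really a mode change or just
# a pan. Thresholds below are illustrative assumptions.

def classify_stroke(samples, max_duration=0.6, max_pause=0.15):
    """Classify a stroke whose shape already matches a mode-change gesture.

    samples: list of (t_seconds, x, y) touch points for one stroke.
    A mode-change gesture is drawn quickly and without long pauses;
    otherwise the stroke is treated as a pan, even though the shapes match.
    """
    if len(samples) < 2:
        return "pan"
    duration = samples[-1][0] - samples[0][0]
    longest_pause = max(b[0] - a[0] for a, b in zip(samples, samples[1:]))
    if duration <= max_duration and longest_pause <= max_pause:
        return "mode_change"
    return "pan"

quick = [(0.00, 0, 0), (0.10, 10, 5), (0.20, 20, 0)]   # fast, decisive stroke
slow  = [(0.00, 0, 0), (0.50, 10, 5), (1.20, 20, 0)]   # slow, with a pause
print(classify_stroke(quick))  # mode_change
print(classify_stroke(slow))   # pan
```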
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU2001270808A AU2001270808A1 (en) | 2000-07-21 | 2001-07-19 | Human-computer interface |
| AU7080801A AU7080801A (en) | 2000-07-21 | 2001-07-24 | Human-computer interface |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB0017793A GB0017793D0 (en) | 2000-07-21 | 2000-07-21 | Human computer interface |
| GB0017793.1 | 2000-07-21 | | |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2002008881A2 true WO2002008881A2 (fr) | 2002-01-31 |
| WO2002008881A3 WO2002008881A3 (fr) | 2002-08-22 |
Family
ID=9896004
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/GB2001/003153 Ceased WO2002008881A2 (fr) | 2000-07-21 | 2001-07-19 | Interface homme-ordinateur |
Country Status (3)
| Country | Link |
|---|---|
| AU (2) | AU2001270808A1 (fr) |
| GB (1) | GB0017793D0 (fr) |
| WO (1) | WO2002008881A2 (fr) |
Cited By (82)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE10244900A1 (de) * | 2002-09-26 | 2004-04-15 | Siemens Ag | Verfahren zur automatischen Erkennung von Benutzerbefehlen auf einer als Touchscreen ausgebildeten Benutzeroberfläche |
| WO2004107156A3 (fr) * | 2003-05-27 | 2005-04-28 | Koninkl Philips Electronics Nv | Commande a fonctions de commande multiples pour systeme d'imagerie diagnostique |
| WO2006023259A2 (fr) | 2004-08-16 | 2006-03-02 | Maw Wai-Lin | Dispositif d'entree clavier virtuel |
| WO2006066742A1 (fr) * | 2004-12-21 | 2006-06-29 | Daimlerchrysler Ag | Systeme de commande pour un vehicule |
| DE102006009291A1 (de) * | 2006-03-01 | 2007-09-06 | Audi Ag | Verfahren und Vorrichtung zum Betreiben von zumindest zwei Funktionskomponenten eines Systems, insbesondere eines Fahrzeugs |
| WO2007133483A1 (fr) | 2006-05-12 | 2007-11-22 | Microsoft Corporation | Utilisations multi-contact, gestes et implémentation |
| EP1517228A3 (fr) * | 2003-09-16 | 2008-04-02 | Smart Technologies, Inc. | Procédé de reconnaissance de gestes et système tactile comportant ce procédé |
| WO2007135536A3 (fr) * | 2006-05-23 | 2008-08-21 | Nokia Corp | appareil Électronique portable amÉliorÉ et procÉdÉ associÉ |
| EP1983416A1 (fr) | 2007-04-20 | 2008-10-22 | LG Electronics Inc. | Modification de données à l'aide d'un terminal de communication mobile |
| GB2451646A (en) * | 2007-08-07 | 2009-02-11 | Johnson Electric Sa | Touchless control system |
| WO2009111469A2 (fr) | 2008-03-04 | 2009-09-11 | Apple Inc. | Interface de programmation de modèle d'événement tactile |
| WO2009088808A3 (fr) * | 2007-12-31 | 2009-11-26 | Motorola, Inc. | Dispositif portatif, et procédé de fonctionnement d'une interface utilisateur tactile à pointeur unique |
| WO2009104062A3 (fr) * | 2008-02-18 | 2009-11-26 | Sony Ericsson Mobile Communications Ab | Sélection d'une disposition |
| WO2009069049A3 (fr) * | 2007-11-28 | 2009-11-26 | Koninklijke Philips Electronics N.V. | Dispositif et procédé de détection |
| WO2008030976A3 (fr) * | 2006-09-06 | 2009-11-26 | Apple Inc. | Dispositif à écran tactile, procédé et interface utilisateur graphique pour déterminer des instructions en appliquant des heuristiques |
| DE102008027954A1 (de) * | 2008-04-25 | 2009-11-26 | BenQ Corp., Neihu | Dialogelektronikgerät und dessen Dialogsmethode |
| WO2010027803A1 (fr) * | 2008-08-27 | 2010-03-11 | Apple Inc. | Détection de gestes omnidirectionnels |
| WO2010029506A1 (fr) * | 2008-09-12 | 2010-03-18 | Koninklijke Philips Electronics N.V. | Navigation dans une interface graphique d’utilisateur sur des appareils portatifs |
| EP2166436A1 (fr) * | 2008-09-12 | 2010-03-24 | Samsung Electronics Co., Ltd. | Système de saisie basé sur un capteur de proximité et son procédé de fonctionnement |
| WO2010049877A1 (fr) * | 2008-10-27 | 2010-05-06 | Nokia Corporation | Procédés et appareils pour faciliter une interaction avec des appareils à écran tactile |
| EP2203806A2 (fr) * | 2007-09-04 | 2010-07-07 | Apple Inc. | Interface utilisateur pour menu d'application |
| WO2010142543A1 (fr) * | 2009-06-12 | 2010-12-16 | Volkswagen Ag | Procédé pour commander une interface utilisateur graphique et un dispositif de commande pour une interface utilisateur graphique |
| WO2011025642A1 (fr) * | 2009-08-31 | 2011-03-03 | Qualcomm Incorporated | Procédés d'interface utilisateur fournissant une fonctionnalité de recherche |
| WO2011028944A1 (fr) | 2009-09-02 | 2011-03-10 | Amazon Technologies, Inc. | Interface d'utilisateur à écran tactile |
| EP2306288A1 (fr) | 2009-09-25 | 2011-04-06 | Research In Motion Limited | Dispositif électronique incluant un dispositif d'entrée sensible au toucher et procédé de contrôle correspondant |
| EP1645945A3 (fr) * | 2004-10-05 | 2011-05-25 | Sony Corporation | Appareil de traitement d'informations et programmes utilisés dans un appareil de traitement d'informations |
| JP2011227828A (ja) * | 2010-04-22 | 2011-11-10 | Canon Inc | 情報処理装置、その処理方法及びプログラム |
| US8265688B2 (en) | 2007-12-31 | 2012-09-11 | Motorola Mobility Llc | Wireless communication device and split touch sensitive user input surface |
| EP1770489A3 (fr) * | 2005-09-26 | 2012-09-19 | Samsung Electronics Co., Ltd. | Méthode de contrôle de données utilsant des fonctions d'une souris dans un terminal sans-fil |
| US8285499B2 (en) | 2009-03-16 | 2012-10-09 | Apple Inc. | Event recognition |
| US8370764B2 (en) | 2004-08-30 | 2013-02-05 | Microsoft Corporation | Scrolling web pages using direct interaction |
| EP2573662A1 (fr) * | 2011-09-23 | 2013-03-27 | Samsung Electronics Co., Ltd. | Appareil et procédé de contrôle de taille d'écran dans un terminal portable |
| US8411061B2 (en) | 2008-03-04 | 2013-04-02 | Apple Inc. | Touch event processing for documents |
| EP2214090A3 (fr) * | 2009-01-28 | 2013-06-05 | Sony Corporation | Appareil de traitement d'informations, procédé d'animation et programme |
| US8487938B2 (en) | 2009-01-30 | 2013-07-16 | Microsoft Corporation | Standard Gestures |
| US8493364B2 (en) | 2009-04-30 | 2013-07-23 | Motorola Mobility Llc | Dual sided transparent display module and portable electronic device incorporating the same |
| US8552999B2 (en) | 2010-06-14 | 2013-10-08 | Apple Inc. | Control selection approximation |
| US8566717B2 (en) | 2008-06-24 | 2013-10-22 | Microsoft Corporation | Rendering teaching animations on a user-interface display |
| US8566044B2 (en) | 2009-03-16 | 2013-10-22 | Apple Inc. | Event recognition |
| US8566045B2 (en) | 2009-03-16 | 2013-10-22 | Apple Inc. | Event recognition |
| EP2674845A1 (fr) * | 2012-06-14 | 2013-12-18 | ICT Automatisering N.V. | Interaction de l'utilisateur par l'intermédiaire d'un écran tactile |
| US8645827B2 (en) | 2008-03-04 | 2014-02-04 | Apple Inc. | Touch event model |
| CN103959211A (zh) * | 2012-05-21 | 2014-07-30 | 宇龙计算机通信科技(深圳)有限公司 | 终端和应用功能界面的切换方法 |
| US9037995B2 (en) | 2007-01-07 | 2015-05-19 | Apple Inc. | Application programming interfaces for scrolling operations |
| CN104756184A (zh) * | 2012-08-30 | 2015-07-01 | 谷歌公司 | 选择用于自动话音识别的语言的技术 |
| EP2513760A4 (fr) * | 2009-12-18 | 2016-01-06 | Synaptics Inc | Procédé et appareil pour changement de modes de fonctionnement |
| US9311112B2 (en) | 2009-03-16 | 2016-04-12 | Apple Inc. | Event recognition |
| US9372620B2 (en) | 2007-01-07 | 2016-06-21 | Apple Inc. | Portable multifunction device, method, and graphical user interface for interacting with user input elements in displayed content |
| US9384672B1 (en) | 2006-03-29 | 2016-07-05 | Amazon Technologies, Inc. | Handheld electronic book reader device having asymmetrical shape |
| US9448711B2 (en) | 2005-05-23 | 2016-09-20 | Nokia Technologies Oy | Mobile communication terminal and associated methods |
| US9465532B2 (en) | 2009-12-18 | 2016-10-11 | Synaptics Incorporated | Method and apparatus for operating in pointing and enhanced gesturing modes |
| US9524094B2 (en) | 2009-02-20 | 2016-12-20 | Nokia Technologies Oy | Method and apparatus for causing display of a cursor |
| US9529519B2 (en) | 2007-01-07 | 2016-12-27 | Apple Inc. | Application programming interfaces for gesture operations |
| US9619143B2 (en) | 2008-01-06 | 2017-04-11 | Apple Inc. | Device, method, and graphical user interface for viewing application launch icons |
| US9619132B2 (en) | 2007-01-07 | 2017-04-11 | Apple Inc. | Device, method and graphical user interface for zooming in on a touch-screen display |
| US9626073B2 (en) | 2002-03-19 | 2017-04-18 | Facebook, Inc. | Display navigation |
| US9684521B2 (en) | 2010-01-26 | 2017-06-20 | Apple Inc. | Systems having discrete and continuous gesture recognizers |
| RU2623198C2 (ru) * | 2011-08-02 | 2017-06-27 | МАЙКРОСОФТ ТЕКНОЛОДЖИ ЛАЙСЕНСИНГ, ЭлЭлСи | Жест скольжения по диагонали для выбора и перестановки |
| US9733812B2 (en) | 2010-01-06 | 2017-08-15 | Apple Inc. | Device, method, and graphical user interface with content display modes and display rotation heuristics |
| US9733716B2 (en) | 2013-06-09 | 2017-08-15 | Apple Inc. | Proxy gesture recognizer |
| US9772751B2 (en) | 2007-06-29 | 2017-09-26 | Apple Inc. | Using gestures to slide between user interfaces |
| US9798459B2 (en) | 2008-03-04 | 2017-10-24 | Apple Inc. | Touch event model for web pages |
| CN103809908B (zh) * | 2008-03-04 | 2018-02-09 | 苹果公司 | 触摸事件模型编程接口 |
| US9933937B2 (en) | 2007-06-20 | 2018-04-03 | Apple Inc. | Portable multifunction device, method, and graphical user interface for playing online videos |
| US9933913B2 (en) | 2005-12-30 | 2018-04-03 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
| US9977518B2 (en) | 2001-10-22 | 2018-05-22 | Apple Inc. | Scrolling based on rotational movement |
| US10139870B2 (en) | 2006-07-06 | 2018-11-27 | Apple Inc. | Capacitance sensing electrode with integrated I/O mechanism |
| US10180732B2 (en) | 2006-10-11 | 2019-01-15 | Apple Inc. | Gimballed scroll wheel |
| US10254949B2 (en) | 2007-01-07 | 2019-04-09 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
| US10272294B2 (en) | 2016-06-11 | 2019-04-30 | Apple Inc. | Activity and workout updates |
| US10313505B2 (en) | 2006-09-06 | 2019-06-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
| US10353565B2 (en) | 2002-02-25 | 2019-07-16 | Apple Inc. | Input apparatus and button arrangement for handheld device |
| US10613732B2 (en) | 2015-06-07 | 2020-04-07 | Apple Inc. | Selecting content items in a user interface display |
| US10620780B2 (en) | 2007-09-04 | 2020-04-14 | Apple Inc. | Editing interface |
| US10866718B2 (en) | 2007-09-04 | 2020-12-15 | Apple Inc. | Scrolling techniques for user interfaces |
| US10963142B2 (en) | 2007-01-07 | 2021-03-30 | Apple Inc. | Application programming interfaces for scrolling |
| US11153687B1 (en) | 2018-08-24 | 2021-10-19 | Apple Inc. | Wireless headphone interactions |
| US11169690B2 (en) | 2006-09-06 | 2021-11-09 | Apple Inc. | Portable electronic device for instant messaging |
| US11194467B2 (en) | 2019-06-01 | 2021-12-07 | Apple Inc. | Keyboard management user interfaces |
| US11216119B2 (en) | 2016-06-12 | 2022-01-04 | Apple Inc. | Displaying a predetermined view of an application |
| US11467722B2 (en) | 2007-01-07 | 2022-10-11 | Apple Inc. | Portable electronic device, method, and graphical user interface for displaying electronic documents and lists |
| DE102007026282B4 (de) | 2007-06-05 | 2023-05-11 | Volkswagen Ag | Verfahren zur Steuerung einer Vorrichtung und Steuervorrichtung |
Families Citing this family (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8059099B2 (en) | 2006-06-02 | 2011-11-15 | Apple Inc. | Techniques for interactive input to portable electronic devices |
| US20070152983A1 (en) | 2005-12-30 | 2007-07-05 | Apple Computer, Inc. | Touch pad with symbols based on mode |
| US7748634B1 (en) | 2006-03-29 | 2010-07-06 | Amazon Technologies, Inc. | Handheld electronic book reader device having dual displays |
| US9360967B2 (en) | 2006-07-06 | 2016-06-07 | Apple Inc. | Mutual capacitance touch sensing device |
| US8743060B2 (en) | 2006-07-06 | 2014-06-03 | Apple Inc. | Mutual capacitance touch sensing device |
| US8416198B2 (en) | 2007-12-03 | 2013-04-09 | Apple Inc. | Multi-dimensional scroll wheel |
| US9454256B2 (en) | 2008-03-14 | 2016-09-27 | Apple Inc. | Sensor configurations of an input device that are switchable based on mode |
| US9354751B2 (en) | 2009-05-15 | 2016-05-31 | Apple Inc. | Input device with optimized capacitive sensing |
| US8872771B2 (en) | 2009-07-07 | 2014-10-28 | Apple Inc. | Touch sensing device having conductive nodes |
| US8451238B2 (en) | 2009-09-02 | 2013-05-28 | Amazon Technologies, Inc. | Touch-screen user interface |
| US8949735B2 (en) | 2012-11-02 | 2015-02-03 | Google Inc. | Determining scroll direction intent |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5252951A (en) * | 1989-04-28 | 1993-10-12 | International Business Machines Corporation | Graphical user interface with gesture recognition in a multiapplication environment |
| US5677710A (en) * | 1993-05-10 | 1997-10-14 | Apple Computer, Inc. | Recognition keypad |
| US5864635A (en) * | 1996-06-14 | 1999-01-26 | International Business Machines Corporation | Distinguishing gestures from handwriting in a pen based computer by stroke analysis |
- 2000
  - 2000-07-21 GB GB0017793A patent/GB0017793D0/en not_active Ceased
- 2001
  - 2001-07-19 AU AU2001270808A patent/AU2001270808A1/en not_active Abandoned
  - 2001-07-19 WO PCT/GB2001/003153 patent/WO2002008881A2/fr not_active Ceased
  - 2001-07-24 AU AU7080801A patent/AU7080801A/xx active Pending
Cited By (213)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9977518B2 (en) | 2001-10-22 | 2018-05-22 | Apple Inc. | Scrolling based on rotational movement |
| US10353565B2 (en) | 2002-02-25 | 2019-07-16 | Apple Inc. | Input apparatus and button arrangement for handheld device |
| US10055090B2 (en) | 2002-03-19 | 2018-08-21 | Facebook, Inc. | Constraining display motion in display navigation |
| US9886163B2 (en) | 2002-03-19 | 2018-02-06 | Facebook, Inc. | Constrained display navigation |
| US9626073B2 (en) | 2002-03-19 | 2017-04-18 | Facebook, Inc. | Display navigation |
| US9678621B2 (en) | 2002-03-19 | 2017-06-13 | Facebook, Inc. | Constraining display motion in display navigation |
| US9753606B2 (en) | 2002-03-19 | 2017-09-05 | Facebook, Inc. | Animated display navigation |
| US10365785B2 (en) | 2002-03-19 | 2019-07-30 | Facebook, Inc. | Constraining display motion in display navigation |
| US9851864B2 (en) | 2002-03-19 | 2017-12-26 | Facebook, Inc. | Constraining display in display navigation |
| DE10244900A1 (de) * | 2002-09-26 | 2004-04-15 | Siemens Ag | Verfahren zur automatischen Erkennung von Benutzerbefehlen auf einer als Touchscreen ausgebildeten Benutzeroberfläche |
| EP1408395A3 (fr) * | 2002-09-26 | 2015-11-25 | Siemens Aktiengesellschaft | Procédé pour reconnaitre automatiquement des commandes d'utilisateur sur une interface utilisateur à écran tactile |
| CN100367170C (zh) * | 2003-05-27 | 2008-02-06 | 皇家飞利浦电子股份有限公司 | 具有多种控制功能的诊断成像系统控制器 |
| WO2004107156A3 (fr) * | 2003-05-27 | 2005-04-28 | Koninkl Philips Electronics Nv | Commande a fonctions de commande multiples pour systeme d'imagerie diagnostique |
| EP1517228A3 (fr) * | 2003-09-16 | 2008-04-02 | Smart Technologies, Inc. | Procédé de reconnaissance de gestes et système tactile comportant ce procédé |
| US8325134B2 (en) | 2003-09-16 | 2012-12-04 | Smart Technologies Ulc | Gesture recognition method and touch system incorporating the same |
| EP2270630A3 (fr) * | 2003-09-16 | 2011-03-02 | SMART Technologies ULC | Procédé de reconnaissance de gestes et système tactile comportant ce procédé |
| WO2006023259A2 (fr) | 2004-08-16 | 2006-03-02 | Maw Wai-Lin | Dispositif d'entree clavier virtuel |
| EP1779373A4 (fr) * | 2004-08-16 | 2011-07-13 | Maw Wai-Lin | Dispositif d'entree clavier virtuel |
| US8797192B2 (en) | 2004-08-16 | 2014-08-05 | Wai-Lin Maw | Virtual keypad input device |
| US8370764B2 (en) | 2004-08-30 | 2013-02-05 | Microsoft Corporation | Scrolling web pages using direct interaction |
| EP1645945A3 (fr) * | 2004-10-05 | 2011-05-25 | Sony Corporation | Appareil de traitement d'informations et programmes utilisés dans un appareil de traitement d'informations |
| US9052813B2 (en) | 2004-10-05 | 2015-06-09 | Sony Corporation | Information-processing apparatus and programs used in information-processing apparatus |
| US9342232B2 (en) | 2004-10-05 | 2016-05-17 | Sony Corporation | Information-processing apparatus providing multiple display modes |
| WO2006066742A1 (fr) * | 2004-12-21 | 2006-06-29 | Daimlerchrysler Ag | Systeme de commande pour un vehicule |
| US9448711B2 (en) | 2005-05-23 | 2016-09-20 | Nokia Technologies Oy | Mobile communication terminal and associated methods |
| US9785329B2 (en) | 2005-05-23 | 2017-10-10 | Nokia Technologies Oy | Pocket computer and associated methods |
| EP1770489A3 (fr) * | 2005-09-26 | 2012-09-19 | Samsung Electronics Co., Ltd. | Méthode de contrôle de données utilsant des fonctions d'une souris dans un terminal sans-fil |
| US11449194B2 (en) | 2005-12-30 | 2022-09-20 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
| US11650713B2 (en) | 2005-12-30 | 2023-05-16 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
| US10884579B2 (en) | 2005-12-30 | 2021-01-05 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
| US12026352B2 (en) | 2005-12-30 | 2024-07-02 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
| US10915224B2 (en) | 2005-12-30 | 2021-02-09 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
| US10359907B2 (en) | 2005-12-30 | 2019-07-23 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
| US9933913B2 (en) | 2005-12-30 | 2018-04-03 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
| DE102006009291A1 (de) * | 2006-03-01 | 2007-09-06 | Audi Ag | Verfahren und Vorrichtung zum Betreiben von zumindest zwei Funktionskomponenten eines Systems, insbesondere eines Fahrzeugs |
| US9384672B1 (en) | 2006-03-29 | 2016-07-05 | Amazon Technologies, Inc. | Handheld electronic book reader device having asymmetrical shape |
| WO2007133483A1 (fr) | 2006-05-12 | 2007-11-22 | Microsoft Corporation | Utilisations multi-contact, gestes et implémentation |
| US9996176B2 (en) | 2006-05-12 | 2018-06-12 | Microsoft Technology Licensing, Llc | Multi-touch uses, gestures, and implementation |
| US9063647B2 (en) | 2006-05-12 | 2015-06-23 | Microsoft Technology Licensing, Llc | Multi-touch uses, gestures, and implementation |
| US9811186B2 (en) | 2006-05-12 | 2017-11-07 | Microsoft Technology Licensing, Llc | Multi-touch uses, gestures, and implementation |
| EP2027525A4 (fr) * | 2006-05-12 | 2012-01-04 | Microsoft Corp | Utilisations multi-contact, gestes et implémentation |
| WO2007135536A3 (fr) * | 2006-05-23 | 2008-08-21 | Nokia Corp | appareil Électronique portable amÉliorÉ et procÉdÉ associÉ |
| US10890953B2 (en) | 2006-07-06 | 2021-01-12 | Apple Inc. | Capacitance sensing electrode with integrated I/O mechanism |
| US10359813B2 (en) | 2006-07-06 | 2019-07-23 | Apple Inc. | Capacitance sensing electrode with integrated I/O mechanism |
| US10139870B2 (en) | 2006-07-06 | 2018-11-27 | Apple Inc. | Capacitance sensing electrode with integrated I/O mechanism |
| WO2008030976A3 (fr) * | 2006-09-06 | 2009-11-26 | Apple Inc. | Dispositif à écran tactile, procédé et interface utilisateur graphique pour déterminer des instructions en appliquant des heuristiques |
| CN101861562B (zh) * | 2006-09-06 | 2016-05-25 | 苹果公司 | 通过应用启发法来确定命令的触摸屏设备、方法和图形用户界面 |
| US12028473B2 (en) | 2006-09-06 | 2024-07-02 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
| KR101462363B1 (ko) * | 2006-09-06 | 2014-11-17 | 애플 인크. | 휴리스틱스를 적용하여 명령을 판단하기 위한 터치 스크린 장치, 방법 및 그래픽 사용자 인터페이스 |
| US11240362B2 (en) | 2006-09-06 | 2022-02-01 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
| US10313505B2 (en) | 2006-09-06 | 2019-06-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
| US11169690B2 (en) | 2006-09-06 | 2021-11-09 | Apple Inc. | Portable electronic device for instant messaging |
| US12236080B2 (en) | 2006-09-06 | 2025-02-25 | Apple Inc. | Device, method, and medium for sharing images |
| US9952759B2 (en) | 2006-09-06 | 2018-04-24 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
| US9335924B2 (en) | 2006-09-06 | 2016-05-10 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
| US11762547B2 (en) | 2006-09-06 | 2023-09-19 | Apple Inc. | Portable electronic device for instant messaging |
| US11736602B2 (en) | 2006-09-06 | 2023-08-22 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
| US10778828B2 (en) | 2006-09-06 | 2020-09-15 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
| KR101476019B1 (ko) * | 2006-09-06 | 2014-12-23 | 애플 인크. | 휴리스틱스를 적용하여 명령을 판단하기 위한 터치 스크린 장치, 방법 및 그래픽 사용자 인터페이스 |
| US10180732B2 (en) | 2006-10-11 | 2019-01-15 | Apple Inc. | Gimballed scroll wheel |
| US9665265B2 (en) | 2007-01-07 | 2017-05-30 | Apple Inc. | Application programming interfaces for gesture operations |
| US9639260B2 (en) | 2007-01-07 | 2017-05-02 | Apple Inc. | Application programming interfaces for gesture operations |
| US10175876B2 (en) | 2007-01-07 | 2019-01-08 | Apple Inc. | Application programming interfaces for gesture operations |
| US12474828B2 (en) | 2007-01-07 | 2025-11-18 | Apple Inc. | Portable electronic device, method, and graphical user interface for displaying a vertical bar with scrollable content |
| US10228824B2 (en) | 2007-01-07 | 2019-03-12 | Apple Inc. | Portable multifunction device, method, and graphical user interface for interacting with user input elements in displayed content |
| US10254949B2 (en) | 2007-01-07 | 2019-04-09 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
| US12175069B2 (en) | 2007-01-07 | 2024-12-24 | Apple Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
| US9760272B2 (en) | 2007-01-07 | 2017-09-12 | Apple Inc. | Application programming interfaces for scrolling operations |
| US10409461B2 (en) | 2007-01-07 | 2019-09-10 | Apple Inc. | Portable multifunction device, method, and graphical user interface for interacting with user input elements in displayed content |
| US10481785B2 (en) | 2007-01-07 | 2019-11-19 | Apple Inc. | Application programming interfaces for scrolling operations |
| US11972103B2 (en) | 2007-01-07 | 2024-04-30 | Apple Inc. | Portable electronic device, method, and graphical user interface for displaying electronic documents and lists |
| US11954322B2 (en) | 2007-01-07 | 2024-04-09 | Apple Inc. | Application programming interface for gesture operations |
| US10606470B2 (en) | 2007-01-07 | 2020-03-31 | Apple, Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
| US11886698B2 (en) | 2007-01-07 | 2024-01-30 | Apple Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
| US10613741B2 (en) | 2007-01-07 | 2020-04-07 | Apple Inc. | Application programming interface for gesture operations |
| US10732821B2 (en) | 2007-01-07 | 2020-08-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
| US9037995B2 (en) | 2007-01-07 | 2015-05-19 | Apple Inc. | Application programming interfaces for scrolling operations |
| US9619132B2 (en) | 2007-01-07 | 2017-04-11 | Apple Inc. | Device, method and graphical user interface for zooming in on a touch-screen display |
| US9575648B2 (en) | 2007-01-07 | 2017-02-21 | Apple Inc. | Application programming interfaces for gesture operations |
| US9529519B2 (en) | 2007-01-07 | 2016-12-27 | Apple Inc. | Application programming interfaces for gesture operations |
| US10817162B2 (en) | 2007-01-07 | 2020-10-27 | Apple Inc. | Application programming interfaces for scrolling operations |
| US10963142B2 (en) | 2007-01-07 | 2021-03-30 | Apple Inc. | Application programming interfaces for scrolling |
| US11586348B2 (en) | 2007-01-07 | 2023-02-21 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
| US10983692B2 (en) | 2007-01-07 | 2021-04-20 | Apple Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
| US11467722B2 (en) | 2007-01-07 | 2022-10-11 | Apple Inc. | Portable electronic device, method, and graphical user interface for displaying electronic documents and lists |
| US11461002B2 (en) | 2007-01-07 | 2022-10-04 | Apple Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
| US11449217B2 (en) | 2007-01-07 | 2022-09-20 | Apple Inc. | Application programming interfaces for gesture operations |
| US11269513B2 (en) | 2007-01-07 | 2022-03-08 | Apple Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
| US9448712B2 (en) | 2007-01-07 | 2016-09-20 | Apple Inc. | Application programming interfaces for scrolling operations |
| US9372620B2 (en) | 2007-01-07 | 2016-06-21 | Apple Inc. | Portable multifunction device, method, and graphical user interface for interacting with user input elements in displayed content |
| US11169691B2 (en) | 2007-01-07 | 2021-11-09 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
| EP1983416A1 (fr) | 2007-04-20 | 2008-10-22 | LG Electronics Inc. | Modification de données à l'aide d'un terminal de communication mobile |
| US8856689B2 (en) | 2007-04-20 | 2014-10-07 | Lg Electronics Inc. | Editing of data using mobile communication terminal |
| DE102007026282B4 (de) | 2007-06-05 | 2023-05-11 | Volkswagen Ag | Verfahren zur Steuerung einer Vorrichtung und Steuervorrichtung |
| US9933937B2 (en) | 2007-06-20 | 2018-04-03 | Apple Inc. | Portable multifunction device, method, and graphical user interface for playing online videos |
| US10761691B2 (en) | 2007-06-29 | 2020-09-01 | Apple Inc. | Portable multifunction device with animated user interface transitions |
| US9772751B2 (en) | 2007-06-29 | 2017-09-26 | Apple Inc. | Using gestures to slide between user interfaces |
| US11507255B2 (en) | 2007-06-29 | 2022-11-22 | Apple Inc. | Portable multifunction device with animated sliding user interface transitions |
| US12131007B2 (en) | 2007-06-29 | 2024-10-29 | Apple Inc. | Portable multifunction device with animated user interface transitions |
| GB2451646A (en) * | 2007-08-07 | 2009-02-11 | Johnson Electric Sa | Touchless control system |
| EP2203806A2 (fr) * | 2007-09-04 | 2010-07-07 | Apple Inc. | Interface utilisateur pour menu d'application |
| US11126321B2 (en) | 2007-09-04 | 2021-09-21 | Apple Inc. | Application menu user interface |
| US12474817B2 (en) | 2007-09-04 | 2025-11-18 | Apple Inc. | Editing interface |
| US11010017B2 (en) | 2007-09-04 | 2021-05-18 | Apple Inc. | Editing interface |
| US12159028B2 (en) | 2007-09-04 | 2024-12-03 | Apple Inc. | Scrolling techniques for user interfaces |
| US10620780B2 (en) | 2007-09-04 | 2020-04-14 | Apple Inc. | Editing interface |
| US11861138B2 (en) | 2007-09-04 | 2024-01-02 | Apple Inc. | Application menu user interface |
| US11604559B2 (en) | 2007-09-04 | 2023-03-14 | Apple Inc. | Editing interface |
| US10866718B2 (en) | 2007-09-04 | 2020-12-15 | Apple Inc. | Scrolling techniques for user interfaces |
| WO2009069049A3 (fr) * | 2007-11-28 | 2009-11-26 | Koninklijke Philips Electronics N.V. | Dispositif et procédé de détection |
| EP2752738A3 (fr) * | 2007-11-28 | 2017-02-15 | Koninklijke Philips N.V. | Dispositif et procédé de détection |
| US8525805B2 (en) | 2007-11-28 | 2013-09-03 | Koninklijke Philips N.V. | Sensing device and method |
| WO2009088808A3 (fr) * | 2007-12-31 | 2009-11-26 | Motorola, Inc. | Dispositif portatif, et procédé de fonctionnement d'une interface utilisateur tactile à pointeur unique |
| KR101217934B1 (ko) * | 2007-12-31 | 2013-01-02 | 모토로라 모빌리티 엘엘씨 | 단일 포인터 터치 센시티브 사용자 인터페이스를 동작시키는 방법 및 핸드헬드 디바이스 |
| RU2503989C2 (ru) * | 2007-12-31 | 2014-01-10 | Моторола Мобилити, Инк. | Портативное устройство и способ работы с сенсорным интерфейсом пользователя с одним указателем |
| US8707215B2 (en) | 2007-12-31 | 2014-04-22 | Motorola Mobility Llc | Hand-held device and method for operating a single pointer touch sensitive user interface |
| US8265688B2 (en) | 2007-12-31 | 2012-09-11 | Motorola Mobility Llc | Wireless communication device and split touch sensitive user input surface |
| US10628028B2 (en) | 2008-01-06 | 2020-04-21 | Apple Inc. | Replacing display of icons in response to a gesture |
| US9619143B2 (en) | 2008-01-06 | 2017-04-11 | Apple Inc. | Device, method, and graphical user interface for viewing application launch icons |
| WO2009104062A3 (fr) * | 2008-02-18 | 2009-11-26 | Sony Ericsson Mobile Communications Ab | Sélection d'une disposition |
| WO2009111469A3 (fr) * | 2008-03-04 | 2010-01-14 | Apple Inc. | Interface de programmation de modèle d'événement tactile |
| US10521109B2 (en) | 2008-03-04 | 2019-12-31 | Apple Inc. | Touch event model |
| US8645827B2 (en) | 2008-03-04 | 2014-02-04 | Apple Inc. | Touch event model |
| EP2405344A3 (fr) * | 2008-03-04 | 2012-02-15 | Apple Inc. | Touch event model programming interface |
| EP2405345A3 (fr) * | 2008-03-04 | 2012-02-15 | Apple Inc. | Touch event model programming interface |
| JP2010521037A (ja) * | 2008-03-04 | 2010-06-17 | Apple Inc. | Touch event model programming interface |
| US9971502B2 (en) | 2008-03-04 | 2018-05-15 | Apple Inc. | Touch event model |
| EP2405343A3 (fr) * | 2008-03-04 | 2012-02-15 | Apple Inc. | Touch event model programming interface |
| EP2405346A3 (fr) * | 2008-03-04 | 2012-02-15 | Apple Inc. | Touch event model programming interface |
| US8411061B2 (en) | 2008-03-04 | 2013-04-02 | Apple Inc. | Touch event processing for documents |
| US12236038B2 (en) | 2008-03-04 | 2025-02-25 | Apple Inc. | Devices, methods, and user interfaces for processing input events |
| CN103761044A (zh) * | 2008-03-04 | 2014-04-30 | Apple Inc. | Touch event model programming interface |
| US8723822B2 (en) | 2008-03-04 | 2014-05-13 | Apple Inc. | Touch event model programming interface |
| US10936190B2 (en) | 2008-03-04 | 2021-03-02 | Apple Inc. | Devices, methods, and user interfaces for processing touch events |
| CN103809908A (zh) * | 2008-03-04 | 2014-05-21 | Apple Inc. | Touch event model programming interface |
| US9798459B2 (en) | 2008-03-04 | 2017-10-24 | Apple Inc. | Touch event model for web pages |
| WO2009111469A2 (fr) | 2008-03-04 | 2009-09-11 | Apple Inc. | Touch event model programming interface |
| US8836652B2 (en) | 2008-03-04 | 2014-09-16 | Apple Inc. | Touch event model programming interface |
| CN103809908B (zh) * | 2008-03-04 | 2018-02-09 | Apple Inc. | Touch event model programming interface |
| US11740725B2 (en) | 2008-03-04 | 2023-08-29 | Apple Inc. | Devices, methods, and user interfaces for processing touch events |
| US9690481B2 (en) | 2008-03-04 | 2017-06-27 | Apple Inc. | Touch event model |
| US9720594B2 (en) | 2008-03-04 | 2017-08-01 | Apple Inc. | Touch event model |
| DE102008027954A1 (de) * | 2008-04-25 | 2009-11-26 | BenQ Corp., Neihu | Interactive electronic device and interaction method thereof |
| US8566717B2 (en) | 2008-06-24 | 2013-10-22 | Microsoft Corporation | Rendering teaching animations on a user-interface display |
| WO2010027803A1 (fr) * | 2008-08-27 | 2010-03-11 | Apple Inc. | Omnidirectional gesture detection |
| US8645858B2 (en) | 2008-09-12 | 2014-02-04 | Koninklijke Philips N.V. | Navigating in graphical user interface on handheld devices |
| CN102150160B (zh) * | 2008-09-12 | 2015-05-20 | Koninklijke Philips Electronics N.V. | Navigating in graphical user interface on handheld devices |
| JP2012502386A (ja) * | 2008-09-12 | 2012-01-26 | Koninklijke Philips Electronics N.V. | Navigating in graphical user interface on handheld devices |
| KR101611601B1 (ko) * | 2008-09-12 | 2016-04-12 | Koninklijke Philips N.V. | Navigating in graphical user interface on handheld devices |
| CN102150160A (zh) * | 2008-09-12 | 2011-08-10 | Koninklijke Philips Electronics N.V. | Navigating in graphical user interface on handheld devices |
| WO2010029506A1 (fr) * | 2008-09-12 | 2010-03-18 | Koninklijke Philips Electronics N.V. | Navigating in graphical user interface on handheld devices |
| EP2166436A1 (fr) * | 2008-09-12 | 2010-03-24 | Samsung Electronics Co., Ltd. | Système de saisie basé sur un capteur de proximité et son procédé de fonctionnement |
| WO2010049877A1 (fr) * | 2008-10-27 | 2010-05-06 | Nokia Corporation | Methods and apparatuses for facilitating interaction with touch screen apparatuses |
| EP2214090A3 (fr) * | 2009-01-28 | 2013-06-05 | Sony Corporation | Information processing apparatus, animation method, and program |
| US8487938B2 (en) | 2009-01-30 | 2013-07-16 | Microsoft Corporation | Standard Gestures |
| US9524094B2 (en) | 2009-02-20 | 2016-12-20 | Nokia Technologies Oy | Method and apparatus for causing display of a cursor |
| US11755196B2 (en) | 2009-03-16 | 2023-09-12 | Apple Inc. | Event recognition |
| US9285908B2 (en) | 2009-03-16 | 2016-03-15 | Apple Inc. | Event recognition |
| US8566044B2 (en) | 2009-03-16 | 2013-10-22 | Apple Inc. | Event recognition |
| US9311112B2 (en) | 2009-03-16 | 2016-04-12 | Apple Inc. | Event recognition |
| US10719225B2 (en) | 2009-03-16 | 2020-07-21 | Apple Inc. | Event recognition |
| US8566045B2 (en) | 2009-03-16 | 2013-10-22 | Apple Inc. | Event recognition |
| US9483121B2 (en) | 2009-03-16 | 2016-11-01 | Apple Inc. | Event recognition |
| US9965177B2 (en) | 2009-03-16 | 2018-05-08 | Apple Inc. | Event recognition |
| US8285499B2 (en) | 2009-03-16 | 2012-10-09 | Apple Inc. | Event recognition |
| US8428893B2 (en) | 2009-03-16 | 2013-04-23 | Apple Inc. | Event recognition |
| US12265704B2 (en) | 2009-03-16 | 2025-04-01 | Apple Inc. | Event recognition |
| US11163440B2 (en) | 2009-03-16 | 2021-11-02 | Apple Inc. | Event recognition |
| US8493364B2 (en) | 2009-04-30 | 2013-07-23 | Motorola Mobility Llc | Dual sided transparent display module and portable electronic device incorporating the same |
| US8910086B2 (en) | 2009-06-12 | 2014-12-09 | Volkswagen Ag | Method for controlling a graphical user interface and operating device for a graphical user interface |
| WO2010142543A1 (fr) * | 2009-06-12 | 2010-12-16 | Volkswagen Ag | Method for controlling a graphical user interface and operating device for a graphical user interface |
| WO2011025642A1 (fr) * | 2009-08-31 | 2011-03-03 | Qualcomm Incorporated | User interface methods providing searching functionality |
| CN102483679B (zh) * | 2009-08-31 | 2014-06-04 | Qualcomm Incorporated | User interface methods providing search functionality |
| CN102483679A (zh) * | 2009-08-31 | 2012-05-30 | Qualcomm Incorporated | User interface methods providing search functionality |
| KR101675178B1 (ko) * | 2009-09-02 | 2016-11-10 | Amazon Technologies, Inc. | Touch-screen user interface |
| KR20120073223A (ko) * | 2009-09-02 | 2012-07-04 | Amazon Technologies, Inc. | Touch-screen user interface |
| WO2011028944A1 (fr) | 2009-09-02 | 2011-03-10 | Amazon Technologies, Inc. | Touch-screen user interface |
| EP2473897A4 (fr) * | 2009-09-02 | 2013-01-23 | Amazon Tech Inc | Touch-screen user interface |
| EP2306288A1 (fr) | 2009-09-25 | 2011-04-06 | Research In Motion Limited | Dispositif électronique incluant un dispositif d'entrée sensible au toucher et procédé de contrôle correspondant |
| EP2513760A4 (fr) * | 2009-12-18 | 2016-01-06 | Synaptics Inc | Procédé et appareil pour changement de modes de fonctionnement |
| US9465532B2 (en) | 2009-12-18 | 2016-10-11 | Synaptics Incorporated | Method and apparatus for operating in pointing and enhanced gesturing modes |
| US9733812B2 (en) | 2010-01-06 | 2017-08-15 | Apple Inc. | Device, method, and graphical user interface with content display modes and display rotation heuristics |
| US9684521B2 (en) | 2010-01-26 | 2017-06-20 | Apple Inc. | Systems having discrete and continuous gesture recognizers |
| US10732997B2 (en) | 2010-01-26 | 2020-08-04 | Apple Inc. | Gesture recognizers with delegates for controlling and modifying gesture recognition |
| US12061915B2 (en) | 2010-01-26 | 2024-08-13 | Apple Inc. | Gesture recognizers with delegates for controlling and modifying gesture recognition |
| JP2011227828A (ja) * | 2010-04-22 | 2011-11-10 | Canon Inc | Information processing apparatus, processing method therefor, and program |
| US8552999B2 (en) | 2010-06-14 | 2013-10-08 | Apple Inc. | Control selection approximation |
| US10216408B2 (en) | 2010-06-14 | 2019-02-26 | Apple Inc. | Devices and methods for identifying user interface objects based on view hierarchy |
| RU2623198C2 (ru) * | 2011-08-02 | 2017-06-27 | МАЙКРОСОФТ ТЕКНОЛОДЖИ ЛАЙСЕНСИНГ, ЭлЭлСи | Жест скольжения по диагонали для выбора и перестановки |
| CN103106023A (zh) * | 2011-09-23 | 2013-05-15 | 三星电子株式会社 | 用于控制便携式终端中的显示尺寸的装置和方法 |
| EP2573662A1 (fr) * | 2011-09-23 | 2013-03-27 | Samsung Electronics Co., Ltd. | Appareil et procédé de contrôle de taille d'écran dans un terminal portable |
| US9471218B2 (en) | 2011-09-23 | 2016-10-18 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling display size in portable terminal |
| CN103959211A (zh) * | 2012-05-21 | 2014-07-30 | Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. | Terminal and method for switching application function interfaces |
| EP2674845A1 (fr) * | 2012-06-14 | 2013-12-18 | ICT Automatisering N.V. | User interaction via a touch screen |
| CN104756184B (zh) * | 2012-08-30 | 2018-12-18 | Google LLC | Techniques for selecting a language for automatic speech recognition |
| CN104756184A (zh) * | 2012-08-30 | 2015-07-01 | Google Inc. | Techniques for selecting a language for automatic speech recognition |
| US9733716B2 (en) | 2013-06-09 | 2017-08-15 | Apple Inc. | Proxy gesture recognizer |
| US11429190B2 (en) | 2013-06-09 | 2022-08-30 | Apple Inc. | Proxy gesture recognizer |
| US12379783B2 (en) | 2013-06-09 | 2025-08-05 | Apple Inc. | Proxy gesture recognizer |
| US10613732B2 (en) | 2015-06-07 | 2020-04-07 | Apple Inc. | Selecting content items in a user interface display |
| US11918857B2 (en) | 2016-06-11 | 2024-03-05 | Apple Inc. | Activity and workout updates |
| US10272294B2 (en) | 2016-06-11 | 2019-04-30 | Apple Inc. | Activity and workout updates |
| US12274918B2 (en) | 2016-06-11 | 2025-04-15 | Apple Inc. | Activity and workout updates |
| US11161010B2 (en) | 2016-06-11 | 2021-11-02 | Apple Inc. | Activity and workout updates |
| US11660503B2 (en) | 2016-06-11 | 2023-05-30 | Apple Inc. | Activity and workout updates |
| US11148007B2 (en) | 2016-06-11 | 2021-10-19 | Apple Inc. | Activity and workout updates |
| US11216119B2 (en) | 2016-06-12 | 2022-01-04 | Apple Inc. | Displaying a predetermined view of an application |
| US12256205B2 (en) | 2018-08-24 | 2025-03-18 | Apple Inc. | Wireless headphone interactions |
| US11153687B1 (en) | 2018-08-24 | 2021-10-19 | Apple Inc. | Wireless headphone interactions |
| US11863954B2 (en) | 2018-08-24 | 2024-01-02 | Apple Inc. | Wireless headphone interactions |
| US11194467B2 (en) | 2019-06-01 | 2021-12-07 | Apple Inc. | Keyboard management user interfaces |
| US11842044B2 (en) | 2019-06-01 | 2023-12-12 | Apple Inc. | Keyboard management user interfaces |
| US11620046B2 (en) | 2019-06-01 | 2023-04-04 | Apple Inc. | Keyboard management user interfaces |
Also Published As
| Publication number | Publication date |
|---|---|
| AU7080801A (en) | 2002-02-05 |
| WO2002008881A3 (fr) | 2002-08-22 |
| GB0017793D0 (en) | 2000-09-06 |
| AU2001270808A1 (en) | 2002-02-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2002008881A2 (fr) | Human-computer interface | |
| US7849421B2 (en) | Virtual mouse driving apparatus and method using two-handed gestures | |
| JP5249788B2 (ja) | Gesturing with a multipoint sensing device | |
| US7461352B2 (en) | Voice activated system and methods to enable a computer user working in a first graphical application window to display and control on-screen help, internet, and other information content in a second graphical application window | |
| JP5456529B2 (ja) | Method and computer system for manipulating graphical user interface objects | |
| US8130211B2 (en) | One-touch rotation of virtual objects in virtual workspace | |
| US8159469B2 (en) | User interface for initiating activities in an electronic device | |
| US9239673B2 (en) | Gesturing with a multipoint sensing device | |
| US9292112B2 (en) | Multimodal interface | |
| US9086794B2 (en) | Determining gestures on context based menus | |
| US7533352B2 (en) | Method and apparatus for providing context menus on a hand-held device | |
| CN104216600B (zh) | Method for providing functions of an application and touch-screen intelligent terminal device | |
| EP0660218A1 (fr) | Graphical keyboard | |
| US20120119995A1 (en) | Keyboardless text entry | |
| JP2004152217A (ja) | Display device with touch panel | |
| EP3491506B1 (fr) | Systems and methods for a touchscreen user interface for a collaborative editing tool | |
| CN102822771A (zh) | Eye tracker based contextual actions | |
| US11150797B2 (en) | Method and device for gesture control and interaction based on touch-sensitive surface to display | |
| JP2010517197A (ja) | Gestures with multipoint sensing devices | |
| WO2020232912A1 (fr) | Touch screen operation method, electronic device, and storage medium | |
| JP2009140390A (ja) | Pointing device and fingerprint authentication semiconductor circuit | |
| WO2018156912A1 (fr) | Gaze interaction system | |
| CN106293430A (zh) | Virtual mouse control system and control method thereof | |
| US10228892B2 (en) | Information handling system management of virtual input device interactions | |
| JPH09237151A (ja) | Graphical user interface | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
| AK | Designated states |
Kind code of ref document: A3 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A3 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
| REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
| 122 | Ep: pct application non-entry in european phase | ||
| ENP | Entry into the national phase |
Ref document number: 2003134363 Country of ref document: RU Kind code of ref document: A Format of ref document f/p: F |
|
| NENP | Non-entry into the national phase |
Ref country code: JP |