
HK1164498B - Touch event model - Google Patents


Info

Publication number
HK1164498B
Authority
HK
Hong Kong
Prior art keywords
touch
view
exclusive
touches
views
Prior art date
Application number
HK12105027.2A
Other languages
Chinese (zh)
Other versions
HK1164498A1 (en)
Inventor
J.C. Beaver
A. Platzer
Original Assignee
Apple Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US12/042,318 external-priority patent/US8645827B2/en
Application filed by Apple Inc.
Publication of HK1164498A1 publication Critical patent/HK1164498A1/en
Publication of HK1164498B publication Critical patent/HK1164498B/en

Abstract

The present invention relates to a touch event model. Embodiments of the present invention are directed to methods, software, devices and APIs for defining touch events for application level software. Furthermore, some embodiments are directed to simplifying the recognition of single and multiple touch events for applications running in multi-touch enabled devices. To simplify the recognition of single and multiple touch events, each view within a particular window can be configured as either a multi-touch view or a single touch view. Furthermore, each view can be configured as either an exclusive or a non-exclusive view. Depending on the configuration of a view, touch events in that and other views can be either ignored or recognized. Ignored touches need not be sent to the application. Selectively ignoring touches can allow simpler software elements that do not take advantage of advanced multi-touch features to execute on the same device, and at the same time, as more complex software elements.

Description

Touch event model
The present application is a divisional application of the patent application with application number 200910118596.4, filed March 4, 2009, and entitled "Touch Event Model".
Technical Field
The present application relates generally to multi-point and multi-touch enabled devices, and more particularly to recognizing single and multiple point and touch events in multi-point and multi-touch enabled devices.
Background
Multi-touch enabled devices are known in the art. A multi-touch enabled device is a device that can sense multiple touches simultaneously. Thus, a multi-touch enabled device can, for example, sense two touch events caused by two fingers pressing against the panel at two different locations on a multi-touch panel at the same time. An example of a multi-touch enabled device is discussed in U.S. patent application No. 11/649,998, entitled "Multi-Touch Sensor Detection and Demodulation," filed January 3, 2007, which is hereby incorporated by reference in its entirety. Multi-point enabled devices define a broader category that includes multi-touch enabled devices as well as similar devices, such as the multi-proximity sensor devices discussed in the above-mentioned U.S. patent application 11/649,998.
While the benefits of multi-touch enabled interfaces are known, these devices present certain interface design challenges. Existing interface designs have assumed that a single pointing user input device, which specifies a single location at a time, is being used. Examples include a mouse or a touch pad.
More specifically, many existing graphical user interface (GUI) systems provide user interfaces in which various portions of a display are associated with separate software elements. Thus, for example, a portion of a display may be associated with a window, and the window may be associated with a particular software application and/or process. A mouse may be used to interact with the window and the application or process associated with it. The mouse pointer may then be moved to another window to interact with another application or process. Because only a single pointing device is being used, interaction with only a single window and application or process can occur at a time.
The assumption of a single interaction with a window at any one time can greatly simplify user interface design. An application and/or process running within a window may operate under the assumption that a detected interaction with that particular window is the only input being received. Thus, the application and/or process need not concern itself with the possibility of other user interactions occurring in other portions of the display outside of the window. Furthermore, a window may be additionally partitioned into various elements, where each element is associated with a particular portion of the window. Each element may be implemented by a separate software element (e.g., a software object). Again, each software object can process interactions that occur in its associated area without concerning itself with interactions that may be simultaneously occurring elsewhere.
On the other hand, if a multi-touch interface is being used, two or more touch events can simultaneously occur at different portions of the display. This can make it difficult to split the display into different portions and have different independent software elements process the interactions associated with each portion. Furthermore, even if the display is split into different portions, multiple touch events can occur in a single portion. Therefore, a single application, process, or other software element may need to process multiple simultaneous touch events. However, if every application, process, or other software element needs to consider multiple touch interactions, then the overall cost and complexity of software running on the multi-touch enabled device may be undesirably high. More specifically, each application may need to process large amounts of incoming touch data. This can require high complexity in applications of seemingly simple functionality, and can make programming for a multi-touch enabled device generally difficult and expensive. Also, existing software that assumes a single pointing device can be very difficult to convert or port to a version that can operate on a multi-point or multi-touch enabled device.
Disclosure of Invention
Embodiments of the present invention are directed to methods, software, devices and APIs for defining touch events for application level software. Furthermore, certain embodiments are directed to simplifying the identification of single and multiple touch events for applications running in a multi-touch enabled device. To simplify the identification of single and multiple touch events, each view within a particular window may be configured as either a multi-touch view or a single-touch view. Furthermore, each view may be configured as an exclusive (exclusive) or non-exclusive (non-exclusive) view. Depending on the configuration of the view, touch events in that view and other views may be ignored or identified. The ignored touches need not be sent to the application. By selectively ignoring touches, simpler applications or software elements that do not take advantage of advanced multi-touch features can be made to execute on the same device (even at the same time) as more complex applications or software elements.
Drawings
FIG. 1 is a diagram of an illustrative input/output processing stack of a multi-touch capable device (multi-touch capable device) in accordance with one embodiment of the invention.
FIG. 2A is a diagram of an exemplary multi-touch enabled device, according to one embodiment of the invention.
FIG. 2B is a diagram of another exemplary multi-touch enabled device, in accordance with one embodiment of the present invention.
FIG. 3 is a diagram of an exemplary multi-touch display in accordance with one embodiment of the present invention.
FIG. 4 is a flow chart showing an exemplary method of operation of a multi-touch flag in accordance with one embodiment of the invention.
Fig. 5A and 5B are flow diagrams illustrating an exemplary method of operation of an exclusive touch flag according to one embodiment of the invention.
Detailed Description
In the following description of the preferred embodiments, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the preferred embodiments of the present invention.
The present application relates to a touch event model for simplifying the recognition of single and multiple touch events for user interface applications running in multi-point and multi-touch enabled devices. To simplify the identification of single and multiple touch events, each view within a particular window may be configured as either a multi-touch view or a single-touch view. Further, each view may be configured as an exclusive or non-exclusive view. Depending on the configuration of the view, touch events in the view and other views may be ignored or identified.
Although embodiments of the present invention are described and illustrated herein with respect to particular multi-touch capable devices, it should be understood that embodiments of the invention are not limited to such devices, but may be broadly applied to any multi-touch capable device. Furthermore, embodiments of the invention are not limited to multi-touch devices, but also include multi-point devices, such as the multi-proximity sensor devices discussed in the aforementioned U.S. application No. 11/649,998.
Certain embodiments relate to APIs. Generally, an API is a source code interface that a computer system provides in order to support requests for services from software operations. An API is specified in terms of a programming language that can be interpreted or compiled when a system is built, rather than as an explicit low-level description of how data is laid out in memory. The software that provides the functionality of an API is said to be an implementation of the API. Various devices, such as computer systems, electronic devices, portable devices, and handheld devices, have software applications. The API interfaces between the software applications and user interface software to provide a user of the device with certain features and operations.
At least some embodiments of the invention can include one or more APIs in an environment in which user interface software interacts with software applications. Various function calls or messages are transferred between the user interface software and the software applications via the APIs. Transferring a function call or message can include issuing, initiating, invoking, or receiving the function call or message. Example APIs can include those that transfer touch event information. An API may also implement functions having parameters, variables, or pointers. An API may receive the parameters disclosed herein or other combinations of parameters. In addition to the APIs disclosed, other APIs can, individually or in combination, perform functions similar to those of the disclosed APIs.
FIG. 1 is a diagram of the input/output processing stack of an exemplary multi-touch capable device according to some embodiments of the invention. Hardware 100 may be provided at the base level of the multi-touch enabled device. It may include various hardware interface elements, such as a multi-touch enabled panel 101 and/or an accelerometer 102. The multi-touch panel may include a display and a panel that senses multiple touches simultaneously. An example of such a panel is discussed in more detail in the above-mentioned application 11/649,998. The accelerometer may be a hardware device that senses acceleration of the multi-touch enabled device. It can be used to sense when the device is being moved, how it is being moved, whether it is being dropped, etc. Other hardware interface devices, such as gyroscopes, speakers, buttons, infrared (IR) sensors, etc. (not shown), may also be included.
Communicating with the hardware 100 may be one or a set of drivers 103. The driver may receive and process input data received from the hardware. A core Operating System (OS)104 may communicate with the one or more drivers. The core OS may process raw input data received from the one or more drivers. In some embodiments, the driver may be considered part of the core OS.
A set of OS application programming interfaces (APIs) 105 may communicate with the core OS. These APIs can be a set of APIs that are usually included with operating systems (such as, for example, Linux or UNIX APIs). A set of user interface APIs 106 (UI APIs) can include a set of APIs designed for use by applications running on the device. The UI APIs may utilize the OS APIs. Applications 107 running on the device can utilize the APIs of the UI APIs in order to communicate with the user. The UI APIs can, in turn, communicate with lower level elements, ultimately communicating with multi-touch panel 101 and various other user interface hardware. While each layer can utilize the layer beneath it, that is not always required. For example, in some embodiments, applications 107 may occasionally communicate with OS APIs 105. APIs 105 and 106 may comprise respective sets of application programming interfaces as well as their respective implementations. For example, UI APIs 106 may also include user interface (UI) software for implementing the UI APIs.
FIGS. 2A and 2B are diagrams of two exemplary multi-touch enabled devices according to some embodiments of the invention. FIG. 2A shows exemplary device 200. Device 200 may include a CPU 201 and a memory 202 connected through a bus 204. The bus may also connect to a multi-touch display 203. The multi-touch display may include a multi-touch panel and a display; combining the multi-touch panel and the display forms the multi-touch display 203. The multi-touch display may correspond to multi-touch panel 101 within hardware layer 100 of FIG. 1. The CPU may be used to execute software stored in the memory. The software executed by the CPU may include layers 103-107 of FIG. 1. Thus, the software may include drivers, an OS, various APIs, and applications.
FIG. 2B shows an alternative device 210. Device 210 may be similar to device 200. However, device 210 may include a separate multi-touch panel (212) and display (211), instead of the single combined unit of device 200. Thus, with device 210, a user does not interact with the display by touching it; instead, the user touches the separate multi-touch panel. Device 210 may be, for example, a laptop computer that includes a multi-touch trackpad (with the multi-touch panel serving as the trackpad).
The multi-touch panel and/or display of fig. 2A and 2B may also utilize other sensor technologies, such as proximity sensing discussed in the aforementioned U.S. application No.11/649,998. Generally, a multi-point panel (multi-point panel) and/or a display may be used for the apparatus of fig. 2A and 2B. The multi-point panel and/or display may be characterized by various types of sensor technologies. For example, it may have the following features: pure multi-touch technology (thus yielding a multi-touch panel and/or display), multi-proximity sensing technology, a combination of the two, or other types of multi-touch technology.
The devices of fig. 2A and 2B may include a variety of different types of multi-touch enabled devices. For example, they may include mobile phones, portable video game consoles, electronic music players, electronic books, PDAs, electronic organizers (organizers), electronic mail appliances, laptop or other personal computers, kiosk computers (kiosks), vending machines, and the like.
FIG. 3 is an illustration of an exemplary multi-touch display 300. The multi-touch display may be display 203 of FIG. 2A or display 211 of FIG. 2B. The display may display various user interface elements (such as graphics, etc.) generated by software running in a device incorporating the display (e.g., device 200 of FIG. 2A or device 210 of FIG. 2B). A user may interact with the various user interface elements in order to interact with the software. When using the device of FIG. 2A, the user may interact with the user interface elements by touching them directly on the display. When using the device of FIG. 2B, the user may touch the separate multi-touch panel 212 in order to move and control one or more cursors on display 211, the cursor(s) being used to interact with the software.
The user interface elements presented on display 300 may include one or more views. Each view may represent a graphical user interface element that is handled by a separate software element. The separate software elements may include different applications, different processes or threads (even if within the same application), different routines or subroutines, different objects, etc. In some embodiments, each separate software element may create the user interface elements for its respective portion of the display, and may also receive and handle touch input for that portion of the display. The touch input may be processed by the various layers discussed in connection with FIG. 1, which may then send the processed touch input data to the software element (which may be part of applications 107). The processed touch input data may be referred to as one or more touch events, and may be in a format that is easier to handle than the raw touch data generated by the multi-touch panel. For example, each touch event may include a set of coordinates at which a touch is currently occurring. In some embodiments, the set of coordinates may correspond to the centroid of the touch. For brevity and simplicity, the discussion below may refer to a software element associated with a view simply by referring to the view itself.
The views may be nested. In other words, one view may contain other views. Thus, the software elements associated with the first view may include or be linked to one or more software elements associated with views within the first view. Some views may be associated with applications, while other views may be associated with high-level OS elements, such as graphical user interfaces, window managers, and so forth.
The exemplary display of fig. 3 shows a music browsing application. The display may include a status bar view 301 that indicates the overall status of the device. The status bar view may be part of the OS. A title view 302 may also be included. The title view itself may include several other views, such as a center title view 310, a back button 312, and a forward button 311. A table view 303 may also be included. The form view 303 may include one or more form cell views, such as form cell view 304. It can be seen that in one embodiment, the table cell view can be the track title. A button bar view 305 may additionally be included. The button bar view may include buttons 306-309.
Each view and its associated software elements are capable of receiving, processing and manipulating touch events that occur on the particular view. Thus, for example, if a user touches the track title view 304, the software element associated with the view may receive a touch event indicating that the view has been touched, process the event, and respond accordingly. For example, the software element may change the graphical representation of the view (i.e., highlight the view), and/or cause other actions, such as playing a track associated with the touched view.
In some embodiments, touch events are processed at the lowest level of the view hierarchy. Thus, for example, if a user touches title bar view 302, the touch event need not be processed directly by the software element associated with the title bar view, but may instead be processed by the software element associated with the view included within the title bar view where the touch occurred (i.e., the software element associated with one of views 310, 311, and 312). In some embodiments, some higher level views may also handle touch events. In addition, various software elements that are not associated with the view being touched may nevertheless be alerted, or may discover, that the view is being touched.
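The lowest-level dispatch described above amounts to a recursive hit test over the nested views. A minimal sketch follows; the names and the simple rectangle geometry are illustrative assumptions, not the actual structures of the patent's Appendix A:

```python
# Hypothetical sketch: deliver a touch to the lowest view in the
# hierarchy that contains the touch point.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class View:
    name: str
    x: int
    y: int
    w: int
    h: int
    subviews: List["View"] = field(default_factory=list)

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def hit_test(view: View, px: int, py: int) -> Optional[View]:
    """Return the deepest view containing the point, or None."""
    if not view.contains(px, py):
        return None
    for sub in view.subviews:        # prefer children over the parent itself
        hit = hit_test(sub, px, py)
        if hit is not None:
            return hit
    return view

# A title bar like view 302, containing views analogous to 312, 310, and 311.
title_bar = View("title_bar", 0, 0, 320, 40, [
    View("back_button", 0, 0, 60, 40),
    View("center_title", 60, 0, 200, 40),
    View("forward_button", 260, 0, 60, 40),
])
print(hit_test(title_bar, 30, 20).name)   # back_button, not title_bar
```

A touch on the back button thus reaches the button's software element rather than the title bar's, matching the bottom-up processing described above.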
Since display 300 is a multi-touch display, multiple touches may occur at the same time. The multiple touches may occur in the same view, or in two or more different views. Furthermore, the user may perform gestures that have a predefined meaning (e.g., pressing down one or more fingers and moving them). Multi-touch gestures are discussed in more detail in U.S. patent application No. 10/903,964, entitled "Gestures for Touch Sensitive Input Devices," filed July 30, 2004, which is hereby incorporated by reference in its entirety.
A view may receive touch events that begin within the view. If the user keeps pressing a certain finger on the display, the view may then receive multiple touch events indicating a persistent touch. If the user moves a depressed finger, the view may receive multiple touch events indicating touch movement. If the user moves the depressed finger out of view, the view may still receive touch events associated with the movement (and the view to which the finger has moved need not receive those touch events). Thus, a view may receive an event associated with a gesture or movement that begins at the view, even if the event continues outside of the view.
A touch may refer to an action that begins when a finger, another body part, or an object is pressed against the surface of the multi-touch panel (or multi-touch display) and ends when the finger or object is lifted off the display. Thus, the touch can include moving the finger or object, or holding the finger or object at the same position, for a period of time.
Touch events may be sent to a view (or the software element implementing the view) through one or more APIs (and their corresponding implementations). One example of an API for handling touch events is provided in Appendix A below. According to the API of Appendix A, the API can send to each view a touch event data structure that includes one or more single-touch data structures (or touch data structures). Each touch event data structure may define the current state of all touches occurring on the view at a particular time. The respective touch data structures within a touch event data structure can define the current states of one or more respective single touches at that particular time. Thus, if three touches are occurring in a particular view at a particular time, a touch event data structure comprising three touch data structures defining the states of the three touches can be sent to the view. In some embodiments, a touch data structure can be sent even after its associated touch is no longer occurring, in order to alert the view that the touch has terminated.
As noted above, a touch may include acts that need not occur at a single instant. For example, a touch can include the act of moving a finger, or holding a finger against the display, for a period of time. A touch data structure, however, defines the state of a touch at a particular time. Therefore, multiple touch data structures may be associated with a single touch, thus defining that single touch at different points in time.
Each touch data structure may include various fields. A "first touch for view" field may indicate whether the touch data structure defines the first touch for the particular view (since the software element implementing the view was instantiated). A "timestamp" field may indicate the particular time that the touch data structure relates to.
The "info" field may be used to indicate whether the touch is a basic gesture (rudimentary gesturing). For example, the "info" field may indicate whether the touch is a swipe (swipe), and if so, the direction of the swipe is determined. A swipe is a motion that rapidly drags one or more fingers in a linear direction. The API implementation may determine whether a touch is a swipe and may pass this information to the application through the "info" field, thereby relieving the application of some of the data processing burden necessary if the touch is a swipe.
The "tap count" field may indicate how many taps are sequentially made at the touch position. A tap may be defined as an action of quickly pressing and lifting a finger at a particular location on a panel. If the finger is again pressed and released in rapid succession at the same location on the panel, multiple sequential taps may occur. The API implementation can then count taps for various applications and relay this information through the "tap count" field. Sometimes, multiple taps of the same location may be considered a very useful and easy to remember command for a touch-enabled interface. Thus by counting taps, the API can also relieve some of the data processing burden of the application.
The "phase" field may indicate the particular phase in which the touch is currently located. The phase field may have various values, such as a "touch start phase" that may indicate that the touch data structure defines a new touch that has not been referenced by a previous touch data structure. The "touch move phase" value may indicate that the defined touch has moved from a location defined in the previous touch data structure. The "touch fix phase" value may indicate that the touch was left in the same location from the time the last touch data structure for the touch was generated. The "touch end phase" value may indicate that the touch has ended (e.g., the user has lifted their finger off the surface of the multi-touch display). The "touch cancel stage" value may indicate that the touch has been cancelled by the device. The cancelled touch may be a touch that does not have to be ended by the user, but can be determined by the device to be ignored. For example, the device may determine that the touch was inadvertently generated (i.e., treated as a result of the portable multi-touch enabled device being placed in a pocket) and thus ignore the touch. Each value of the "phase field" may be an integer.
Thus, each touch data structure can define what is happening with a touch at a particular time (e.g., whether the touch is stationary, being moved, etc.) as well as other information associated with the touch (such as its position). Accordingly, each touch data structure can define the state of a particular touch at a particular time. One or more touch data structures referencing the same time can be added to a touch event data structure, which can define the states of all touches a particular view is receiving at a certain time (as noted above, some touch data structures may also reference touches that have ended and are no longer being received). Multiple touch event data structures can be sent to the software implementing a view over time, in order to provide the software with continuous information describing the touches occurring on the view. One or more elements of the device, such as hardware 100, drivers 103, core OS 104, OS APIs 105, and UI APIs 106, can detect touches on multi-touch panel 101 and generate the various touch event data structures defining those touches.
The ability to handle multiple touches and multi-touch gestures can add complexity to the various software elements. In some cases, such additional complexity may be necessary to implement advanced and desirable interface features. For example, a game may require the ability to handle multiple simultaneous touches occurring in different views, as games often require the pressing of multiple buttons at the same time. However, some simpler applications and/or views (and their associated software elements) do not require advanced interface features. For example, a simple button (such as button 306) can be satisfactorily operated with single touches, so multi-touch functionality may be unnecessary. In these cases, the underlying OS may send unnecessary or excessive touch data (e.g., multi-touch data) to a software element associated with a view that is intended to be operated by single touches only (such as a button). Because the software element may need to process this data, it may need to feature all the complexity of a software element that handles multiple touches, even though it is associated with a view for which only single touches are relevant. This can increase the cost of developing software for the device, because software elements that have traditionally been very easy to program in a mouse interface environment (i.e., various buttons, etc.) may be much more complex in a multi-touch environment.
Embodiments of the present invention aim to solve the above problems by selectively providing touch data to various software elements according to predetermined settings. Thus, a simpler interface may be provided for selected software elements, while other software elements may utilize more complex multi-touch inputs.
Embodiments of the present invention may rely on one or more markers associated with one or more views, where each marker and combinations thereof indicate a touch event processing mode for a particular view. For example, multi-touch and/or exclusive touch flags may be used. The multi-touch indicia may indicate whether a particular view has the ability to receive multiple simultaneous touches. The exclusive touch flag may indicate whether a particular view allows other views to receive a touch event while it is receiving a touch event.
FIG. 4 is a flow chart showing the operation of the multi-touch flag according to one embodiment of the invention. At step 400, a user may touch a view at a first location within the view. It may be assumed that no other touches are present on the multi-touch display when the touch of step 400 is received. At step 402, the OS may send a touch event defining the received touch to the software element associated with the location that was touched.
At step 404, the user may touch the view at a second location while not releasing the first touch (i.e., while holding the finger down on the first location). Thus, for example, the user may touch the right portion of the form cell view 304 at step 400 and touch the left portion of the form cell view 304 without releasing their finger from the right portion at step 404. Thus, the second touch is simultaneous with the first touch (thus taking advantage of the multi-touch capability of the display 300).
At step 406, the OS may determine whether the multi-touch flag for the view being touched is set. If the multi-touch flag is set, the view can be a view that is capable of handling multiple simultaneous touches. Thus, at step 408, a second touch event for the second touch may be sent to the software element associated with the view. It should be noted that new instances of the first touch event may also be sent, indicating that the first touch event is still taking place (i.e., the finger at the first location has not been lifted). The new instances of the first touch event may specify different locations if the finger at the first location is moved away from that location without being lifted (i.e., if it is being "dragged" over the surface of the display).
On the other hand, if the multi-touch flag is not set, the OS may ignore or block the second touch. Ignoring the second touch may mean that no touch events associated with the second touch are sent to the software element associated with the touched view. In some embodiments, the OS may alert other software elements of the second touch, if necessary.
Thus, for relatively simple software elements that are programmed to handle only a single touch at a time, embodiments of the present invention may allow those elements to keep their multi-touch flags unasserted and thereby ensure that touch events that are part of multiple simultaneous touches are not sent to them. Meanwhile, more complex software elements that can handle multiple simultaneous touches may assert their multi-touch flags and receive touch events for all touches that occur in their associated views. This reduces the cost of developing simple software elements while still providing advanced multi-touch functionality for more complex ones.
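The flow of FIG. 4 can be sketched in a few lines of Python. This is a hypothetical illustration only — the patent does not specify an implementation, and all class, attribute, and function names here (`View`, `multi_touch_enabled`, `dispatch_touch`) are invented for this sketch:

```python
class View:
    """Hypothetical view object; the patent does not define a concrete class."""
    def __init__(self, multi_touch_enabled=False):
        self.multi_touch_enabled = multi_touch_enabled  # the multi-touch flag
        self.received_events = []  # events delivered to the associated software element

def dispatch_touch(view, active_touches, new_touch):
    """Deliver new_touch unless it is an extra simultaneous touch on a
    single-touch view (the check of step 406)."""
    if not active_touches or view.multi_touch_enabled:
        view.received_events.append(new_touch)  # steps 402/408: send touch event
        active_touches.append(new_touch)
        return True
    return False  # flag unasserted: the second simultaneous touch is ignored

simple = View(multi_touch_enabled=False)
active = []
dispatch_touch(simple, active, "touch-1")  # first touch: delivered
dispatch_touch(simple, active, "touch-2")  # second simultaneous touch: ignored
```

With the flag unasserted, only the first touch reaches the view's software element; a view constructed with `multi_touch_enabled=True` would receive both.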
FIGS. 5A and 5B are flow charts illustrating an exemplary method of operating the exclusive touch flag according to one embodiment of the invention. At step 500, a user may touch a first view. At step 502, the OS may send a touch event to a first software element associated with the first view. At step 504, the user may touch a second view without releasing the first touch.
At step 506, the OS may check whether the exclusive touch flag for the first view is asserted. If the flag is asserted, the first view must receive touches exclusively, and no other touches may be sent to any other view. Thus, if the exclusive touch flag is set, the OS may ignore (or block) the second touch and not send it to any software element. If the exclusive touch flag is not set, the process may continue to step 510 in FIG. 5B.
At step 510, the OS may determine whether the exclusive touch flag for the second view is set. If that flag is set, the second view can only receive exclusive touch events. Thus, if another touch has already been received by another view (i.e., the first view), the second view cannot receive the touch event, and the OS may ignore the second touch (step 512). However, if the exclusive touch flag for the second view is not asserted, the OS may send the touch event associated with the second touch to the second view — more precisely, to the software element associated with the second view (step 514).
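The decision sequence of steps 506-514 can be sketched as follows. Again, this is a hypothetical illustration with invented names (`View`, `exclusive`, `dispatch_second_touch`), not code from the patent:

```python
class View:
    """Hypothetical view; `exclusive` stands in for the exclusive touch flag."""
    def __init__(self, name, exclusive=False):
        self.name = name
        self.exclusive = exclusive

def dispatch_second_touch(first_view, second_view, touch, delivered):
    """Decision sequence of FIGS. 5A-5B for a second, simultaneous touch."""
    if first_view.exclusive:     # step 506: first view holds touches exclusively
        return False             # step 508: ignore; send to no software element
    if second_view.exclusive:    # step 510: second view accepts only exclusive touches
        return False             # step 512: ignore
    delivered.setdefault(second_view.name, []).append(touch)  # step 514: deliver
    return True

delivered = {}
ok = dispatch_second_touch(View("a"), View("b"), "touch-2", delivered)
```

The second touch is delivered only when neither view demands exclusivity; asserting the flag on either view suppresses it.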
Thus, the exclusive touch flag may ensure that a view flagged as exclusive receives touch events only when it is the only view on the display receiving touch events. The exclusive touch flag can be very useful for simplifying application software running on a multi-touch enabled device. In some situations, allowing multiple views to receive touches simultaneously can create complex conflicts and errors; for example, pressing a delete-track button and a play-track button at the same time may cause an error. Avoiding such conflicts might otherwise require complex and expensive software. Embodiments of the present invention reduce the need for such software by providing the exclusive touch flag, which ensures that a view in which the flag is set receives touch events only while it is the only view receiving touch events. Alternatively, one or more views may have their exclusive touch flags unasserted, thereby allowing simultaneous touches on two or more of those views.
In some embodiments, the exclusive flag may represent exclusivity for the entire display. Thus, when the view with the exclusive flag set is receiving a touch event, all other views in the display are prevented from receiving any touch events. In alternative embodiments, the exclusive flag may represent exclusivity within a smaller area, such as within a single application or a single window. For example, when a first view whose exclusivity flag is set is receiving a touch event, it may prevent other views in the same window from receiving any touch events, but not prevent views in other windows.
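The two exclusivity scopes just described (display-wide versus per-window) might be modeled as below. The `scope` parameter and all other names are hypothetical, introduced only to contrast the two embodiments:

```python
class View:
    """Hypothetical view; `window` identifies its containing window."""
    def __init__(self, window, exclusive=False):
        self.window = window
        self.exclusive = exclusive

def blocked_by(candidate, active_view, scope="display"):
    """Whether `candidate` is barred from touch events while `active_view`
    (with its exclusive flag set) is receiving them."""
    if not active_view.exclusive or candidate is active_view:
        return False
    if scope == "display":
        return True  # display-wide exclusivity blocks every other view
    return candidate.window == active_view.window  # per-window exclusivity

active = View("w1", exclusive=True)
```

Under display-wide scope every other view is blocked; under window scope only views sharing `active`'s window are.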
The exclusive touch flag and the multi-touch flag may be combined. Thus, each of the one or more views being displayed may include both a multi-touch flag and an exclusive touch flag; in some embodiments, all displayed views include both. The value of one flag does not necessarily depend on the value of the other. A view with both flags asserted may allow multiple touches within the view but receive touches exclusively (i.e., touches on other views may be blocked while this view is receiving touches). A view with neither flag asserted may block multiple touches within the view but allow a single touch within it, even if touches in other views occur simultaneously. A view with the multi-touch flag unasserted and the exclusive touch flag asserted may allow only a single touch within the view, and only when no touch is occurring in any other view. A view with the multi-touch flag asserted and the exclusive touch flag unasserted may receive all touches directed at it.
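The four flag combinations above reduce to two independent checks on the receiving view. The sketch below is hypothetical (invented names, and it models only the receiving view's own flags, not blocking imposed by an exclusive view elsewhere):

```python
class View:
    def __init__(self, multi_touch=False, exclusive=False):
        self.multi_touch = multi_touch  # hypothetical flag attributes
        self.exclusive = exclusive

def accepts_touch(view, touches_in_view, touches_elsewhere):
    """Whether a new touch on `view` should yield a touch event, given counts
    of touches already inside the view and in other views."""
    if view.exclusive and touches_elsewhere > 0:
        return False  # exclusive view: no new touch while other views are touched
    if not view.multi_touch and touches_in_view > 0:
        return False  # single-touch view: at most one touch inside the view
    return True
```

Each of the four cases in the text corresponds to one `(multi_touch, exclusive)` combination; for instance, a view with neither flag set still accepts a single touch even while other views are being touched.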
Alternate embodiments may feature only one of the two flags (and its associated functionality). Thus, some embodiments may use only the multi-touch flag or only the exclusive touch flag. In some embodiments, different views may use different combinations of the flags.
The various functions performed by the OS in FIGS. 4, 5A, and 5B may instead be performed by other software, such as various utility software. These functions may be performed in software at any of layers 103-108 of FIG. 1. In alternative embodiments, these functions may even be performed by hardware 100.
An exemplary set of code is provided below, showing the methods of an exemplary software element associated with a view according to some embodiments of the invention. Those skilled in the art will recognize that other code may be used to implement the above-described functionality.
While the above discussion has centered on multi-touch displays and panels, the invention is not limited to multi-touch devices but may include the various multi-point devices discussed above (including, for example, multi-proximity sensor devices). For multi-point devices, multi-point and exclusive-point flags may be used. These flags may operate in a manner similar to the multi-touch and exclusive touch flags described above.
Although the present invention has been fully described herein with reference to the accompanying drawings and examples, it is to be noted that various changes and modifications will be apparent to those skilled in the art. Such changes and modifications are to be understood as included within the scope of the present invention as defined by the appended claims.
Appendix A
Exemplary UI API code
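The appendix's original code listing is not reproduced in this text. As a stand-in, the following sketch illustrates the kind of per-view API the description implies; every name here (`TouchableView` and its methods) is hypothetical and not taken from the patent:

```python
class TouchableView:
    """Hypothetical stand-in for a UI view exposing the two flags."""
    def __init__(self):
        self._multi_touch = False  # multi-touch flag, unasserted by default
        self._exclusive = False    # exclusive touch flag, unasserted by default

    # Accessors mirroring the assert/check operations discussed in the text.
    def set_multi_touch_enabled(self, flag):
        self._multi_touch = bool(flag)

    def is_multi_touch_enabled(self):
        return self._multi_touch

    def set_exclusive_touch(self, flag):
        self._exclusive = bool(flag)

    def is_exclusive_touch(self):
        return self._exclusive

    # Handlers the associated software element could override per touch phase.
    def touches_began(self, touches):
        pass

    def touches_moved(self, touches):
        pass

    def touches_ended(self, touches):
        pass

v = TouchableView()
v.set_multi_touch_enabled(True)
```

A complex software element would assert both setters as needed; a simple one would leave the defaults and never see simultaneous touches.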

Claims (26)

1. A method for processing touch events on a multi-touch device, comprising:
displaying a user interface comprising a plurality of views, each view corresponding to a respective user interface element;
executing one or more software elements, each software element associated with a particular view;
associating flags comprising a multi-touch flag and an exclusive touch flag with one or more views, wherein each flag indicates a mode of touch event processing for a particular view;
receiving one or more touches in the plurality of views; and
selectively sending one or more touch events to one or more software elements associated with one or more views on which the one or more touches are received based on a value of a multi-touch flag and a value of an exclusive touch flag, each touch event describing the received touch, wherein the multi-touch flag indicates whether a particular view has the ability to receive multiple simultaneous touches and the exclusive touch flag indicates whether the particular view allows other views to receive touch events while the particular view is receiving touch events;
wherein, in accordance with a determination that the multi-touch flag associated with a particular view is asserted, the particular view associated with the multi-touch flag is capable of receiving multiple simultaneous touches within the particular view; and
in accordance with a determination that the exclusive touch flag associated with a particular view is not asserted, the particular view associated with the exclusive touch flag allows views other than the particular view to receive touch events while the particular view is receiving touch events.
2. The method of claim 1, further comprising: preventing any view that is not associated with a multi-touch flag from receiving more than one touch within that view.
3. The method of claim 1, wherein the one or more touch events are selectively received based on values of the multi-touch flag and exclusive touch flag.
4. An apparatus for processing touch events on a multi-touch device, comprising:
means for displaying a user interface comprising a plurality of views, each view corresponding to a respective user interface element;
means for executing one or more software elements, each software element being associated with a particular view;
means for associating flags including a multi-touch flag and an exclusive touch flag with one or more views, wherein each flag indicates a mode of touch event processing for a particular view;
means for receiving one or more touches in the plurality of views; and
means for selectively sending one or more touch events to one or more software elements associated with one or more views on which the one or more touches are received based on a value of a multi-touch flag and a value of an exclusive touch flag, each touch event describing the received touch, wherein the multi-touch flag indicates whether a particular view has the ability to receive multiple concurrent touches and the exclusive touch flag indicates whether the particular view allows other views to receive touch events while the particular view is receiving touch events;
wherein, in accordance with a determination that the multi-touch flag associated with a particular view is asserted, the particular view associated with the multi-touch flag is capable of receiving multiple simultaneous touches within the particular view; and
in accordance with a determination that the exclusive touch flag associated with the particular view is not asserted, the particular view associated with the exclusive touch flag allows views other than the particular view to receive touch events while the particular view is receiving touch events.
5. The apparatus of claim 4, further comprising: means for preventing any view not associated with a multi-touch marker from receiving more than one touch within the view.
6. The apparatus of claim 4, wherein the one or more touch events are selectively received based on values of the multi-touch flag and exclusive touch flag.
7. A method for processing touch events on a multi-touch device, comprising:
displaying a user interface comprising a plurality of views, each view corresponding to a respective user interface element;
executing one or more software elements, each software element associated with a particular view;
associating flags comprising a multi-touch flag and an exclusive touch flag with one or more views, wherein each flag indicates a mode of touch event processing for a particular view;
receiving two or more touches in at least one view and another view of the plurality of views; and
selectively sending one or more touch events to one or more software elements associated with the view on which the two or more touches are received, each touch event describing the received touch, based on a value of a multi-touch flag and a value of an exclusive touch flag, wherein the multi-touch flag indicates whether a particular view has the ability to receive multiple concurrent touches and the exclusive touch flag indicates whether the particular view allows other views to receive touch events while it is receiving touch events;
wherein, in accordance with a determination that the multi-touch flag associated with a particular view is asserted, the particular view associated with the multi-touch flag is capable of receiving multiple simultaneous touches within the particular view; and
in accordance with a determination that the exclusive touch flag associated with a particular view is not asserted, the particular view associated with the exclusive touch flag allows views other than the particular view to receive touch events while the particular view is receiving touch events.
8. The method of claim 7, further comprising:
receiving a touch event located in a first view; and
in accordance with a determination that the multi-touch flag associated with the first view is asserted, allowing other touch events that occur concurrently with the touch event received on the first view to be sent to software elements associated with other views.
9. The method of claim 7, wherein the multi-touch flag associated with the particular view, when asserted, enables the particular view to receive multiple simultaneous touch events located in the particular view, and when not asserted, prevents more than one touch event within the particular view from being sent to the particular view.
10. The method of claim 7, wherein the exclusive touch flag associated with the particular view, when asserted, prevents views other than the particular view from receiving touch events while the particular view is receiving touch events.
11. An apparatus for processing touch events on a multi-touch device, comprising:
means for displaying a user interface comprising a plurality of views, each view corresponding to a respective user interface element;
means for executing one or more software elements, each software element being associated with a particular view;
means for associating flags including a multi-touch flag and an exclusive touch flag with one or more views, wherein each flag indicates a mode of touch event processing for a particular view;
means for receiving two or more touches in at least one view and another view of the plurality of views; and
means for selectively sending one or more touch events to one or more software elements associated with the view on which the two or more touches are received based on a value of a multi-touch flag and a value of an exclusive touch flag, each touch event describing the received touch, wherein the multi-touch flag indicates whether a particular view has the ability to receive multiple simultaneous touches and the exclusive touch flag indicates whether the particular view allows other views to receive touch events while it is receiving touch events;
wherein, in accordance with a determination that the multi-touch flag associated with a particular view is asserted, the particular view associated with the multi-touch flag is capable of receiving multiple simultaneous touches within the particular view; and
in accordance with a determination that the exclusive touch flag associated with the particular view is not asserted, the particular view associated with the exclusive touch flag allows views other than the particular view to receive touch events while the particular view is receiving touch events.
12. The apparatus of claim 11, further comprising:
means for receiving a touch event located in a first view; and
means, enabled in accordance with a determination that the multi-touch flag associated with the first view is asserted, for allowing other touch events that occur concurrently with the touch event received on the first view to be sent to software elements associated with other views.
13. The apparatus of claim 11, wherein the multi-touch flag associated with the particular view, when asserted, enables the particular view to receive multiple simultaneous touch events located in the particular view, and when not asserted, prevents more than one touch event within the particular view from being sent to the particular view.
14. The apparatus of claim 11, wherein the exclusive touch flag associated with the particular view, when asserted, prevents views other than the particular view from receiving touch events while the particular view is receiving touch events.
15. A method for processing touch events on a multi-touch device, comprising:
displaying one or more views;
executing one or more software units, each software unit associated with a first view of the one or more views;
associating a multi-touch flag with the first view, wherein the multi-touch flag indicates whether the first view has the ability to receive multiple simultaneous touches;
receiving two or more touches on a first view; and
selectively transmitting one or more touch events based on a value of the multi-touch flag associated with the first view, each touch event describing a respective touch of the two or more touches, the one or more touch events being transmitted to at least one of the one or more software elements associated with the first view on which the respective touch was received.
16. The method of claim 15, comprising:
determining whether the multi-touch flag associated with the first view indicates that the first view is a multi-touch view.
17. The method of claim 16, comprising:
in accordance with a determination that the first view is a multi-touch view, sending two or more touch events corresponding to the two or more touches to the at least one of the one or more software elements associated with the first view; and
in accordance with a determination that the first view is not a multi-touch view, sending a first touch event of the one or more touch events to the at least one of the one or more software elements associated with the first view and ignoring sending the remaining touch events of the one or more touch events to the at least one of the one or more software elements associated with the first view.
18. An apparatus for processing touch events on a multi-touch device, comprising:
means for displaying one or more views;
means for executing one or more software units, each software unit associated with a first view of the one or more views;
means for associating a multi-touch flag with the first view, wherein the multi-touch flag indicates whether the first view has the ability to receive multiple simultaneous touches;
means for receiving two or more touches on a first view; and
means for selectively transmitting one or more touch events based on a value of the multi-touch flag associated with the first view, each touch event describing a respective touch of the two or more touches, the one or more touch events being transmitted to at least one of the one or more software elements associated with the first view on which the respective touch was received.
19. The apparatus of claim 18, comprising: means for determining whether the multi-touch flag associated with the first view indicates that the first view is a multi-touch view.
20. The apparatus of claim 19, comprising:
means, enabled in accordance with a determination that the first view is a multi-touch view, for sending two or more touch events corresponding to the two or more touches to the at least one of the one or more software elements associated with the first view; and
means, in accordance with a determination that the first view is not a multi-touch view, for sending a first one of the one or more touch events to the at least one of the one or more software elements associated with the first view and ignoring sending remaining ones of the one or more touch events to the at least one of the one or more software elements associated with the first view.
21. A method for processing touch events on a multi-touch device, comprising:
displaying views, wherein a first view of the views is associated with one or more software elements and a second view of the views, different from the first view, is associated with one or more software elements;
associating a first exclusive touch flag with the first view;
receiving one or more touches on a first view; and
while receiving the one or more touches on the first view, receiving one or more touches on a second view, wherein:
when the value of the first exclusive touch flag indicates that the first view is not an exclusive view,
when the value of the exclusive touch flag associated with the second view indicates that the second view is not an exclusive view, one or more touch events describing the one or more touches on the second view are sent to a software element associated with the second view, and
when the value of the exclusive touch flag associated with the second view indicates that the second view is an exclusive view, the one or more touches received on the second view are ignored; and
when the value of the first exclusive touch flag indicates that the first view is an exclusive view, the one or more touches received on the second view are ignored or blocked and are not sent to any software element.
22. The method of claim 21, comprising:
sending one or more touch events to at least one of the one or more software elements associated with the first view, each touch event describing a respective touch of the one or more touches on the first view.
23. The method of claim 21, comprising:
determining whether the exclusive touch flag associated with the first view indicates that the first view is an exclusive touch view.
24. An apparatus for processing touch events on a multi-touch device, comprising:
means for displaying views, wherein a first one of the views is associated with one or more software elements and a second one of the views, different from the first view, is associated with one or more software elements;
means for associating a first exclusive touch flag with a first view;
means for receiving one or more touches on a first view;
means for receiving, while the one or more touches on the first view are being received, one or more touches on a second view, wherein:
when the value of the first exclusive touch flag indicates that the first view is not an exclusive view,
when the value of the exclusive touch flag associated with the second view indicates that the second view is not an exclusive view, one or more touch events describing the one or more touches on the second view are sent to a software element associated with the second view, and
when the value of the exclusive touch flag associated with the second view indicates that the second view is an exclusive view, the one or more touches received on the second view are ignored; and
when the value of the first exclusive touch flag indicates that the first view is an exclusive view, the one or more touches received on the second view are ignored or blocked and are not sent to any software element.
25. The apparatus of claim 24, comprising:
means for sending one or more touch events to at least one of the one or more software elements associated with the first view, each touch event describing a respective touch of the one or more touches on the first view.
26. The apparatus of claim 24, comprising:
means for determining whether an exclusive touch flag associated with the first view indicates that the first view is an exclusive touch view.
HK12105027.2A 2008-03-04 2012-05-23 Touch event model HK1164498B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/042,318 US8645827B2 (en) 2008-03-04 2008-03-04 Touch event model
US12/042,318 2008-03-04

Publications (2)

Publication Number Publication Date
HK1164498A1 HK1164498A1 (en) 2012-09-21
HK1164498B true HK1164498B (en) 2015-07-17
