HK1180415A - Mode sensitive processing of touch data - Google Patents
- Publication number: HK1180415A
- Application number: HK13107610.0A
- Authority
- HK
- Hong Kong
Description
The present application is a divisional application of an invention patent application having an application date of June 13, 2008, an application number of 200810125593.9, and the invention name of "Mode sensitive processing of touch data".
Technical Field
The present invention relates to user interfaces, and more particularly, to user interfaces for devices having multi-touch displays.
Background
The mouse is a well-known and relatively simple user interface tool used with many computing devices. The input provided by a mouse is relatively simple: the position of the mouse and the state of its various buttons or wheels. Many existing touch screen devices provide functionality similar to that of a mouse by allowing a user to press a stylus or finger on the screen to designate a single specific location.
Existing operating systems (OSes) may provide software applications running on them with various tools for facilitating user interaction through a graphical user interface and mouse or mouse-like user input. For example, an OS utility may allow a software application to define and register widgets (e.g., buttons or scroll bars). The OS utility may track when the user clicks a widget with the mouse and may send an alert to the software application. This simplifies software application development, as each application does not need to track mouse movements itself.
Recent advances in user interface technology have proposed multi-touch panels. An exemplary multi-touch panel is described in U.S. patent application No. 11/649,998 entitled "Proximity and Multi-Touch Sensor Detection and Demodulation," filed on January 3, 2007 (which is incorporated herein by reference in its entirety).
One of the advantages of a multi-touch panel is that it detects multiple touch events at multiple locations on the panel simultaneously. Thus, a multi-touch panel can provide not only a single interaction location (as with many existing touch panels), but also a map of all portions of the panel that are currently being touched. This makes it possible to provide a much richer user interaction than previous input devices.
However, the multi-touch panel also requires much more data to be processed by its various applications. In particular, applications that utilize a multi-touch panel may need to process an entire map that specifies the currently touched location, rather than a single mouse click location. This can result in much higher processing requirements for running applications on a multi-touch enabled device.
Disclosure of Invention
The present invention relates to a multi-touch enabled device that contains a hardware or software utility layer that can perform application-aware processing on touch data. In particular, various applications executing on the device may send to the utility layer definitions of the types of touch data they need from the multi-touch enabled display. The utility layer may then process the incoming touch data in association with these definitions and send the resulting data back to the application in the format requested by the application. Thereby, the computational load related to the processing of the touch data can be reduced. Also, in some cases, applications may obtain more accurate data than is provided in existing systems.
Applications executing on a multi-touch enabled device may define the types of touch data that they require in terms of control instances. The control instances may define various ways in which a user may communicate with or control applications running on the multi-touch enabled device. Examples of controls may be, for example, buttons, sliders, knobs, navigation pads, and the like. Each control instance, along with its associated control type, may define the type of results needed for that control instance and how those results are to be computed.
Thus, the application can pass one or more control instances to the utility layer, which can then process the touch data per control instance and provide the results computed from the control instances to the application. Thus, for example, an application may receive a simple indication of whether a button is touched or whether and how far a slider is moved without having to process geometric touch data to obtain this information.
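The flow described above can be sketched as follows. This is a minimal illustrative sketch, not the patent's actual API; all names (MTLParser, ButtonInstance, register, process) are hypothetical.

```python
class ButtonInstance:
    """A hypothetical control instance: a rectangular button at a given position."""
    def __init__(self, ctrl_id, x, y, width, height):
        self.ctrl_id = ctrl_id
        self.x, self.y = x, y
        self.width, self.height = width, height

    def hit(self, touch_points):
        # The result data for a button is a simple binary: touched or not.
        return any(self.x <= tx < self.x + self.width and
                   self.y <= ty < self.y + self.height
                   for tx, ty in touch_points)

class MTLParser:
    """Sketch of the utility layer: stores control instances, applies touch data."""
    def __init__(self):
        self.instances = []

    def register(self, instance):
        self.instances.append(instance)

    def process(self, touch_points):
        # Return per-instance result data instead of raw touch geometry.
        return {inst.ctrl_id: inst.hit(touch_points) for inst in self.instances}

parser = MTLParser()
parser.register(ButtonInstance("play", x=10, y=10, width=40, height=20))
results = parser.process([(25, 15), (200, 200)])  # two simultaneous touches
```

The application receives only `{"play": True}` rather than the full touch map, which is the data reduction the disclosure describes.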
Drawings
FIG. 1 is a schematic diagram of an exemplary multi-touch enabled device according to one embodiment of this disclosure.
FIG. 2 is a flow diagram illustrating an exemplary method of operation of the application and MTL parser layers according to one embodiment of the invention.
FIG. 3 is a diagram illustrating various exemplary control instances of different control types displayed on a screen according to one embodiment of the invention.
FIG. 4 is a diagram illustrating the transparency and/or opacity of an exemplary control according to one embodiment of the present invention.
FIG. 5 illustrates the processing of touch data for an exemplary control of an exemplary multi-DOF control type according to one embodiment of the invention.
FIG. 6 illustrates later processing of the touch data of FIG. 5 for an exemplary incremental control according to one embodiment of the invention.
FIG. 7 is a schematic diagram illustrating an exemplary incremental change in which the touch area may move and a new contact patch may appear, according to one embodiment of the invention.
FIG. 8 is a schematic diagram of an exemplary multi-touch enabled device according to one embodiment of this disclosure.
Detailed Description
In the following description of the preferred embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the preferred embodiments of the present invention.
This relates to the processing of touch data at the lower layers of a multi-touch enabled device (e.g., in the OS) to form relatively simple touch events, such that processing and communication bandwidth requirements at the application layer may be reduced.
FIG. 1 is a schematic diagram of an exemplary multi-touch enabled device according to an embodiment of the present invention. More specifically, FIG. 1 is a layer diagram of the touch data processing modules of a multi-touch device. The lowest layer may be the physical multi-touch sensor 100. The physical sensor may be, for example, a multi-touch panel that senses touch events based on measurements of mutual capacitance (e.g., the multi-touch panel of U.S. patent application No. 11/649,998 discussed above). The multi-touch panel can be stacked onto the display or even integrated into the display so that a user can interact with the device by touching the display. U.S. patent application No. 11/649,998 entitled "Proximity and Multi-Touch Sensor Detection and Demodulation," filed on January 3, 2007 (which is incorporated herein by reference in its entirety), teaches combining a multi-touch panel with a display. The physical sensor may also include circuitry for processing and/or digitizing data obtained by the multi-touch panel. In some embodiments, the physical sensor may be configured to sense whether certain predefined touch pixels are being touched. In other embodiments, the physical sensor may also sense the pressure or intensity with which the pixels are touched.
The error removal and activity detection module 101 may receive data from the physical sensors and perform various error removal operations thereon. Error removal may include removing data that is not generally caused by an intended touch event. In addition, module 101 may also perform activity detection. Thus, it can detect whether any touch activity is occurring and, if this is not the case, remove incoming touch data (i.e., not pass to the next layer). Thus, by avoiding unnecessary processing of the touch data, power can be saved. Layers 100 and 101 may be portions of a hardware layer.
Layers 102 and 106 may be portions of a hardware abstraction layer (hardware abstraction layer). A hardware abstraction layer may be provided to give higher layers more useful multi-touch data. Layers 102 and 106 may be hardware or software. The multi-touch language processing layer 102 may be used to process raw data representing sensed voltages (which themselves represent mutual capacitances at each touch pixel) into processed touch data. The processed touch data may be based on coordinates of the touch pixels and may include a binary value indicating whether the pixel is being touched. In other embodiments, the processed touch data may include other or additional data, such as values that indicate, for individual pixels, the amount of force used to touch the pixel. The processed touch data may be viewed as an image, where each pixel of the image may indicate whether the corresponding pixel is being touched (or how strongly it is being touched).
The display graphics/input surface coordinate converter layer 106 may be used to convert the processed touch data from touch panel coordinates to display coordinates. For practical reasons, the smallest area (e.g., a touch pixel) in which a touch event can be sensed may be larger than a display pixel. In general, the touch resolution does not need to be as high as the display resolution due to the size of a human finger. However, in order to associate a touch event with an element (e.g., button, etc.) displayed on the screen, it is useful to base the touch data on the same coordinate system as the display data. For this reason, a display graphics/input surface coordinate converter may be used to convert touch data to display coordinates. The display graphic/input surface coordinate converter may transmit the converted touch data to an MTL parser (parser) 103. The data received by the MTL parser may be raster data. In other words, it may include one or more arrays of touch values associated with each touch pixel.
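The touch-to-display conversion performed by this layer can be sketched as a simple rescaling between the (coarser) touch grid and the display grid. The uniform-scale assumption and the function name are illustrative, not taken from the patent.

```python
def touch_to_display(row, col, touch_rows, touch_cols, disp_h, disp_w):
    """Map the center of a touch pixel to display coordinates."""
    # Each touch pixel covers (disp_h / touch_rows) x (disp_w / touch_cols)
    # display pixels; +0.5 takes the center of the touch pixel.
    y = (row + 0.5) * disp_h / touch_rows
    x = (col + 0.5) * disp_w / touch_cols
    return x, y

# A 15x10 touch grid over a 480x320 display: touch pixel (0, 0) maps near
# the display origin, scaled by the ratio of resolutions.
x, y = touch_to_display(0, 0, touch_rows=15, touch_cols=10, disp_h=480, disp_w=320)
```

After this conversion, touch data and displayed controls share one coordinate system, which is what allows the MTL parser to associate touches with on-screen elements.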
The multi-touch language (MTL) parser layer 103 can receive the display coordinate-based touch data and utilize it to provide a high-level, control-based interface to the application layer 105. The application layer 105 may include one or more applications, such as a phone book, email, a map application, a video or picture viewer, and the like.
The operation of the application and MTL parser layers is described in more detail in FIG. 2. In step 200, one or more applications may define one or more control instances and send them to the MTL parser. A control instance may be an element of user interaction, such as a button, knob, or slider. The control instance may contain a visual representation and carry touch functionality with it; that is, in order to communicate with the application that created the control instance, the user may touch the control instance appearing on the display. Thus, the user may touch a button to press it, drag a slider to move it, or place his/her fingers over a knob and rotate them to turn the knob.
The control instances created by the application may be instances of one or more data types. These types may correspond to various types of controls such as knobs, buttons, sliders, and the like. These instances may contain data identifying the size/shape of the control, the position of the control, and the like. In some embodiments, these instances may also contain data defining the visual appearance of the control. In other embodiments, the application may communicate with other modules, such as, for example, core graphics 104, in order to define the visual appearance of the control. The application may continuously define new control instances and/or move or delete older control instances by communicating with the MTL parser.
In step 202, the MTL parser can store all current control instances. In step 204, the MTL parser may receive processed touch data from an underlying layer (such as a display graphics/input surface converter). In step 206, to determine interaction with the instance (i.e., whether the button is pressed, the knob is rotated, etc.), the MTL parser can apply the received data to the stored control instance.
For example, the MTL parser can check the area defined by a button instance and check whether the processed touch data indicates any touch events within that area. In some instances, to determine how the user interacts with an instance, the MTL parser may need to apply the control instance to historical touch data as well as the current touch data. For example, for a knob, the MTL parser may need to examine the previous locations of touches on and around the knob as well as their current locations to determine whether the knob is being rotated. The MTL parser may do this by storing the historical processed touch data and processing it in step 206. Alternatively, the MTL parser can store intermediate data specific to each control type. For example, if there is a single control instance of the knob type, the MTL parser may only maintain historical touch data for the area defined by that instance, and only maintain historical data for the particular past period of time necessary to determine the knob rotation (e.g., it may only maintain one previous frame of data). Some controls that use historical data use only the data from the previous frame and thus measure incremental changes; they may therefore be referred to as delta controls.
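A delta control of the kind described, a knob that retains only the previous frame's touch position, might be sketched as follows. The single-touch geometry and the angle-based update are illustrative assumptions, not the patent's algorithm.

```python
import math

class KnobInstance:
    """Hypothetical delta control: reports incremental rotation per frame."""
    def __init__(self, cx, cy):
        self.cx, self.cy = cx, cy   # knob center
        self.prev_angle = None      # only one previous frame is retained

    def update(self, tx, ty):
        """Return the incremental rotation (degrees) since the last frame."""
        angle = math.degrees(math.atan2(ty - self.cy, tx - self.cx))
        delta = 0.0 if self.prev_angle is None else angle - self.prev_angle
        self.prev_angle = angle
        return delta

knob = KnobInstance(cx=0, cy=0)
knob.update(10, 0)          # first frame: finger due east, no delta yet
delta = knob.update(0, 10)  # finger moved to due north: +90 degrees
```

Because only one previous frame is kept, the parser's memory cost per knob instance stays constant regardless of how long the gesture lasts.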
In step 208, the MTL parser can send the various result data obtained in step 206 to various applications. The result data may relate to the control instance that the application sent to the parser. In particular, the result data can be related to the control type that defines the control instance. Thus, for example, a simple button type may define a binary value as its result data that indicates whether the button is being pressed. The knob control type may define an integer indicating a rotation angle of the knob as its result data.
Thus, by providing a lower-level MTL parser layer that sends concise, easy-to-use result data to an application, embodiments of the invention can greatly simplify application programming and reduce the amount of data that an application must process. Further, since the MTL parser can be aware of all control instances, it can track the areas of the display where touch data is relevant to an application (i.e., where control instances exist) as well as areas where it is not. Thus, the MTL parser can improve efficiency by not processing touch data from irrelevant regions. In addition, the MTL parser can improve the efficiency of the layers below it by instructing them not to process irrelevant data. In some embodiments, the MTL parser may even instruct the touch panel itself not to process touch data from irrelevant regions of the panel. This may save power, because the touch panel may turn off the stimulation signals (which may be necessary for touch sensing according to some embodiments) in those regions.
In previous systems, touch data is processed without knowledge of the various control elements being displayed by different applications. Thus, the parser can only process touch data into a standard format. For example, the parser may group pixels that have been touched into touch regions, fit the touch regions to ellipses (and/or other easily defined shapes), and send data defining the various ellipses or other shapes to the application. The application must then process these shapes and compare them to the controls it is displaying on the screen to determine whether and how the user is interacting with those controls. To support legacy applications and/or instances where control-type based functionality may not be optimal, some embodiments of the invention may incorporate this legacy functionality alongside the advanced control-type based functionality discussed above.
While at first glance it might seem that conventional systems allow for a higher level of accuracy because actual touch data is communicated to the application, this is not always the case. For example, in some cases, the control instance-based system of the present invention may determine the intent of the user more accurately than conventional systems. In practice, conventional systems typically compress the touch data (e.g., by converting it into an ellipse or other shape as described above) before passing it to an application. However, the compressed data may incorrectly convey the user's intent. On the other hand, according to embodiments of the present invention, the MTL parser can handle control instances of each control type differently. Thus, each control type may be predefined to most correctly interpret the user's intent. Consequently, even though embodiments of the present invention may not actually pass touch data to the application, they allow the application to more accurately interpret the user's intent.
Furthermore, those skilled in the art will recognize that embodiments of the present invention may require significantly less processing than conventional systems. In conventional systems, the parser may need to process and send all or most of the incoming touch data to the application, as it does not know what type of data the application needs. In addition, the application must again process the data received by the parser to determine how it applies to the particular controls used by the application. In embodiments of the invention, the parser knows what touch data is needed by the application and can do only those processes that are relevant to the application. In addition, the parser sends data to the application that is already relevant to the application's controls, thereby minimizing or eliminating entirely the processing that the application needs to do with incoming touch data.
FIG. 3 illustrates various control instances of different control types displayed on a screen. For example, one or more button type instances (such as button 300) may be used. A button may be used to detect whether the user presses or releases the button. Slider controls 301 and 302 may also be used. A slider can detect the user sliding his/her finger along the control. A rotary or knob control (such as knob 303) may also be used. A knob control can detect rotation of a finger pressed against the knob. The slider and knob controls may be of the type that requires historical data. Thus, the MTL parser can compare the previous touch state of the slider or knob to the current touch state to determine whether a slide and/or rotation is occurring. In addition, a touchpad control 304 may be used. The touchpad control is intended to emulate a notebook computer touchpad and can be used to detect various touch events on the touchpad. Touchpad controls may also provide functionality beyond that of a common notebook computer touchpad by detecting more complex events, such as the spreading of fingers and the like. Thus, for example, the touchpad control can be a navigation surface that detects lateral movement of a hand, pressure of the hand against the surface, expansion and contraction of the hand, rotation of the hand on the surface, and the number of contact patches on the surface.
FIG. 4 illustrates transparency and/or opacity options for a control according to an embodiment of the present invention. According to some embodiments, various controls may be superimposed or defined such that they may cover the same area. Thus, for example, controls 400 and 401 may cover intersection region 403. Some embodiments allow the controls to be defined as transparent or opaque. In the case of a transparent control, both controls detect touch events in a common area in the same manner as if there were no overlay. Thus, if controls 400 and 401 are both buttons and a finger is pressed in area 402, both controls can detect a touch by the finger.
According to some embodiments, the controls may be opaque. When opacity is used, the various control instances may contain a hierarchy parameter. An opaque control instance may obscure any control instances that are below it and that intersect at least a portion of its area. In some embodiments, the instances with the lowest level parameters may be considered the highest level (i.e., all other instances are below them), but other configurations are possible. Thus, assuming instance 400 is opaque and at a higher level than instance 401, instance 400 can obscure instance 401 (i.e., prevent registration of touch events at instance 401). In some embodiments, obscuring will only occur in the area where an opaque instance overlaps an instance below it. Thus, for the example of FIG. 4, obscuring can occur in the overlap region 403 (thereby obscuring the touch 402), but not in the region 404, which does not overlap.
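The level-based shadowing of overlapping controls can be sketched as follows. The dictionary layout and the convention that the lowest level parameter is topmost are assumptions taken for illustration.

```python
def deliver_touch(point, instances):
    """Return the ctrl ids that register the touch, topmost instance first."""
    hits = []
    px, py = point
    # Sort so the lowest level parameter (the topmost instance) is checked first.
    for inst in sorted(instances, key=lambda i: i["level"]):
        x, y, w, h = inst["rect"]
        if x <= px < x + w and y <= py < y + h:
            hits.append(inst["id"])
            if inst["opaque"]:
                break   # an opaque instance obscures everything below it here
    return hits

controls = [
    {"id": "upper", "rect": (0, 0, 50, 50), "level": 0, "opaque": True},
    {"id": "lower", "rect": (25, 25, 50, 50), "level": 1, "opaque": False},
]
hits = deliver_touch((30, 30), controls)   # touch inside the overlap region
```

With `"opaque": False` on the upper control, the same touch would be delivered to both instances, matching the transparent-control behavior described for region 402.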
In some embodiments, where the strength or pressure of a touch is sensed in addition to the touch event itself, the opacity of a control may be partial. A partially opaque control may not completely obscure the controls below it, but may simply reduce the touch signal strength that the controls below it receive.
The following description discusses certain embodiments of the present invention in detail. According to these embodiments, MTL controls can use the following definitions:
- Definition of the region of the control: the "control region";
- Definition of the type of the control, including what type of output data is desired from the control: the "control type definition" (which references a control region definition);
- Definition of a control instance with an X, Y position for each control: a "control instance," or simply "control," which references a control type.
Once the control instance is defined, control output data (or result data) can be generated from any control instance currently undergoing user activity. In general, a control that is not actively used may be "quiet" and not output data.
The control output data may include:
-the number of currently active controls;
-for each active control:
CTRLID-control ID code associating output data with a control instance
Data set - a data set providing additional information about the state of the control or the incremental state of the control since the last output data report. For the special case of a button, a data set is not necessary, since the presence of a CTRLID is sufficient to indicate that the button is currently pressed.
The control instance may be an actual control such as a button. The control instance can reference a control type that can be defined prior to the definition of the control instance. For example, a control type may describe the shape and behavior of a button. Many control instances may reference the same control type (e.g., many control instances may reference a button in order to build a virtual keyboard).
The controls may overlap each other in a similar manner as overlapping graphical objects on the display. The behavior of overlapping controls can be determined using the order (defined in the control instance) and opacity parameters (defined in the control type) as follows:
for opaque controls, the first-processed controls (lowest order) extract the signal from the original image according to their region mask (mask) and subtract the used signal so that later-processed controls (higher order) are inaccessible.
For transparent controls, they extract the signal from the original image according to the region mask, but they do not subtract the used signal, so that the used signal can be reused by a later-processed control.
The analogy to overlapping objects in the display can be expanded: the opaque controls with the lowest ordinal numbers may be similar to the display objects closest to the user, which obscure visibility of display objects further away from the user. The transparent control may be similar to the transparent display object since display objects further away from the user may still be seen through the transparent control.
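The extract-and-subtract behavior of overlapping opaque and transparent controls might be sketched on a toy image as follows; the list-of-lists array representation is illustrative.

```python
def extract(image, mask, opaque):
    """Extract the masked signal; subtract it from the image if opaque."""
    rows, cols = len(image), len(image[0])
    signal = [[image[p][q] * mask[p][q] for q in range(cols)]
              for p in range(rows)]
    if opaque:
        # An opaque control consumes its signal so later-processed
        # (higher-order) controls cannot see it.
        for p in range(rows):
            for q in range(cols):
                image[p][q] -= signal[p][q]
    return signal

image = [[100, 0],
         [100, 50]]
mask = [[1, 0],
        [1, 0]]                               # region mask of the first control
first = extract(image, mask, opaque=True)     # lowest order: sees the signal
second = extract(image, mask, opaque=True)    # same region, processed later
```

After the first opaque extraction the overlapping pixels are zeroed, so the second control sees nothing; with `opaque=False` on the first call, both would see the full signal.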
Table 1 (control instance parameters)
The control region may be a reusable definition of a shape that may be referenced by a control type. More than one control type may reference the same control region. The coordinates may be relative to a local origin that may be specified in the control instance. The local origin may be defined as the lower left corner of the bounding box.
The rectangular area definition may define a simple rectangular area. All regions within the rectangle may be active for input of the type specified by the control type.
Table 2 (rectangular region)
Binary mask region definitions may define arbitrary shapes within a simple rectangular bounding box. For input of the type specified by the control type, all regions associated with a "1" may be active. For an input, all regions associated with a "0" may be inactive.
Table 3 (binary mask)
The graded mask region definition may define an arbitrary shape with graded sensitivity within a simple rectangular bounding box. For each element, a scalar value may determine the relative sensitivity at that point. For an input, all pixels with "0" may be inactive. A pixel with "255" may be fully sensitive. A pixel with "127" may be 50% sensitive.
Table 4 (graded mask)
A control type can reference a region definition and add functional requirements to form a reusable control type that can be used to invoke the functionality of multiple control instances. An example of a specific control type definition may be a description of a keyboard button. There may be two main families of control types: (i) button control types and (ii) multi-DOF control types.
Multi-DOF control types are control types that allow multiple degrees of freedom. They may provide a configurable set of output parameters that can be classified into group outputs and multi-point outputs:
the group output may be a parameter that utilizes all contact patches found on the control area. These outputs may be optimized for smooth pressure tracking, smooth XY tracking, smooth rotation tracking, smooth R tracking, and estimated number of blobs (blobs) tracking. Where contact blocks merge and separate, they may perform better than multi-point output.
The multi-point output may be a list of contact patches with attributes for each contact patch. When the contact patches are separate, it may work well, but where contact patches merge and separate it may not perform as well as the group outputs.
In addition to the group/multi-point distinction, the outputs can be broadly categorized as instantaneous or incremental. The instantaneous outputs need not rely on previous image data. Some instantaneous outputs may change abruptly when new contact patches are added, removed, merged into, or separated from a common area. The incremental outputs may use the current image as well as the last image. The goal of the incremental outputs may be to continue providing continuous, smooth, meaningful data that truly represents the user's intent, regardless of variations due to the addition, removal, merging, or separation of contact patches. The various outputs and associated computations may be enabled/disabled via Class parameters defined in the control type definition and are shown in the following table.
TABLE 5 (types of output)
The following table may illustrate an exemplary control application and associated recommendations for control types and example parameter mappings:
table 6 (exemplary control application)
The button control type may be a relatively simple control intended for an on-screen keyboard.
Table 7 (button control type)
For the special case of a button, a data set is not necessary, since the presence of a CTRLID in the output report list is sufficient to indicate that the button is currently pressed.
Some exemplary algorithms for processing touch data for button control types are described below. The image data in the region after error removal may be represented by an unsigned byte array D(p, q), where p may represent a row number from 0 (bottom) to M-1 (top) and q may represent a column number from 0 (left) to N-1 (right). When D(p, q) = 0, no input signal is sensed at the pixel. When D(p, q) = 255, the maximum signal is sensed at the pixel. For a rectangular mask:
Z(p,q)=D(p,q)
binary mask data may be stored in an array M (i, j) of bits, where i may represent a row number from 0 (bottom) to YLen-1 (top) and j may represent a column number from 0 (left) to Xlen-1 (right). Thus, for a binary mask:
Z(p,q)=D(p,q)*M(p,q)
Graded mask data may be stored in an unsigned byte array G(i, j), where i may represent a row number from 0 (bottom) to YLen-1 (top) and j may represent a column number from 0 (left) to XLen-1 (right). Thus, for a graded mask:
Z(p,q)=D(p,q)*G(p,q)/255
Ztot can be calculated as the sum of Z(p, q) over the region:
Ztot=Σp Σq Z(p,q)
the button threshold determination may be performed as follows:
if(BUTTON==OFF)
if Ztot>Zon then BUTTON=ON
if(BUTTON==ON)
if Ztot<Zoff then BUTTON=OFF
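Putting the pieces together, the button algorithm (binary-masked Ztot followed by Zon/Zoff hysteresis) can be sketched as follows; the threshold values are illustrative, not taken from the patent.

```python
def masked_total(D, M):
    """Ztot = sum of D(p, q) * M(p, q) over the region."""
    return sum(d * m
               for row_d, row_m in zip(D, M)
               for d, m in zip(row_d, row_m))

def button_state(prev_state, ztot, zon=400, zoff=200):
    """Hysteresis: turn ON above Zon, OFF below Zoff, otherwise keep state."""
    if prev_state == "OFF" and ztot > zon:
        return "ON"
    if prev_state == "ON" and ztot < zoff:
        return "OFF"
    return prev_state

D = [[0, 120, 0],
     [0, 255, 200]]          # error-removed image data in the region
M = [[0, 1, 0],
     [0, 1, 1]]              # binary mask: only these pixels are active
ztot = masked_total(D, M)    # 120 + 255 + 200 = 575
state = button_state("OFF", ztot)
```

The gap between Zoff and Zon keeps the button from chattering when the signal hovers near a single threshold, which is the usual reason for this two-threshold scheme.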
The multi-DOF control types can be defined as follows:
table 8 (Multi DOF control type)
Some algorithms for processing touch data for multi-DOF control types are described below. The image data in the region after error removal may be represented by an array D (p, q), where p may represent a row number from 0 (bottom) to M-1 (top) and q may represent a column number from 0 (left) to N-1 (right).
The following discusses algorithms for the Class0 and Class1 output parameters Ztot, XG, YG, and AG. For a rectangular mask:
Z(p,q)=D(p,q)
binary mask data may be stored in an array M (i, j) of bits, where i may represent a row number from 0 (bottom) to YLen-1 (top) and j may represent a column number from 0 (left) to Xlen-1 (right). Thus, for a binary mask:
Z(p,q)=D(p,q)*M(p,q)
The graded mask data may be stored in an unsigned byte array G(i, j), where i may represent a row number from 0 (bottom) to YLen-1 (top) and j may represent a column number from 0 (left) to XLen-1 (right). Thus, for a graded mask:
Z(p,q)=D(p,q)*G(p,q)/255
Ztot can be calculated as follows:
Ztot=Σp Σq Z(p,q)
the control activation threshold determination may be performed according to the following code:
if(CONTROL-ACTIVE==OFF)
if Ztot>Zon then CONTROL-ACTIVE=ON
if(CONTROL-ACTIVE==ON)
if Ztot<Zoff then CONTROL-ACTIVE=OFF
AG can be calculated as follows:
Peak=max(Z(p,q))
Thresh=0.50
HiFlag(p,q)=if(Z(p,q)>Thresh*Peak,1,0)
AG=Σp Σq HiFlag(p,q)
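The AG computation above can be sketched as follows, under the assumption that AG counts the pixels whose signal exceeds a fixed fraction of the peak signal.

```python
def group_area(Z, thresh=0.50):
    """Count pixels above thresh * peak, per the HiFlag scheme above."""
    peak = max(v for row in Z for v in row)
    hi_flag = [[1 if v > thresh * peak else 0 for v in row] for row in Z]
    return sum(sum(row) for row in hi_flag)

Z = [[0, 130, 0],
     [40, 255, 128]]
ag = group_area(Z)   # peak = 255, cutoff = 127.5 -> pixels 130, 255, 128 count
```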
FIG. 5 illustrates processing of touch data for an exemplary control of an exemplary multi-DOF control type. For example, FIG. 5 may relate to a knob control type. The region 500 may be a region sensed as being touched first. The area 500 may be obtained by detecting a touch to every single pixel within the area or detecting a touch to certain pixels of the area and connecting these pixels to form a continuous area.
Parameters XG and YG may be the X and Y coordinates of the centroid 501 of the touch area 500. The centroid may be defined as the point having the average X and Y coordinates of the area 500. RG may be defined as the average radius 502 of the area 500. ThetaG can be defined as the angle 503 of the first principal axis of inertia 504 with respect to the X-axis of the device. The second principal axis of inertia can be perpendicular to the first principal axis of inertia.
The centroid calculation for XG and YG can be as follows:
XG=Σp Σq x(q)*Z(p,q)/Ztot
YG=Σp Σq y(p)*Z(p,q)/Ztot
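The weighted-centroid calculation can be sketched as follows; integer pixel coordinates are assumed for simplicity.

```python
def centroid(Z):
    """Signal-weighted centroid (XG, YG) of the region image Z."""
    ztot = sum(v for row in Z for v in row)
    xg = sum(q * Z[p][q] for p in range(len(Z)) for q in range(len(Z[0]))) / ztot
    yg = sum(p * Z[p][q] for p in range(len(Z)) for q in range(len(Z[0]))) / ztot
    return xg, yg

Z = [[0, 0, 0],
     [0, 100, 100],
     [0, 0, 0]]
xg, yg = centroid(Z)   # centroid midway between columns 1 and 2, on row 1
```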
the Class2 output parameter RG can be derived as shown below. It should be noted that the Class1 parameter depends on the Class0 parameter. The moment of inertia about the centroid x-axis of the stack 500 may be:
The moment of inertia about the centroid y-axis of the group may be:
The polar moment of inertia about the centroid of the group may be:
IpG=IxG+IyG
The polar radius of gyration about the centroid may be:
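The moment equations are standard second moments of the signal about the centroid axes, with the polar moment as their sum (IpG = IxG + IyG) and RG as the radius of gyration sqrt(IpG/Ztot). A sketch under those usual definitions (an assumption, since the equation images are not reproduced):

```python
def gyration_radius(Z):
    """RG sketch: polar radius of gyration of the touch image about its centroid."""
    ztot = sum(sum(row) for row in Z)
    xg = sum(j * v for i, row in enumerate(Z) for j, v in enumerate(row)) / ztot
    yg = sum(i * v for i, row in enumerate(Z) for j, v in enumerate(row)) / ztot
    # Second moments of the signal about the centroid x- and y-axes
    ixg = sum((i - yg) ** 2 * v for i, row in enumerate(Z) for j, v in enumerate(row))
    iyg = sum((j - xg) ** 2 * v for i, row in enumerate(Z) for j, v in enumerate(row))
    ipg = ixg + iyg                      # polar moment: IpG = IxG + IyG
    return (ipg / ztot) ** 0.5
```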
The output parameter NBG may be an estimate of the number of "bumps" in the group. This can be determined by spatial frequency analysis, such as a discrete cosine transform algorithm. It may be the same as or different from the number of distinct contact patches.
The algorithm for calculating the Class4 output parameters ThetaG and ECG is shown below. It should be noted that the Class3 parameters depend on the Class1 parameters. The product of inertia about the centroid can be calculated as follows:
The first principal axis may be calculated as:
Note: Theta1 is in the range of -90 degrees to +90 degrees.
The second principal axis can be calculated as:
Theta2=Theta1+90
Note: Theta2 is in the range of 0 degrees to +180 degrees.
The first principal moment of inertia may be calculated as:
The second principal moment of inertia may be calculated as:
The ratio of the principal moments of inertia (1st/2nd) may be:
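A hedged sketch of the Class4 quantities: the product of inertia about the centroid, the first-principal-axis angle from the standard two-argument half-angle formula, and ECG taken here as the ratio of the principal moments (1st/2nd). The atan2 convention is an assumption — any convention keeping Theta1 within (-90, +90] degrees is consistent with the note above.

```python
import math

def principal_axis(Z):
    """ThetaG sketch (degrees) for the first principal axis of inertia, and the
    principal-moment ratio used here as the eccentricity ECG."""
    ztot = sum(sum(row) for row in Z)
    xg = sum(j * v for i, row in enumerate(Z) for j, v in enumerate(row)) / ztot
    yg = sum(i * v for i, row in enumerate(Z) for j, v in enumerate(row)) / ztot
    ixg = sum((i - yg) ** 2 * v for i, row in enumerate(Z) for j, v in enumerate(row))
    iyg = sum((j - xg) ** 2 * v for i, row in enumerate(Z) for j, v in enumerate(row))
    ixyg = sum((j - xg) * (i - yg) * v
               for i, row in enumerate(Z) for j, v in enumerate(row))
    # Principal axis orientation: tan(2*theta) = 2*Ixy / (Iy - Ix)
    theta1 = 0.5 * math.degrees(math.atan2(2 * ixyg, iyg - ixg))
    # Principal moments from the 2x2 inertia tensor's eigenvalues
    i1 = (ixg + iyg) / 2 + math.hypot((iyg - ixg) / 2, ixyg)
    i2 = (ixg + iyg) / 2 - math.hypot((iyg - ixg) / 2, ixyg)
    ecg = i1 / i2 if i2 else float("inf")
    return theta1, ecg
```

For an elongated touch (e.g. a finger rolled along the knob), ThetaG tracks the elongation direction and ECG grows with the elongation.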
Class5, Class6, and Class7 output parameters XGinc, YGinc, ZTotinc, AGinc, RGinc, and ThetaGinc may be incremental parameters. In other words, they are intended for measuring incremental changes in the touch data. For example, FIG. 6 illustrates the use of incremental parameters. In particular, FIG. 6 illustrates later processing of the touch data of FIG. 5 for an exemplary incremental control. The initial touch area 500 of FIG. 5 may later become touch area 600. The new touch area 600 may have a new centroid 601 (with new coordinates XG and YG), a new mean radius 602 (RG), and a new angle 603 of the first principal axis of inertia (ThetaG).
When the feature set (number of contact patches/approximate location of contact patches) does not change, the incremental parameters XGinc, YGinc, RGinc, ThetaGinc may be calculated as follows:
XGinc=XG-XG_old
YGinc=YG-YG_old
RGinc=RG-RG_old
ThetaGinc=ThetaG-ThetaG_old
where XG_old, YG_old, RG_old, and ThetaG_old may be the output parameters from the previous frame (see FIG. 5). Note that ThetaGinc may be adjusted to account for sudden changes due to overflow/underflow of ThetaG beyond +90 degrees or -90 degrees.
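The incremental update, including the +/-90 degree wrap adjustment for ThetaGinc mentioned in the note, can be sketched as follows. The 180-degree period reflects that the principal axis is a line, not a direction; the function name is illustrative.

```python
def incremental(XG, YG, RG, ThetaG, XG_old, YG_old, RG_old, ThetaG_old):
    """Frame-to-frame increments. ThetaGinc is wrapped into (-90, +90] so a
    jump of ThetaG across the +/-90 degree boundary reads as a small rotation
    rather than a ~180 degree one."""
    dtheta = ThetaG - ThetaG_old
    while dtheta > 90:
        dtheta -= 180
    while dtheta <= -90:
        dtheta += 180
    return XG - XG_old, YG - YG_old, RG - RG_old, dtheta
```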
FIG. 7 illustrates the calculation of incremental changes when the touch area moves and a new contact patch appears. The calculation of the incremental parameters may be changed whenever the feature set (e.g., the number and relative positioning of the contact patches) changes due to the addition, subtraction, merging, or separation of contact patches. FIG. 7 shows an example of how the incremental parameters are calculated in the first frame after the appearance of a new contact patch. In FIG. 7, the position and angle of image 500 have changed to form image 700 with centroid 701, and a new contact patch 702 has appeared. Assuming that XG_old, YG_old, RG_old, and ThetaG_old are outputs from the last image 500 with the old feature set, and XG2, YG2, RG2, and ThetaG2 are outputs calculated from the new image based on the old feature set (i.e., ignoring new patch 702), the incremental values XGinc, YGinc, RGinc, and ThetaGinc are obtained as follows:
XGinc=XG2-XG_old
YGinc=YG2-YG_old
RGinc=RG2-RG_old
ThetaGinc=ThetaG2-ThetaG_old
Note that ThetaGinc should be adjusted to cope with sudden changes due to overflow/underflow of ThetaG beyond +90 degrees or -90 degrees.
This method of calculating incremental parameters provides continuity regardless of the addition, subtraction, merging, or separation of contact patches.
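The patch-addition case above can be sketched by recomputing the group parameters on a common feature set before differencing. This sketch works from per-patch summaries (ztot, x, y) rather than raw images — an illustrative simplification, shown here only for the centroid:

```python
def group_centroid(patches):
    """Group centroid from per-patch (ztot, x, y) summaries."""
    total = sum(z for z, _, _ in patches)
    xg = sum(z * x for z, x, _ in patches) / total
    yg = sum(z * y for z, _, y in patches) / total
    return xg, yg


def increments_after_patch_add(old_patches, new_patches_matched):
    """First frame after a new patch appears: compute the new-image centroid
    using only the patches present in the OLD feature set (the new patch is
    excluded by the caller), then difference against the old-image centroid."""
    xg_old, yg_old = group_centroid(old_patches)
    xg2, yg2 = group_centroid(new_patches_matched)  # old feature set, new image
    return xg2 - xg_old, yg2 - yg_old
```

The subtraction case is symmetric: the reference values are recomputed from the old image using the new (smaller) feature set.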
The incremental parameter calculation in the first image after contact patch subtraction can be done in a similar way. Thus, assuming that XG, YG, RG, and ThetaG are outputs from the new image with the new feature set, and XG3, YG3, RG3, and ThetaG3 are outputs calculated from the old image based on the new feature set (i.e., ignoring the removed patch), the incremental values are obtained as shown below:
XGinc=XG-XG3
YGinc=YG-YG3
RGinc=RG-RG3
ThetaGinc=ThetaG-ThetaG3
Note that ThetaGinc may be adjusted to account for sudden changes due to overflow/underflow of ThetaG beyond +90 degrees or -90 degrees.
The following discusses the algorithms for the Class8 and Class9 output parameters NPTS, PTID, Z, X, Y, A, R, Theta, and EC.
NPTS represents the number of contact patches in the control area. At a high level, a general method such as the following may be used:
-passing the touch data through a spatial low pass filter;
-thresholding out the noise;
-dividing into a plurality of zones; and
-limiting the respective region size to pixels above 50% of the peak value of the region.
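The segmentation step listed above can be sketched with a threshold-and-flood-fill pass. The spatial low-pass filter and the per-region 50%-of-peak trimming are omitted from this sketch, and 4-connectivity is an assumption:

```python
def segment_patches(Z, noise_floor):
    """Split a touch image into 4-connected contact patches above noise_floor.
    Returns a list of patches, each a list of (row, col) pixel coordinates."""
    rows, cols = len(Z), len(Z[0])
    seen = [[False] * cols for _ in range(rows)]
    patches = []
    for i in range(rows):
        for j in range(cols):
            if Z[i][j] > noise_floor and not seen[i][j]:
                stack, patch = [(i, j)], []
                seen[i][j] = True
                while stack:  # flood fill one connected region
                    p, q = stack.pop()
                    patch.append((p, q))
                    for dp, dq in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        r, c = p + dp, q + dq
                        if 0 <= r < rows and 0 <= c < cols \
                                and Z[r][c] > noise_floor and not seen[r][c]:
                            seen[r][c] = True
                            stack.append((r, c))
                patches.append(patch)
    return patches
```

NPTS is then simply the length of the returned list.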
After the contact patch areas are defined, the parameters of each individual contact patch may be computed with algorithms similar to those already detailed above for the group parameters:

Table 9 (calculating each contact patch parameter - Class8)

| Per-contact-patch parameter | Computed with an algorithm similar to |
|---|---|
| Z | Ztot |
| A | AG |
| X | XG |
| Y | YG |
Once the contact patch parameters are established as shown above, the PTID for each contact patch may be found by comparing the new and previous X, Y, Z, and A parameters, along with the previous time derivatives of these parameters.
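A sketch of the path-ID assignment: each new patch inherits the PTID of the nearest predicted old patch, with the prediction formed from position plus velocity, in the spirit of the time-derivative comparison described above. Greedy nearest-neighbor matching and the data layout are assumptions, not the patent's method:

```python
def assign_ptids(old, new, next_id):
    """old: dict PTID -> (x, y, vx, vy) from the previous frame;
    new: list of (x, y) centroids in the current frame.
    Each new patch takes the PTID of the closest predicted old position;
    unmatched patches get fresh IDs. Returns (list of PTIDs, next_id)."""
    predicted = {pid: (x + vx, y + vy) for pid, (x, y, vx, vy) in old.items()}
    free = dict(predicted)
    ids = []
    for x, y in new:
        if free:
            # Closest predicted old patch by squared distance
            pid = min(free, key=lambda p: (free[p][0] - x) ** 2 + (free[p][1] - y) ** 2)
            del free[pid]
        else:
            pid, next_id = next_id, next_id + 1
        ids.append(pid)
    return ids, next_id
```

Stable PTIDs let an application follow one finger across frames even as other fingers come and go.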
The algorithm for the Class9 output parameters R, Theta, and EC is shown below. The parameters of each individual contact patch may be processed with algorithms similar to those used for the group parameters already detailed above:

Table 10 (calculating each contact patch parameter - Class9)

| Per-contact-patch parameter | Computed with an algorithm similar to |
|---|---|
| R | RG |
| Theta | ThetaG |
| EC | ECG |
The embodiments discussed above make it possible to provide an SDK that simplifies developing software for multi-touch devices. In particular, the SDK may allow developers to create classes and instances of control types and embed them in applications using an intuitive graphical environment. Thus, developers who are less familiar with the details of multi-touch data processing can still exploit the benefits of multi-touch displays when developing applications.
Multi-touch enabled displays may be used in a variety of devices. Thus, embodiments of the invention include, but are not limited to, devices such as cellular telephones, portable music players, GPS devices, PDAs, portable email devices, electronic kiosks (electronic kiosks), computers, and other devices that utilize multi-touch displays.
FIG. 8 is a schematic diagram of a multi-touch enabled device according to an embodiment of the invention. Multi-touch enabled device 800 can include a panel 801 that performs display and multi-touch functions. The device may also include multi-touch processing circuitry 802 for processing multi-touch data, memory 803 for storing instructions and/or data (the memory may be RAM or another type of memory, including non-volatile memory), a processor 804, and a bus 806 for connecting the various elements. The modules shown in FIG. 1 may be implemented in panel 801, in circuitry 802, and/or by instructions stored in the memory and executed by the processor.
Although embodiments of the present invention are described herein with respect to a mutual capacitance based multi-touch panel, it should be understood that the present invention is not limited to these embodiments, but is generally applicable to all multi-touch devices and other devices that may receive similar pixel-based inputs in which different pixels may be active at the same time, such as, for example, proximity sensor (proximity sensor) devices, camera devices, and the like.
Although the present invention has been fully described in connection with the embodiments thereof with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the present invention as defined by the appended claims.
Claims (42)
1. A method for operating a multi-touch enabled device, comprising:
generating, by one or more applications executing on the multi-touch enabled device, one or more control instances;
sending a control instance from the one or more applications to a multi-touch utility;
receiving, by a multi-touch utility, touch data;
processing, by the multi-touch utility, only touch data relating to at least one control instance to obtain one or more result sets, wherein each result set is associated with a respective control instance; and
sending the one or more result sets to the one or more applications.
2. The method of claim 1, further comprising:
displaying one or more interface images on a display of a multi-touch enabled device, each interface image associated with a control instance; and
detecting a touch on one or more interface images,
wherein the touch data comprises data indicative of a touch on the one or more interface images; and
wherein the step of processing touch data comprises: the touch data is processed in a manner related to the control instance to obtain a result set.
3. The method of claim 1, wherein the one or more control instances define a format of a result set.
4. The method of claim 1, wherein the result set does not contain touch data.
5. The method of claim 1, wherein the step of processing touch data comprises: processing current touch data and processing historical touch data.
6. The method of claim 5, wherein the step of processing touch data comprises: the current touch data is compared to the historical touch data to obtain an incremental result.
7. The method of claim 5, further comprising:
determining, by the multi-touch utility based on the control instance, which received touch data is likely to be needed in the future as historical touch data;
saving received touch data that may be needed in the future; and
all other received touch data is discarded.
8. The method of claim 1, wherein:
the step of processing touch data further comprises: deriving intermediate touch data, the intermediate touch data being in a form other than a raster form;
the method further comprises the following steps:
saving the intermediate touch data; and
incremental results are obtained in the future using the intermediate touch data.
9. The method of claim 1, wherein the touch data comprises a plurality of binary values, each binary value indicating whether a particular pixel was touched.
10. The method of claim 1, wherein the touch data comprises a plurality of values, each value indicating a force or pressure of a touch to a particular pixel.
11. The method of claim 1, wherein each control instance defines a relevant area of the display, and wherein the results associated with the respective control instance are derived from touch data from the display for the relevant area of the control instance.
12. The method of claim 11, wherein:
one or more control instances contain a level parameter indicating a virtual level for each control instance;
one or more control instances are defined as opaque; and
the step of processing touch data further comprises: for each opaque instance, all touch events for the area of the instance at the lower level of the opaque instance that is covered by the area of the opaque instance are removed.
13. The method of claim 12, wherein one or more control instances are defined as transparent and instances covered by transparent instances are unaffected in processing touch data.
14. The method of claim 1, wherein each control instance is associated with a control type of the one or more control types.
15. The method of claim 1, wherein one of the instances of controls is an instance of a button control, and the result associated with the instance of the button control indicates whether a button appearing on the display and associated with the instance of the button control was pressed.
16. The method of claim 1, wherein one of the instances of controls is an instance of a slider control, and the results associated with the instance of the slider control indicate incremental changes in the position of one or more touches along a slider that appears on the display and is associated with the instance of the slider control.
17. The method of claim 1, wherein one of the instances of controls is an instance of a knob control, and the results associated with the instance of knob control indicate incremental changes in rotational orientation of one or more touches along a knob appearing on the display and associated with the instance of knob control.
18. The method of claim 1, wherein one of the instances of controls is an instance of a navigation surface control and the results associated with the instance of the navigation surface control indicate incremental changes in relative positions of several fingers along a navigation surface that appears on the display and is associated with the instance of the navigation surface control.
19. A method for operating a multi-touch enabled device, the method comprising:
receiving one or more control instances from one or more applications;
receiving touch data;
processing only touch data relating to at least one control instance to obtain one or more result sets, wherein each result set is associated with a respective control instance; and
sending the one or more result sets to the one or more applications.
20. A method for operating a multi-touch enabled device, the method comprising:
sending, by one or more application modules, one or more control instances to a multi-touch utility module, the multi-touch utility module operable as a processing layer between the application modules and touch data, each control instance defining a user interface element;
displaying on a display a user interface element defined by a control instance;
processing touch events captured in the touch data;
generating a result by processing only touch data related to at least one control instance; and
sending results indicating touch events on the interface element and associated with respective control instances to the one or more application modules.
21. The method of claim 20, wherein each control instance further contains data defining a method for processing incoming touch data for that instance.
22. An apparatus for operating a multi-touch enabled device, comprising:
means for generating one or more control instances by one or more applications executing on the multi-touch enabled device;
means for sending a control instance from the one or more applications to a multi-touch utility;
means for receiving touch data by a multi-touch utility;
means for processing, by the multi-touch utility, only touch data relating to at least one control instance to obtain one or more result sets, wherein each result set is associated with a respective control instance; and
means for sending the one or more result sets to the one or more applications.
23. The apparatus of claim 22, further comprising:
means for displaying one or more interface images on a display of the multi-touch enabled device, each interface image associated with a control instance; and
means for detecting a touch on one or more interface images,
wherein the touch data comprises data indicative of a touch on the one or more interface images; and
wherein the means for processing touch data comprises: means for processing the touch data in a manner related to the control instance to obtain a result set.
24. The device of claim 22, wherein the one or more control instances define a format of a result set.
25. The device of claim 22, wherein the result set does not contain touch data.
26. The apparatus of claim 22, wherein the means for processing touch data comprises: means for processing current touch data and processing historical touch data.
27. The apparatus of claim 26, wherein the means for processing touch data comprises: means for comparing the current touch data to historical touch data to obtain an incremental result.
28. The apparatus of claim 26, further comprising:
means for determining, by the multi-touch utility, which received touch data is likely to be needed in the future as historical touch data based on the control instance;
means for saving received touch data that may be needed in the future; and
means for discarding all other received touch data.
29. The apparatus of claim 22, wherein:
the apparatus for processing touch data further comprises: means for deriving intermediate touch data, the intermediate touch data being in a form other than a raster form;
the apparatus further comprises:
means for saving intermediate touch data; and
means for obtaining incremental results using the intermediate touch data in the future.
30. The device of claim 22, wherein the touch data comprises a plurality of binary values, each binary value indicating whether a particular pixel was touched.
31. The device of claim 22, wherein the touch data comprises a plurality of values, each value indicating a force or pressure of a touch to a particular pixel.
32. The device of claim 22, wherein each control instance defines a relevant area of the display, and wherein the results associated with the respective control instance are derived from touch data from the display for the relevant area of the control instance.
33. The apparatus of claim 32, wherein:
one or more control instances contain a level parameter indicating a virtual level for each control instance;
one or more control instances are defined as opaque; and
the apparatus for processing touch data further comprises: for each opaque instance, means for removing all touch events for an area of the instance at a lower level of the opaque instance that is covered by the area of the opaque instance.
34. The device of claim 33, wherein one or more control instances are defined as transparent and instances covered by transparent instances are unaffected in processing touch data.
35. The device of claim 22, wherein each control instance is associated with a control type of the one or more control types.
36. The device of claim 22, wherein one of the instances of controls is an instance of a button control, and the result associated with the instance of the button control indicates whether a button appearing on the display and associated with the instance of the button control was pressed.
37. The device of claim 22, wherein one of the instances of controls is an instance of a slider control, and the results associated with the instance of the slider control indicate incremental changes in the position of one or more touches along a slider appearing on the display and associated with the instance of the slider control.
38. The device of claim 22, wherein one of the instances of controls is an instance of a knob control, and the results associated with the instance of knob control indicate incremental changes in rotational orientation of one or more touches along a knob appearing on the display and associated with the instance of knob control.
39. The device of claim 22, wherein one of the instances of controls is an instance of a navigation surface control and the results associated with the instance of the navigation surface control indicate incremental changes in relative positions of several fingers along a navigation surface that appears on the display and is associated with the instance of the navigation surface control.
40. An apparatus for operating a multi-touch enabled device, the apparatus comprising:
means for sending, by one or more application modules, one or more control instances to a multi-touch utility module, the multi-touch utility module operable as a processing layer between the application modules and touch data, each control instance defining a user interface element;
means for displaying on a display a user interface element defined by a control instance;
means for processing touch events captured in the touch data;
means for generating a result by processing only touch data related to at least one control instance; and
means for sending results to the one or more application modules indicating touch events on interface elements and related to respective control instances.
41. The device of claim 40, wherein each control instance further contains data defining a method for processing incoming touch data for that instance.
42. An apparatus for operating a multi-touch enabled device, the apparatus comprising:
means for receiving one or more control instances from one or more applications;
means for receiving touch data;
means for processing only touch data relating to at least one control instance to obtain one or more result sets, wherein each result set is associated with a respective control instance; and
means for sending the one or more result sets to the one or more applications.
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US11/818,334 | 2007-06-13 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| HK1180415A true HK1180415A (en) | 2013-10-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9052817B2 (en) | Mode sensitive processing of touch data | |
| CN105677130B (en) | Pressure sensitivity touch control method, pressure sensitivity touch device and pressure-sensitive touch screen | |
| US8154529B2 (en) | Two-dimensional touch sensors | |
| US7103852B2 (en) | Dynamic resizing of clickable areas of touch screen applications | |
| US9575562B2 (en) | User interface systems and methods for managing multiple regions | |
| US9158454B2 (en) | Detecting and interpreting real-world and security gestures on touch and hover sensitive devices | |
| US8754862B2 (en) | Sequential classification recognition of gesture primitives and window-based parameter smoothing for high dimensional touchpad (HDTP) user interfaces | |
| EP3106967B1 (en) | Multipoint touch screen | |
| US8847904B2 (en) | Gesture recognition method and touch system incorporating the same | |
| US20100079391A1 (en) | Touch panel apparatus using tactile sensor | |
| EP1513050A1 (en) | Information processing method for specifying an arbitrary point in 3-dimensional space | |
| US20080309626A1 (en) | Speed/positional mode translations | |
| US20120192119A1 (en) | Usb hid device abstraction for hdtp user interfaces | |
| US20120044151A1 (en) | Sorting touch position data | |
| WO2012171116A1 (en) | Visual feedback by identifying anatomical features of a hand | |
| CN103384862A (en) | Information processing terminal and method for controlling same | |
| CN107678540A (en) | Virtual touch screen man-machine interaction method, system and device based on depth transducer | |
| US9235338B1 (en) | Pan and zoom gesture detection in a multiple touch display | |
| US9501210B2 (en) | Information processing apparatus | |
| HK1180415A (en) | Mode sensitive processing of touch data | |
| KR20100081383A (en) | Multi-touch screen system, touch screen apparatus and method for dividing touch screen | |
| US10303299B2 (en) | Use of groove analysis in a touch screen device to determine occurrence of an elongated touch by a single finger | |
| KR20180015400A (en) | Semiconductor device | |
| CN113805722B (en) | Touch processing method, device and touch system | |
| CN115657877A (en) | 3D touch display system and control method thereof |