
WO2010070528A1 - Method of and apparatus for emulating input - Google Patents


Info

Publication number
WO2010070528A1
WO2010070528A1 (PCT application PCT/IB2009/055573)
Authority
WO
WIPO (PCT)
Prior art keywords
pointing
emulated
signal
pointing signal
signals
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2009/055573
Other languages
French (fr)
Inventor
Krzysztof Choma
Anand Reddy
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Inc
Original Assignee
Nokia Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Inc filed Critical Nokia Inc
Publication of WO2010070528A1 publication Critical patent/WO2010070528A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038: Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038: Indexing scheme relating to G06F3/038
    • G06F2203/0381: Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer

Definitions

  • Example embodiments of the invention relate to receiving multiple pointing signals from one or more pointing devices and generating a corresponding single emulated pointing signal.
  • Example embodiments of the invention relate to a method comprising: receiving concurrently a plurality of pointing signals in a computing device; and, generating an emulated pointing signal in dependence on the received pointing signals according to a predefined set of emulation rules, the emulated pointing signal imitating a single pointing signal from a single pointing device, wherein: one of the received pointing signals is selected to provide the emulated pointing signal and said selection is performed in dependence on the received pointing signals and according to the predefined set of emulation rules.
  • the received pointing signal selected to provide the emulated pointing signal may be changed in dependence on the received pointing signals and according to the predefined set of emulation rules while concurrent receipt of said pointing signals continues.
  • the method may further comprise controlling the computing device using the emulated pointing signal.
  • the received pointing signals may relate to at least one instruction and the emulated pointing signal may relate to the same at least one instruction.
  • the emulated pointing signal may relate to data of at least two of the received pointing signals.
  • Data of each pointing signal may comprise data relating to gestures of the user and the emulated pointing signal may relate to gestures of at least two of the received pointing signals.
  • Data of each pointing signal may comprise data relating to gestures of a user, and the set of predefined emulation rules may be created according to a predefined set of interpretation rules which define how gestures are interpreted as instructions.
  • the method may further comprise providing the emulated pointing signal to a software application of the computing device, said software application being operable only with a single pointing signal from a single pointing device.
  • the computing device may receive the plurality of pointing signals from a touch-screen or touch-pad having a multi-touch capability.
  • the computing device may receive a single pointing signal from each of a plurality of pointing devices including at least one of the following pointing devices: a mouse, a joystick and a touch-screen or touch-pad.
  • An operating system of the computing device may generate the emulated pointing signal.
  • Example embodiments of the invention relate to an apparatus comprising: a processor, memory including computer program code, the memory and the computer program code configured to, with the processor, cause the apparatus at least to perform: receiving concurrently a plurality of pointing signals; and, generating an emulated pointing signal in dependence on the received pointing signals according to a predefined set of emulation rules, the emulated pointing signal imitating a single pointing signal from a single pointing device, wherein: one of the received pointing signals is selected to provide the emulated pointing signal and said selection is performed in dependence on the received pointing signals and according to the predefined set of emulation rules.
  • the received pointing signal selected to provide the emulated pointing signal may be changed in dependence on the received pointing signals and according to the predefined set of emulation rules while concurrent receipt of said pointing signals continues.
  • the apparatus may further comprise a controller configured to control the computing device using the emulated pointing signal.
  • the received pointing signals may relate to at least one instruction and the emulated pointing signal may relate to the same at least one instruction.
  • the emulated pointing signal may relate to data of at least two of the received pointing signals.
  • Data of each pointing signal may comprise data relating to gestures of the user and the emulated pointing signal may relate to gestures of at least two of the received pointing signals.
  • Data of each pointing signal may comprise data relating to gestures of a user, and the set of predefined emulation rules may be created according to a predefined set of interpretation rules which define how gestures are interpreted as instructions.
  • the emulated pointing signal may be provided to a software application of the apparatus, said software application being operable only with a single pointing signal from a single pointing device.
  • the apparatus may receive the plurality of pointing signals from a touch-screen or touch-pad having a multi-touch capability.
  • the apparatus may receive a single pointing signal from each of a plurality of pointing devices including at least one of the following pointing devices: a mouse, a joystick and a touch-screen or touch-pad.
  • An operating system of the computing device may generate the emulated pointing signal.
  • Example embodiments of the invention relate to an apparatus comprising: one or more input devices configured to generate corresponding pointing signals, a processor, memory including computer program code, the memory and the computer program code configured to, with the processor, cause the apparatus at least to perform: production of an emulated pointing signal by selection of one of said pointing signals as said emulated pointing signal.
  • the emulated pointing signal may be produced by first selecting a first pointing signal as said emulated pointing signal and thereafter selecting a second of said pointing signals as said emulated pointing signal.
  • Example embodiments of the invention relate to a computer program comprising: code for concurrently processing a plurality of pointing signals; and code for producing an emulated pointing signal by selection of one of said pointing signals as said emulated pointing signal, when the computer program is run on a processor.
  • the computer program may be a computer program product comprising a computer-readable medium bearing computer program code embodied therein for use with a computer.
  • Example embodiments of the invention relate to a computer readable medium encoded with instructions that, when executed by a computer, perform: receiving a plurality of pointing signals from one or more input devices; production of an emulated pointing signal by selection of one of said pointing signals as said emulated pointing signal.
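As a rough, non-authoritative illustration of the idea summarised above, the following Python sketch reduces a plurality of concurrently received pointing signals to a single emulated pointing signal by selecting one of them under a predefined rule set. All names (PointingSignal, EmulationRules, emulate) are invented for illustration and do not appear in the patent.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Sequence

@dataclass
class PointingSignal:
    """One pointer stream, e.g. one finger detected by a multi-touch screen."""
    source_id: int
    events: List[str] = field(default_factory=list)  # e.g. "DOWN", "MOVE", "UP"

# A predefined set of emulation rules is modelled here as a callable that
# picks exactly one of the concurrently received pointing signals.
EmulationRules = Callable[[Sequence[PointingSignal]], PointingSignal]

def emulate(signals: Sequence[PointingSignal], rules: EmulationRules) -> PointingSignal:
    """Return the single emulated pointing signal for single-pointer software."""
    return rules(signals)

# Example rule set: simply keep the signal that was detected first.
first_detected: EmulationRules = lambda signals: signals[0]
```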
  • Figure 1 is a representation of a smartphone computing device
  • Figure 2 is a schematic of some of the internal elements of the smartphone of Figure 1;
  • Figure 3 is another schematic of some of the internal elements of the smartphone of Figure 1;
  • Figure 4 is a state diagram of a pointing device of the smartphone of Figure 1;
  • Figure 5 is a schematic of some of the internal elements of the smartphone of Figure 1;
  • Figure 6 is a flow diagram defining a set of emulation rules for providing an emulated pointing signal
  • Figures 7 to 12 are flow diagrams defining sets of emulation rules for providing an emulated pointing signal.
  • Figure 1 represents a smartphone 2 according to an example embodiment which comprises a keypad 4, a touch-screen 6, a microphone 8, a speaker 10 and an antenna 12.
  • the touchscreen 6 of this embodiment provides a pointing device of the smartphone 2.
  • the smartphone 2 of this embodiment is capable of being operated by a user to perform a variety of different functions, such as, for example, hosting a telephone call, browsing the internet or sending an email.
  • Although the embodiment of Figure 1 is a mobile computing device, further embodiments of the invention extend to computing devices which are not mobile and those which do not function primarily, or at all, as communication devices.
  • FIG. 2 shows a schematic view of some of the internal hardware elements of the known smartphone 2 of this embodiment.
  • the smartphone 2 of the example embodiment comprises hardware to perform telephony functions, together with an application processor 108 and corresponding support hardware to enable the phone to have other functions which are desired by a smartphone, such as messaging, internet browsing, email functions and the like.
  • the telephony hardware is represented by the RF processor 102 which provides an RF signal to the antenna 12 for the transmission of telephony signals, and processes RF signals received by the antenna 12.
  • a baseband processor 104 provides signals to and receives signals from the RF Processor 102 in this embodiment.
  • the baseband processor 104 of this embodiment also interacts with a subscriber identity module 106 in a known manner.
  • the keypad 4 and the touch-screen 6 of this embodiment are controlled by an application processor 108.
  • a power and audio controller 109 is provided in this embodiment to supply power from a battery (not shown) to the telephony subsystem, the application processor 108, and the other hardware. Additionally, the power and audio controller 109 of this embodiment also controls input from the microphone 8, and audio output via the speaker 10.
  • the smartphone 2 here includes Random Access Memory (RAM) 112 connected to the application processor 108 into which data and program code can be written and read from at will. Code placed anywhere in RAM 112 of this embodiment can be executed by the application processor 108 from the RAM 112. RAM 112 represents a volatile memory of the smartphone 2.
  • RAM: Random Access Memory
  • the smartphone 2 of this embodiment is provided with a long-term storage 114 connected to the application processor 108.
  • the long-term storage 114 of this embodiment comprises three partitions, an operating system (OS) partition 116, a system partition 118 and a user partition 120.
  • the long-term storage 114 represents a non-volatile memory of the smartphone 2 of this embodiment.
  • the OS partition 116 includes an operating system.
  • An operating system is necessary in this embodiment in order for the application processor 108 to operate and therefore the operating system is initialised when the smartphone 2 is first switched on.
  • these resources include the application processor 108, the RAM 112, and the long-term storage 114.
  • the operating system of this embodiment helps to provide a stable, consistent way for software applications running on the smartphone 2 to deal with the hardware resources of the smartphone 2 without the application needing to know all the details of the physical resources available to the hardware.
  • Other computer programs may also be stored on the long-term storage 114 of this embodiment, such as application programs, and the like.
  • application programs which are mandatory to the device such as, in the case of a smartphone, communications applications and the like are stored in the system partition 118 of this embodiment.
  • the application programs stored on the system partition 118 of this embodiment are those which are bundled with the smartphone by the device manufacturer when the phone is first sold.
  • Application programs which are added to the smartphone by the user are stored in the user partition 120 of this embodiment.
  • the representation of Figure 2 is schematic. In practice, the various functional components illustrated may be combined into one and the same component.
  • the long-term storage 114 may comprise NAND flash, NOR flash, a hard disk drive or a combination of these.
  • the processes and functionality of the depicted processors may be fulfilled by more or fewer processors.
  • Figure 3 shows another schematic diagram of the smartphone 2 of this embodiment which depicts certain of the hardware components mentioned above with respect to Figure 2. More particularly, Figure 3 shows an operating system (OS) 122 in communication with the application processor 108. Figure 3 also shows the touch-screen 6 and the various memory elements of the smartphone 2, as described above. As mentioned previously, the OS 122 is stored on the OS partition 116 and controls the operation of the application processor 108 in this embodiment. In particular, the OS 122 controls the application processor 108 to provision access of software stored on the memory elements 112 and 114 to the input devices, such as the touch-screen 6 of this embodiment.
  • OS: operating system
  • During operation, it is often necessary for the software of the smartphone 2 to receive commands or instructions from a user (not shown) of the smartphone 2. In some circumstances such instructions are specifically requested by the software from the user; at other times, such instructions are issued by the user without being specifically requested.
  • the user may provide instructions to the software using one or more of the input devices of the smartphone 2, such as, the keypad 4, the touch-screen 6, or the microphone 8. However, if a pointing device input is required by the software then the user must use the touch-screen 6 as this is the only pointing device of the smartphone 2 of this embodiment.
  • the application processor 108 (controlled by the OS 122) receives the pointing signal relating to the user's instruction from the touch-screen 6 for provision to the relevant software at the appropriate place in memory. For example, if the pointing signal has been provided for a software application stored on the user partition 120, the application processor 108 receives the pointing signal from the touch-screen 6 for provision to the user partition 120. It is also the case that the software of the smartphone 2 of this embodiment can issue instructions to the touch-screen 6 via the application processor 108. For example, such instructions may result in displaying a particular window or dialogue box on the touch-screen 6, or to highlight a selection in response to a previously received gesture from the user.
  • the touch-screen 6 of this embodiment has a multi-touch capability and therefore can recognise more than one finger (or other pointing instrument) simultaneously contacting the touch-screen 6. Accordingly, the touch-screen 6 is a multipoint device.
  • the touch-screen 6 of this embodiment is capable of providing a multipoint signal comprising one pointing signal for each finger (or other interface device) concurrently contacting the screen.
  • the touch-screen 6 of this embodiment is provided with a proximity capability which enables the touch-screen 6 to recognise not only a finger contacting its screen but also a finger close to the screen. Accordingly, the touch-screen 6 of this embodiment is capable of providing a multipoint signal comprising multiple pointer signals wherein each pointer signal corresponds to a different finger concurrently contacting, or in close proximity to, the screen.
  • the touch-screen 6 of this embodiment can also identify the pressure with which a finger (or other pointing instrument) contacts the screen. Therefore, for each finger contacting the screen, the touch-screen 6 provides a pressure rating for the pointing signal which corresponds to the pressure that the finger applies to the screen.
  • Figure 4 depicts a state model of the touch-screen 6 of this embodiment which defines its operation in relation to gestures of a single finger (or other pointing instrument or interface device). Corresponding state models exist for each of the pointing signals which the touchscreen 6 of this embodiment is capable of generating.
  • the following states are defined: an initial state 200, an OutOfRange (OOR) state 202, an UP state 204 and a DOWN state 206. Connecting each state to other states in this embodiment are labelled arrows, each of which represents an event.
  • the touch-screen 6 of this embodiment can be in any one of the defined states and changes state when an event corresponding to one of the arrows occurs, as described below.
  • the touch-screen 6 of this embodiment starts in the initial state 200 and changes to the OOR state 202 when the touch-screen 6 is first operated, for example, when the smartphone 2 is turned on.
  • the OOR state 202 of this embodiment represents the state where a finger is not detected by the touch-screen 6. If a finger moves into detection range of the touch-screen 6 of this embodiment, this represents a MOVE event and causes a state change from the OOR state 202 to the UP state 204.
  • Each MOVE event is included in the pointing signal corresponding to the detected finger (or other interface device) together with locations corresponding to the physical locations of the detected finger in relation to the screen.
  • the UP state 204 represents a state where a finger is in detectable range but is not contacting the touch-screen 6 of this embodiment. If the detected finger contacts the screen, this represents a DOWN event and causes a state change from the UP state 204 to the DOWN state 206 in this embodiment.
  • Each DOWN event is included in the pointing signal corresponding to the detected finger (or other interface device) together with a location corresponding to the initial contact point between the finger and the screen.
  • each DRAG event of this embodiment is included in the pointing signal corresponding to the detected finger together with locations corresponding to the various physical locations of the detected finger on the screen. If the finger moves out of range while still in the DOWN state 206 this represents an OOR event in this embodiment and causes a state change from the DOWN state 206 to the OOR state 202. Each OOR event is included in the pointing signal corresponding to the finger together with the physical location where the finger was last detected in the DOWN state 206 in this embodiment.
  • each UP event of this embodiment is included in the pointing signal corresponding to the detected finger together with the physical location on the screen where the finger lost contact.
  • As the finger changes location with respect to the touch-screen 6, MOVE events are issued as appropriate in this embodiment.
  • From the UP state 204 if the finger moves out of range, or in other words, out of detectable range of the touch screen 6 this represents an OOR event in this embodiment and causes a state change from the UP state 204 to the OOR state 202.
  • Each OOR event of this embodiment is included in the pointing signal corresponding to the finger together with the physical location where the finger was last detected.
  • the OS 122 (via the application processor 108) maintains an up-to-date state model for each finger (or other interface device) detected by the touch-screen 6.
  • the state model(s) relating to each of the pointing signals of the multipoint signal are updated appropriately.
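The per-finger state model described above can be pictured as a small state machine. The following Python sketch mirrors the behaviour described for Figure 4 (the DRAG and UP transitions are as implied by the surrounding text); the class, event and attribute names are illustrative and do not come from the patent.

```python
from enum import Enum, auto

class State(Enum):
    INITIAL = auto()   # state 200
    OOR = auto()       # state 202: finger not detected
    UP = auto()        # state 204: finger detected but not contacting
    DOWN = auto()      # state 206: finger contacting the screen

class PointerStateModel:
    """Tracks one finger (or other pointing instrument), per the Figure 4 description."""

    # (current state, event) -> next state
    TRANSITIONS = {
        (State.INITIAL, "START"): State.OOR,  # touch-screen first operated (assumed event name)
        (State.OOR, "MOVE"): State.UP,        # finger moves into detection range
        (State.UP, "MOVE"): State.UP,         # finger changes location while in range
        (State.UP, "DOWN"): State.DOWN,       # finger contacts the screen
        (State.UP, "OOR"): State.OOR,         # finger leaves detectable range
        (State.DOWN, "DRAG"): State.DOWN,     # finger moves while contacting
        (State.DOWN, "UP"): State.UP,         # finger loses contact with the screen
        (State.DOWN, "OOR"): State.OOR,       # finger leaves range while contacting
    }

    def __init__(self):
        self.state = State.INITIAL
        self.last_location = None

    def handle(self, event: str, location=None):
        next_state = self.TRANSITIONS.get((self.state, event))
        if next_state is None:
            return  # event not meaningful in the current state
        self.state = next_state
        if location is not None:
            self.last_location = location
```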
  • If a multipoint signal from the touch-screen 6 of this embodiment is provided to software of the smartphone 2 which is only designed to operate with a single pointing signal (i.e. a signal from a single-pointer device), then the logic of the software may cause the smartphone 2 to malfunction or fail to function at all.
  • standards to determine the emulation may be prescribed which may suit certain hardware and/or software arrangements, providing more efficient emulation.
  • an emulator 124 is connected in between the OS 122 and the various memory elements 112 to 120.
  • the emulator 124 of this embodiment receives data from the OS 122 and provides data to the various memory elements 112 to 120. More specifically, an input of the emulator 124 of this embodiment receives data from the OS 122 in the form of a multipoint signal which the OS 122 receives via the application processor 108 from the touch-screen 6.
  • An emulated signal generator of the emulator 124 of this embodiment is then capable of generating a corresponding single pointing signal (hereinafter called an 'emulated pointing signal') in dependence on the received multipoint signal.
  • the emulated pointing signal of this embodiment provides instructions which correspond to the instructions provided by the received multipoint signal.
  • the emulated pointing signal can then be accessed from the emulator 124 by software running on any memory element of the smartphone 2 of this embodiment.
  • the emulated pointing signal would be used by single pointer software of the smartphone 2 of this embodiment. In such cases, the single pointer software would disregard multipoint signals provided by the application processor 108 and instead only concern itself with the emulated pointing signal provided by the emulator 124.
  • the operating system 122 would provide similar functionality.
  • the received pointing signals relate to at least one instruction and the emulated pointing signal relates to the same at least one instruction. It is an advantage of these embodiments that the emulated pointing signal provides the same essential information as the plurality of pointing instructions so that the emulated pointing signal can be used to replace the plurality of pointing instructions effectively.
  • a multipoint signal can comprise multiple pointing signals and the emulated pointing signal of embodiments of the invention comprises a single pointing signal
  • the multipoint signal provides a richer information bearer than the emulated pointing signal. Accordingly, the information carried by the multipoint signal must be interpreted by the emulator 124 of the embodiment of Figure 5 so that the corresponding emulated pointing signal contains at least the essential information of the multipoint signal. Some of the information carried by the multipoint signal will not be contained in the emulated pointing signal.
  • the emulator 124 of this embodiment interprets the multipoint signal according to a set of predefined rules called 'gesture interpretations'. More specifically, the gesture interpretations define how to interpret the user's gestures as defined by the multipoint signal in order to establish which pointing signal or pointing signals of the multipoint signal represent the user's intended instruction. Accordingly, the gesture interpretations define how to interpret the pointing signals of the multipoint signal to help identify which one of the pointing signals to use as the emulated pointing signal. According to this example embodiment, the emulator 124 applies the following gesture interpretations.
  • the first gesture interpretation relates to 'touching' events, i.e. when a user moves a finger from the UP state 204 to the DOWN state 206.
  • the second gesture interpretation relates to 'lifting' events, i.e. when a user moves a finger from the DOWN state 206 to the UP state 204.
  • for 'touching' events, the following rules are applied: the focus is moved to the signal generated by the newly touching finger; the action performed with the previous touching finger should be cancelled or ignored; and the action performed with the new finger should be performed.
  • for 'lifting' events, when the finger having the focus is lifted, the focus should be moved to one of the other touching fingers (as considered below).
  • the emulator 124 of this embodiment is able to receive a multipoint signal comprising multiple pointing signals and interpret that multipoint signal to identify which pointing signal to use as the emulated pointing signal.
  • a second set of predefined rules called 'emulation rules' is created based on the gesture interpretations, and it is the emulation rules which are applied directly to a multipoint signal in order to generate the emulated pointing signal in this embodiment.
  • Figure 6 represents the emulation rules of this embodiment in a flow diagram.
  • When a multipoint signal is provided by the touch-screen 6, the emulator 124 of this embodiment may designate one of the pointing signals of the multipoint signal as the 'primary' pointing signal. The primary pointing signal then provides the emulated pointing signal. All events from pointing signals of the multipoint signal, other than the primary pointing signal, are not included in the emulated pointing signal. The only exception to this is for some transition events from non-primary pointing signals which are included in the emulated pointing signal when the primary pointing signal is changed.
  • the flow diagram of Figure 6 defines this operation in more detail.
  • the flow diagram of Figure 6 begins at block 302 wherein the emulator 124 of this embodiment identifies if a primary pointing signal of the multipoint signal is assigned. If a primary pointing signal is not assigned then processing flows to block 304, alternatively, if a primary pointing signal is assigned then processing flows to block 308, which will be discussed later.
  • at block 304, the emulator 124 of this embodiment determines if a multipoint signal comprising at least one pointing signal is present. Practically, in this block the emulator 124 of this embodiment determines if one or more fingers are detected by the touch-screen 6. If a multipoint signal comprising at least one pointing signal is detected (i.e. one or more fingers are detected), processing flows to block 306; otherwise, processing waits at block 304 until one is detected, after which processing flows to block 306. In either case, once a pointing signal of the multipoint signal has been designated primary at block 306, processing flows to block 308 of this embodiment.
  • the received pointer signal which provides the emulated pointer signal is switched when it becomes less suitable for providing the emulated pointing signal than a different received pointing signal. This dynamic operation ensures that the emulated pointer signal always provides the most relevant information from the plurality of received pointing signals.
  • At block 306, a pointing signal of the multipoint signal is designated as the primary pointing signal. Accordingly, the emulated pointing signal will be set equal to the primary pointing signal. Whether or not the pointing signal designated as primary is maintained as the primary pointing signal in this embodiment will depend on the behaviour of the finger relating to the primary pointing signal and the behaviour of any other fingers which are detected by the touch-screen 6. More specifically, at block 308 the emulator 124 of this embodiment detects if any finger moves into the DOWN state, other than the finger corresponding to the primary pointing signal. In order to perform this operation, the emulator 124 of this embodiment detects if any pointing signal issues a DOWN event, other than the primary pointing signal.
  • If the emulator 124 does not detect a DOWN event from a non-primary pointing signal at block 308, processing flows to block 314, which will be discussed later.
  • the emulator detects the current state of the finger represented by the primary pointing signal. If the finger represented by the primary pointing signal is in the DOWN state then processing flows from block 310 back to block 308. Alternatively, if the finger represented by the primary pointing signal is in any state other than the DOWN state, processing of this embodiment flows from block 310 to block 312.
  • the emulator 124 of the embodiment demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal which issued the DOWN event at block 308 to primary status (in other words, the "focus" is changed).
  • the emulator 124 of this embodiment also includes, in the emulated signal, the DOWN event that was issued by the new primary pointing signal prior to it becoming the primary pointing signal. Thereafter, the emulated pointing signal matches the new primary pointing signal and processing of this embodiment flows from block 312 back to block 308.
  • the emulator 124 of this embodiment detects if the finger represented by the primary pointing signal moves to an OOR state based on whether an OOR event is issued by the primary pointing signal. If the emulator 124 does not detect an OOR event in the primary pointing signal, processing flows back to block 308. Alternatively, if the emulator 124 of this embodiment detects an OOR event in the primary pointing signal, processing flows to block 316.
  • the emulator 124 includes the OOR event issued at block 314 in the emulated pointing signal, then demotes the primary pointing signal from its primary status and leaves the primary status unassigned. Processing then flows from block 316 back to the block 302, as discussed above.
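Taken together, blocks 302 to 316 can be sketched as an event-driven loop along the following lines. This is only an interpretation of the flow described above, not the patent's own code; the event and signal types and their attributes (kind, signal, state) are assumptions introduced for illustration.

```python
from dataclasses import dataclass

class PointerSignal:
    """One finger's pointing signal; .state mirrors the Figure 4 model."""
    def __init__(self, source_id: int):
        self.source_id = source_id
        self.state = "OOR"   # updated elsewhere as the finger moves

@dataclass
class PointerEvent:
    kind: str              # "DOWN", "UP", "MOVE", "DRAG" or "OOR"
    signal: PointerSignal  # the pointing signal that issued the event

class Figure6Emulator:
    """Primary-signal selection roughly following the Figure 6 description
    (block numbers in the comments refer to the flow diagram discussed above)."""

    def __init__(self):
        self.primary = None          # currently designated primary signal
        self.emulated_events = []    # the emulated single-pointer signal

    def on_event(self, event: PointerEvent):
        # Blocks 302/304/306: while no primary is assigned, the first
        # detected pointing signal is designated primary.
        if self.primary is None:
            self.primary = event.signal
            self.emulated_events.append(event)
            return

        if event.signal is self.primary:
            # Events from the primary signal pass straight through ...
            self.emulated_events.append(event)
            # ... but an OOR event (blocks 314/316) leaves primary unassigned.
            if event.kind == "OOR":
                self.primary = None
            return

        # Block 308: a non-primary signal issued a DOWN event.
        if event.kind == "DOWN":
            # Block 310: ignore it while the primary finger is still DOWN.
            if self.primary.state == "DOWN":
                return
            # Block 312: change focus and replay the new primary's DOWN event.
            self.primary = event.signal
            self.emulated_events.append(event)
        # All other events from non-primary signals are not emulated.
```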
  • the operation of this example embodiment, as described above with respect to Figure 6, can be modified as follows in a manner which is not shown by Figure 6. Operation of this embodiment is identical to the above embodiment with the following alterations. After processing flows from block 316 to block 302 and then to block 304, no pointing signal which was detected by the emulator 124 when the primary status was unassigned at block 316 is promoted to primary status.
  • a new pointing signal which starts to be detected after processing at block 316 has finished is promoted to primary status.
  • a pointing signal which is detected by the emulator 124 when the primary status is unassigned at block 316 moves out of range and therefore ceases to be detected.
  • the same pointing signal of this example is then detected again by the emulator 124 as it moves back into range.
  • the pointing signal qualifies as a new pointing signal when it moves back into range, and after that time it is eligible for promotion to primary status.
  • blocks 306, 312 and 316 indicate instructions which are performed by the emulator 124 of this embodiment.
  • blocks 302, 304 and 310 indicate tests wherein the emulator examines the current state of an element, for example, the current state of the finger represented by the primary input signal in block 310.
  • blocks 308 and 314 indicate tests wherein the emulator reacts to an event, for example, the receipt of a DOWN event from a non-primary pointing signal in block 308. Accordingly, while the primary pointer signal is assigned and unchanged, processing of the emulator 124 with respect to Figure 6 flows in a loop consisting of blocks 308 and 314. Processing only breaks from this loop when an event that is tested for by either block 308 or 314 occurs.
  • the emulated pointing signal of this embodiment is defined as one of the pointing signals of the multipoint signal received by the emulator 124 from the touch-screen 6. Accordingly, the software of the smartphone 2, in particular the single pointer software of the smartphone 2, can receive the emulated pointing signal in preference to the multipoint signal which may cause the software and smartphone 2 to malfunction. It is an advantage of certain embodiments that legacy and new single pointer software can operate on the smartphone 2. Additionally, it is an advantage that the same emulated pointing signal can be used by each software component of the smartphone 2, irrespective of where each software component is stored in the memory of the smartphone 2. This is particularly beneficial as it means that the same approach to emulation may be adopted for all software applications which use the emulated pointing signal. A further benefit is that the development of new single pointer software is simplified because software developers do not need to consider how to handle multipoint signals. Instead, they can simply build new single pointer software to use the emulated pointer signal.
  • Figure 7 shows an embodiment incorporating a first set of alternative emulation rules which have been created based on the same gesture interpretations as described above.
  • a difference between the emulation rules of Figure 7 and those of the above embodiments is the addition of a block 400 in between the block 310 and the block 308.
  • the following describes how this new block is integrated into the flow diagram of the embodiment of Figure 7.
  • the emulator 124 of this embodiment detects the current state of the finger represented by the primary pointing signal. If the finger is in a DOWN state, processing flows from block 310 to block 400 and not back to block 308.
  • the emulator 124 demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal which issued the DOWN event at block 308 to primary status.
  • the emulator 124 of this embodiment also includes in the emulated pointing signal the first UP event sent by either the old primary signal or the new primary signal. However, in either case, the emulator of this embodiment includes this UP event as issued by the new primary pointing signal. The emulator then includes in the emulated pointing signal the DOWN event that was issued by the new primary pointing signal at block 308 of this embodiment. Thereafter, the emulated pointing signal matches the primary pointing signal and processing flows from block 400 back to block 308 of this embodiment.
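For comparison with block 312, the block 400 behaviour just described might be sketched as follows, continuing the illustrative Figure6Emulator above. The helpers wait_first_up and reattributed_to are assumptions, not anything defined by the patent.

```python
def on_nonprimary_down_while_primary_down(self, down_event):
    """Figure 7, block 400 (sketch): the focus changes even though the old
    primary finger is still in the DOWN state."""
    old_primary, new_primary = self.primary, down_event.signal
    self.primary = new_primary
    # Include the first UP event sent by either finger, but attribute it to
    # the new primary signal, then replay the new primary's DOWN event.
    first_up = self.wait_first_up(old_primary, new_primary)   # assumed helper
    self.emulated_events.append(first_up.reattributed_to(new_primary))
    self.emulated_events.append(down_event)
```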
  • Figure 8 shows a second set of alternative emulation rules which have been created based on the same lifting gesture interpretation as the above described embodiment and the following alternative touching gesture interpretation.
  • a difference between the emulation rules of Figure 8 and those of the embodiment of Figure 6 is the addition of a block 402 in between the block 310 and the block 308.
  • the following describes how this new block is integrated into the flow diagram of Figure 8.
  • the emulator 124 detects the current state of the finger represented by the primary pointing signal. If the finger is in a DOWN state then processing of this embodiment flows from block 310 to block 402 and not back to block 308.
  • the emulator 124 demotes the current primary pointing signal from its primary status and promotes to primary status the non-primary pointing signal which issued the DOWN event at block 308 of this embodiment.
  • the emulator 124 also includes in the emulated pointing signal a MOVE event specifying a move from the last position of the old primary pointing signal to the current position of the new primary pointing signal of this embodiment. Thereafter, the emulated pointing signal matches the new primary pointing signal and processing flows from block 402 back to block 308 in this embodiment.
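The corresponding block 402 behaviour could be sketched like this, with the same caveats as above; MoveEvent and the location attributes are assumed names introduced only for illustration.

```python
def on_nonprimary_down_while_primary_down_fig8(self, down_event):
    """Figure 8, block 402 (sketch): bridge the change of focus with a MOVE."""
    old_primary, new_primary = self.primary, down_event.signal
    self.primary = new_primary
    # A synthetic MOVE from the old primary's last position to the new
    # primary's current position keeps the emulated pointer continuous;
    # thereafter the emulated signal simply follows the new primary.
    self.emulated_events.append(
        MoveEvent(start=old_primary.last_location, end=new_primary.location)
    )
```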
  • Figure 9 shows an embodiment incorporating a third set of alternative emulation rules which have been created based on the same lifting gesture interpretation as the embodiment of Figure 6 and the following alternative touching gesture interpretation.
  • for 'touching' events, the user's focus is always on the first finger that touched the screen and does not change until this finger is lifted.
  • processing of this embodiment flows from block 404 to block 406.
  • the emulator 124 of this embodiment demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal which issued the DOWN event at block 404.
  • the emulator 124 of this embodiment also includes in the emulated pointing signal the DOWN event that was issued by the new primary pointing signal at block 404. Thereafter, the emulated pointing signal matches the new primary pointing signal and processing of this embodiment flows from block 406 back to block 404.
  • Figure 10 shows an embodiment incorporating a fourth set of alternative emulation rules which have been created based on the same gesture interpretations as the embodiment of Figure 6. Differences between the emulation rules of Figure 10 and those of the embodiment of Figure 6 are the addition of new blocks 408 to 424.
  • the emulator 124 of this embodiment demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal relating to the closest finger in a DOWN state. More specifically, the closest finger in this embodiment is taken to be the finger in closest proximity to the touch-screen 6 or, if more than one finger is contacting the touch-screen 6, the finger contacting with the greatest pressure.
  • the emulator 124 then includes in the emulated pointing signal a MOVE event specifying a move from the last recorded position of the old primary pointing signal to the current position of the new primary pointing signal in this embodiment. Thereafter, the emulated pointing signal of this embodiment matches the new primary pointing signal and processing flows from block 412 back to block 308.
  • Block 416 is a test block wherein the emulator 124 of this embodiment examines if the state of the finger represented by the primary pointing signal was in a DOWN state before the OOR event was sent at block 314. Also at block 416, the emulator of this embodiment examines if the state of the closest finger is in the DOWN state. If neither condition is true, processing flows from block 416 to block 420, which is discussed below. Alternatively, if both conditions are true, processing in this embodiment flows from block 416 to block 418. At block 418, the emulator 124 of this embodiment demotes the current primary pointing signal from its primary status and promotes to primary status the non-primary pointing signal representing the closest finger.
  • the emulator 124 of this embodiment also includes in the emulated pointing signal a DRAG event specifying a drag from the last recorded position of the old primary pointing signal to the current position of the new primary pointing signal. Thereafter, the emulated pointing signal of this embodiment matches the new primary pointing signal and processing flows from block 418 back to block 308.
  • Block 420 of this embodiment provides another test block wherein the emulator 124 examines if the state of the finger represented by the primary pointing signal was in a DOWN state before the OOR event was sent at block 314. Also at block 420, the emulator 124 of this embodiment examines if the state of the closest finger is in the UP state. If both conditions are true then processing in this embodiment flows to block 422, otherwise processing flows to block 424. At block 422, an UP event is included in the emulated pointing signal from the current primary pointing signal. The emulator 124 of this embodiment then demotes the current primary pointing signal from its primary status and promotes to primary status the non-primary pointing signal representing the closest finger.
  • the emulator 124 of this embodiment then includes a MOVE event in the emulated pointing signal specifying a move from the last position of the old primary pointing signal to the current position of the new primary pointing signal of this embodiment. Thereafter, the emulated pointing signal of this embodiment matches the new primary pointing signal and processing flows from block 422 back to block 308.
  • the emulator 124 of this embodiment demotes the current primary pointing signal from its primary status and promotes to primary status the non-primary pointing signal representing the closest finger.
  • the emulator 124 then includes a MOVE event in the emulated pointing signal of this embodiment specifying a move from the last position of the old primary pointing signal to the current position of the new primary pointing signal. Thereafter, the emulated pointing signal of this embodiment matches the new primary pointing signal and processing flows from block 424 back to block 308.
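The 'closest finger' rule used throughout the Figure 10 description might look as follows in code. The attribute names state, pressure and proximity are assumptions; the patent only states that proximity to the screen and contact pressure are compared.

```python
def closest_finger(signals):
    """Figure 10 (sketch): pick the finger nearest the screen; among fingers
    actually contacting the screen, the one pressing hardest wins."""
    contacting = [s for s in signals if s.state == "DOWN"]
    if contacting:
        return max(contacting, key=lambda s: s.pressure)
    # proximity is taken here as distance from the screen, so smaller is closer
    return min(signals, key=lambda s: s.proximity)
```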
  • Figure 11 shows an embodiment incorporating a fifth set of alternative emulation rules which have been created based on the same touching gesture interpretation as the embodiment of Figure 6 and the following alternative lifting gesture interpretation.
  • the emulator 124 of this embodiment demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal relating to the closest finger in the DOWN state.
  • the emulator 124 of this embodiment then includes in the emulated pointing signal the first UP event sent by either the old primary or the new primary signals. However, in either case, the emulator of this embodiment includes this UP event as issued by the new primary pointing signal.
  • the emulator 124 of this embodiment then generates and includes in the emulated pointing signal a DOWN event from the new primary pointing signal. Thereafter, the emulated pointing signal of this embodiment matches the primary pointing signal and processing flows from block 426 back to block 308.
  • the emulator 124 of this embodiment demotes the current primary pointing signal from its primary status and promotes to primary status the non-primary pointing signal representing the closest finger.
  • the emulator 124 of this embodiment also includes in the emulated pointing signal an UP event from the new primary pointing signal.
  • the emulator 124 then generates and includes in the emulated signal of this embodiment a DOWN event from the new primary pointing signal. Thereafter, the emulated pointing signal of this embodiment matches the new primary pointing signal and processing flows from block 428 back to block 308.
  • Figure 12 shows an embodiment incorporating a sixth set of alternative emulation rules which have been created based on the same touching gesture interpretation as the embodiment of Figure 6 and the following alternative lifting gesture interpretation.
  • the emulator 124 of this embodiment includes in the emulated pointing signal the first UP event sent by either the old primary signal or the new primary signal. However, in either case, the emulator of this embodiment includes this UP event in the new primary signal. The emulator 124 of this embodiment then demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal relating to the closest finger in the DOWN state. The emulator 124 of this embodiment then generates and includes in the emulated pointing signal a DOWN event from the new primary pointing signal.
  • the emulated pointing signal of this embodiment matches the primary pointing signal and processing flows from block 430 back to block 308.
  • the emulator 124 of this embodiment includes in the emulated pointing signal an UP event from the current primary pointing signal.
  • the emulator 124 of this embodiment then demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal relating to the closest finger in the DOWN state.
  • the emulator 124 of this embodiment then includes in the emulated pointing signal a MOVE event specifying a move from the last recorded position of the old primary pointing signal to the current position of the new primary pointing signal.
  • the emulator 124 of this embodiment then generates and includes in the emulated signal a DOWN event from the new primary pointing signal.
  • the emulated pointing signal of this embodiment matches the primary pointing signal and processing flows from block 432 back to block 308.
  • the emulator 124 is shown as being distinct from the OS 122. However, in further embodiments such as that illustrated in Figures 1 to 5 the emulator 124 and the OS 122 are part of the same element of the smartphone 2 (there the OS). Regardless of the precise arrangement, in embodiments of the present invention the emulated pointer signal is formed such that it is available to all software components of the computing device which require access thereto. In other words, the emulated pointing signal of certain embodiments acts as a 'global' variable within the context of the computing device. This will help to minimise the chances that separate software components of the computing device will need to form their own individual emulated pointing signal.
  • single pointer software running on the computing device can ignore multipoint signals which may cause the computing device to malfunction and instead choose to receive the emulated pointer signal.
  • the emulated pointer signal corresponds to the received pointing signals such that essential information included in the received pointing signals is also included in the emulated pointing signal. Accordingly, it is an advantage in example embodiments that the emulated pointer signal may be used as an approximation of, and an equivalent to, the plurality of received pointing signals.
  • the pointing device as seen by a user i.e.
  • the pointing device as seen by the computing device, i.e. the pointing device as defined by the pointing signal sent to the computing device, is referred to as the 'logical pointer'.
  • the example embodiments of the present invention have been described with reference to a computing device and a single pointing device being a touch-screen having both proximity and multi-touch capabilities. However, further embodiments of the invention are applied to a wide range of different computing devices and a wide range of related pointing devices. For example, instead of a smartphone, a PDA, a desktop computer or a laptop could be used.
  • Instead of a touch-screen, a touch-pad could be used. Further still, instead of a single pointing device, multiple pointing devices could be used, including at least one mouse, joystick, touch-screen or touch-pad.
  • the state model may comprise only a subset of the states defined with respect to Figure 4.
  • an alternative state model comprises only the UP and DOWN states and not the OOR state.
  • Such a state model can be used to describe a mouse which in operation cannot reach an OOR state.
  • Another alternative state model comprises only the DOWN and OOR states and not the UP state.
  • Such a state model can be used to describe a touch-screen without a proximity sensing capability. Such touch-screens are only able to detect pointing instruments when they actually contact the screen and therefore these devices cannot detect an instrument which is in close proximity but is not contacting the screen.
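Reusing the illustrative State enum from the earlier Figure 4 sketch, these reduced state models could be expressed as per-device subsets. The device labels are invented for illustration only.

```python
# Which of the Figure 4 states a given class of pointing device can reach.
# State is the illustrative enum defined in the earlier sketch.
DEVICE_STATES = {
    "multi_touch_screen_with_proximity": {State.OOR, State.UP, State.DOWN},
    "mouse": {State.UP, State.DOWN},                             # never out of range
    "touch_screen_without_proximity": {State.OOR, State.DOWN},   # contact only
}
```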
  • a "computer-readable medium” may be any media or means that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer, with one example of a computer described and depicted in Figure 1.
  • a computer-readable medium may comprise a computer-readable storage medium that may be any media or means that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.
  • the different functions discussed herein may be performed in a different order and/or concurrently with each other. Furthermore, if desired, one or more of the above-described functions may be optional or may be combined.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention provides an emulator for a computing device. The computing device is capable of receiving at least one pointing signal from each of one or more pointing devices. Each pointing signal of the or each pointing device contains data relating to at least part of an instruction from a user of the or each pointing device to the computing device. The emulator comprises an input arranged to receive concurrently a plurality of pointing signals relating to at least one instruction. The emulator further comprises an emulated signal generator arranged to generate an emulated pointing signal in dependence on the received pointing signals according to a predefined set of emulation rules. The emulated pointing signal is arranged to imitate a single pointing signal from a single pointing device.

Description

METHOD OF AND APPARATUS FOR EMULATING INPUT
BACKGROUND TO EXAMPLE EMBODIMENTS OF THE INVENTION
Example embodiments of the invention relate to receiving multiple pointing signals from one or more pointing devices and generating a corresponding single emulated pointing signal.
Example embodiments of the invention relate to a method comprising: receiving concurrently a plurality of pointing signals in a computing device; and, generating an emulated pointing signal in dependence on the received pointing signals according to a predefined set of emulation rules, the emulated pointing signal imitating a single pointing signal from a single pointing device, wherein: one of the received pointing signals is selected to provide the emulated pointing signal and said selection is performed in dependence on the received pointing signals and according to the predefined set of emulation rules.
The received pointing signal selected to provide the emulated pointing signal may be changed in dependence on the received pointing signals and according to the predefined set of emulation rules while concurrent receipt of said pointing signals continues.
The method may further comprise controlling the computing device using the emulated pointing signal.
The received pointing signals may relate to at least one instruction and the emulated pointing signal may relate to the same at least one instruction.
The emulated pointing signal may relate to data of at least two of the received pointing signals.
Data of each pointing signal may comprise data relating to gestures of the user and the emulated pointing signal may relate to gestures of at least two of the received pointing signals. Data of each pointing signal may comprise data relating to gestures of a user, and the set of predefined emulation rules may be created according to a predefined set of interpretation rules which define how gestures are interpreted as instructions.
The method may further comprise providing the emulated pointing signal to a software application of the computing device, said software application being operable only with a single pointing signal from a single pointing device.
The computing device may receive the plurality of pointing signals from a touch-screen or touch-pad having a multi-touch capability.
The computing device may receive a single pointing signal from each of a plurality of pointing devices including at least one of the following pointing devices: a mouse, a joystick and a touch-screen or touch-pad.
An operating system of the computing device may generate the emulated pointing signal.
Example embodiments of the invention relate to an apparatus comprising: a processor, memory including computer program code, the memory and the computer program code configured to, with the processor, cause the apparatus at least to perform: receiving concurrently a plurality of pointing signals; and, generating an emulated pointing signal in dependence on the received pointing signals according to a predefined set of emulation rules, the emulated pointing signal imitating a single pointing signal from a single pointing device, wherein: one of the received pointing signals is selected to provide the emulated pointing signal and said selection is performed in dependence on the received pointing signals and according to the predefined set of emulation rules.
The received pointing signal selected to provide the emulated pointing signal may be changed in dependence on the received pointing signals and according to the predefined set of emulation rules while concurrent receipt of said pointing signals continues. The apparatus may further comprise a controller configured to control the computing device using the emulated pointing signal.
The received pointing signals may relate to at least one instruction and the emulated pointing signal may relate to the same at least one instruction.
The emulated pointing signal may relate to data of at least two of the received pointing signals.
Data of each pointing signal may comprise data relating to gestures of the user and the emulated pointing signal may relate to gestures of at least two of the received pointing signals.
Data of each pointing signal may comprise data relating to gestures of a user, and the set of predefined emulation rules may be created according to a predefined set of interpretation rules which define how gestures are interpreted as instructions.
The emulated pointing signal may be provided to a software application of the apparatus, said software application being operable only with a single pointing signal from a single pointing device.
The apparatus may receive the plurality of pointing signals from a touch-screen or touch-pad having a multi-touch capability.
The apparatus may receive a single pointing signal from each of a plurality of pointing devices including at least one of the following pointing devices: a mouse, a joystick and a touch-screen or touch-pad.
An operating system of the computing device may generate the emulated pointing signal.
Example embodiments of the invention relate to an apparatus comprising: one or more input devices configured to generate corresponding pointing signals, a processor, memory including computer program code, the memory and the computer program code configured to, with the processor, cause the apparatus at least to perform: production of an emulated pointing signal by selection of one of said pointing signals as said emulated pointing signal.
The emulated pointing signal may be produced by first selecting a first pointing signal as said emulated pointing signal and thereafter selecting a second of said pointing signals as said emulated pointing signal.
Example embodiments of the invention relate to a computer program comprising: code for concurrently processing a plurality of pointing signals; and code for producing an emulated pointing signal by selection of one of said pointing signals as said emulated pointing signal, when the computer program is run on a processor.
The computer program may be a computer program product comprising a computer-readable medium bearing computer program code embodied therein for use with a computer.
Example embodiments of the invention relate to a computer readable medium encoded with instructions that, when executed by a computer, perform: receiving a plurality of pointing signals from one or more input devices; production of an emulated pointing signal by selection of one of said pointing signals as said emulated pointing signal.
BRIEF DESCRIPTION OF THE DRAWINGS
Example embodiments of the present invention will now be described by way of example only and by reference to the accompanying drawings in which:
Figure 1 is a representation of a smartphone computing device;
Figure 2 is a schematic of some of the internal elements of the smartphone of Figure 1;
Figure 3 is another schematic of some of the internal elements of the smartphone of Figure 1;
Figure 4 is a state diagram of a pointing device of the smartphone of Figure 1;
Figure 5 is a schematic of some of the internal elements of the smartphone of Figure 1;
Figure 6 is a flow diagram defining a set of emulation rules for providing an emulated pointing signal; and
Figures 7 to 12 are flow diagrams defining sets of emulation rules for providing an emulated pointing signal.
DESCRIPTION OF EXAMPLE EMBODIMENTS
Figure 1 represents a smartphone 2 according to an example embodiment which comprises a keypad 4, a touch-screen 6, a microphone 8, a speaker 10 and an antenna 12. The touchscreen 6 of this embodiment provides a pointing device of the smartphone 2. The smartphone 2 of this embodiment is capable of being operated by a user to perform a variety of different functions, such as, for example, hosting a telephone call, browsing the internet or sending an email.
Although the embodiment of Figure 1 is a mobile computing device, further embodiments of the invention extend to computing devices which are not mobile and those which do not function primarily, or at all, as communication devices.
Figure 2 shows a schematic view of some of the internal hardware elements of the smartphone 2 of this embodiment. With reference to Figure 2, the smartphone 2 of the example embodiment comprises hardware to perform telephony functions, together with an application processor 108 and corresponding support hardware to enable the phone to have other functions which are desired of a smartphone, such as messaging, internet browsing, email functions and the like. In the embodiment of Figure 2, the telephony hardware is represented by the RF processor 102, which provides an RF signal to the antenna 12 for the transmission of telephony signals and processes RF signals received by the antenna 12. Additionally, a baseband processor 104 provides signals to and receives signals from the RF processor 102 in this embodiment. The baseband processor 104 of this embodiment also interacts with a subscriber identity module 106 in a known manner.
The keypad 4 and the touch-screen 6 of this embodiment are controlled by an application processor 108. A power and audio controller 109 is provided in this embodiment to supply power from a battery (not shown) to the telephony subsystem, the application processor 108, and the other hardware. Additionally, the power and audio controller 109 of this embodiment also controls input from the microphone 8, and audio output via the speaker 10.
In order for the application processor 108 to operate, various different types of memory are provided in this embodiment. Firstly, the smartphone 2 here includes Random Access Memory (RAM) 112 connected to the application processor 108 into which data and program code can be written and read from at will. Code placed anywhere in RAM 112 of this embodiment can be executed by the application processor 108 from the RAM 112. RAM 112 represents a volatile memory of the smartphone 2.
Secondly, the smartphone 2 of this embodiment is provided with a long-term storage 114 connected to the application processor 108. The long-term storage 114 of this embodiment comprises three partitions: an operating system (OS) partition 116, a system partition 118 and a user partition 120. The long-term storage 114 represents a non-volatile memory of the smartphone 2 of this embodiment.
In the present example embodiment, the OS partition 116 includes an operating system. An operating system is necessary in this embodiment in order for the application processor 108 to operate and therefore, the operating system is initialised when the smartphone 2 is first switched on. Generally speaking, it is the role of the operating system to manage hardware and software resources of the computing device. In this embodiment, these resources include the application processor 108, the RAM 112, and the long-term storage 114. As such, the operating system of this embodiment helps to provide a stable, consistent way for software applications running on the smartphone 2 to deal with the hardware resources of the smartphone 2 without the application needing to know all the details of the physical resources available to the hardware. Other computer programs may also be stored on the long-term storage 114 of this embodiment, such as application programs, and the like. In particular, application programs which are mandatory to the device, such as, in the case of a smartphone, communications applications and the like, are stored in the system partition 118 of this embodiment. The application programs stored on the system partition 118 of this embodiment are those which are bundled with the smartphone by the device manufacturer when the phone is first sold. Application programs which are added to the smartphone by the user are stored in the user partition 120 of this embodiment.
As stated above, the representation of Figure 2 is schematic. In practice, the various functional components illustrated may be combined into one and the same component. For example, the long-term storage 114 may comprise NAND flash, NOR flash, a hard disk drive or a combination of these. Furthermore, the processes and functionality of the depicted processors may be fulfilled by more or fewer processors.
Figure 3 shows another schematic diagram of the smartphone 2 of this embodiment which depicts certain of the hardware components mentioned above with respect to Figure 2. More particularly, Figure 3 shows an operating system (OS) 122 in communication with the application processor 108. Figure 3 also shows the touch-screen 6 and the various memory elements of the smartphone 2, as described above. As mentioned previously, the OS 122 is stored on the OS partition 116 and controls the operation of the application processor 108 in this embodiment. In particular, the OS 122 controls the application processor 108 to give software stored on the memory elements 112 and 114 access to the input devices, such as the touch-screen 6 of this embodiment.
During operation, it is often necessary for the software of the smartphone 2 to receive commands or instructions from a user (not shown) of the smartphone 2. In some circumstances such instructions can be specifically requested by the software from the user; at other times, such instructions are issued by the user without being specifically requested. The user may provide instructions to the software using one or more of the input devices of the smartphone 2, such as the keypad 4, the touch-screen 6, or the microphone 8. However, if a pointing device input is required by the software then the user must use the touch-screen 6 as this is the only pointing device of the smartphone 2 of this embodiment. When the user controls the touch-screen 6 of this embodiment to provide an instruction, the application processor 108 (controlled by the OS 122) receives the pointing signal relating to the user's instruction from the touch-screen 6 for provision to the relevant software at the appropriate place in memory. For example, if the pointing signal has been provided for a software application stored on the user partition 120, the application processor 108 receives the pointing signal from the touch-screen 6 for provision to the user partition 120. It is also the case that the software of the smartphone 2 of this embodiment can issue instructions to the touch-screen 6 via the application processor 108. For example, such instructions may result in displaying a particular window or dialogue box on the touch-screen 6, or in highlighting a selection in response to a previously received gesture from the user.
The touch-screen 6 of this embodiment has a multi-touch capability and therefore can recognise more than one finger (or other pointing instrument) simultaneously contacting the touch-screen 6. Accordingly, the touch-screen 6 is a multipoint device. The touch-screen 6 of this embodiment is capable of providing a multipoint signal comprising one pointing signal for each finger (or other interface device) concurrently contacting the screen. In addition, the touch-screen 6 of this embodiment is provided with a proximity capability which enables the touch-screen 6 to recognise not only a finger contacting its screen but also a finger close to the screen. Accordingly, the touch-screen 6 of this embodiment is capable of providing a multipoint signal comprising multiple pointer signals wherein each pointer signal corresponds to a different finger concurrently contacting, or in close proximity to, the screen.
The touch-screen 6 of this embodiment can also identify the pressure with which a finger (or other pointing instrument) contacts the screen. Therefore, for each finger contacting the screen, the touch-screen 6 provides a pressure rating for the pointing signal which corresponds to the pressure that the finger applies to the screen.
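Purely by way of illustration, the per-finger information described above can be pictured as a small data structure. The following Python sketch is not part of the described embodiments; the class and field names (PointingSignal, MultipointSignal, pointer_id, pressure and so on) are assumptions introduced only to make the shape of a multipoint signal concrete.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class PointingSignal:
    pointer_id: int                   # one pointing signal per finger or other interface device
    state: str                        # "OOR", "UP" or "DOWN", per the state model of Figure 4
    location: Tuple[int, int]         # screen coordinates where the finger is (or was last) detected
    pressure: Optional[float] = None  # contact pressure; only meaningful while contacting the screen

@dataclass
class MultipointSignal:
    pointers: List[PointingSignal]    # one entry for each concurrently detected finger
```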
Figure 4 depicts a state model of the touch-screen 6 of this embodiment which defines its operation in relation to gestures of a single finger (or other pointing instrument or interface device). Corresponding state models exist for each of the pointing signals which the touch-screen 6 of this embodiment is capable of generating. In Figure 4, the following states are defined: an initial state 200, an OutOfRange (OOR) state 202, an UP state 204 and a DOWN state 206. Connecting each state to other states in this embodiment are labelled arrows, each of which represents an event. In summary, the touch-screen 6 of this embodiment can be in any one of the defined states and changes state when an event corresponding to one of the arrows occurs, as described below.
The touch-screen 6 of this embodiment starts in the initial state 200 and changes to the OOR state 202 when the touch-screen 6 is first operated, for example, when the smartphone 2 is turned on. The OOR state 202 of this embodiment represents the state where a finger is not detected by the touch-screen 6. If a finger moves into detection range of the touch-screen 6 of this embodiment, this represents a MOVE event and causes a state change from the OOR state 202 to the UP state 204. Each MOVE event is included in the pointing signal corresponding to the detected finger (or other interface device) together with locations corresponding to the physical locations of the detected finger in relation to the screen. Importantly, the UP state 204 represents a state where a finger is in detectable range but is not contacting the touch-screen 6 of this embodiment. If the detected finger contacts the screen, this represents a DOWN event and causes a state change from the UP state 204 to the DOWN state 206 in this embodiment. Each DOWN event is included in the pointing signal corresponding to the detected finger (or other interface device) together with a location corresponding to the initial contact point between the finger and the screen.
If the finger changes position while it is in the DOWN state 206 (i.e. while it is still touching the screen), this represents a DRAG event. Each DRAG event of this embodiment is included in the pointing signal corresponding to the detected finger together with locations corresponding to the various physical locations of the detected finger on the screen. If the finger moves out of range while still in the DOWN state 206, this represents an OOR event in this embodiment and causes a state change from the DOWN state 206 to the OOR state 202. Each OOR event is included in the pointing signal corresponding to the finger together with the physical location where the finger was last detected in the DOWN state 206 in this embodiment.
Alternatively, if the finger is moved so that it no longer contacts the screen but is still within detectable range, this represents an UP event and the state changes from the DOWN state 206 back to the UP state 204. Each UP event of this embodiment is included in the pointing signal corresponding to the detected finger together with the physical location on the screen where the finger lost contact. While in the UP state 204, if the finger changes location with respect to the touch-screen 6, MOVE events are issued as appropriate in this embodiment. From the UP state 204, if the finger moves out of range, or in other words, out of detectable range of the touch-screen 6, this represents an OOR event in this embodiment and causes a state change from the UP state 204 to the OOR state 202. Each OOR event of this embodiment is included in the pointing signal corresponding to the finger together with the physical location where the finger was last detected.
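To make the state model of Figure 4 concrete, the following Python sketch enumerates the states and events described above and the transitions between them. It is an illustrative rendering only; the names State, Event, TRANSITIONS and next_state are assumptions and do not appear in the described embodiments.

```python
from enum import Enum, auto

class State(Enum):
    INITIAL = auto()  # start state; becomes OOR when the touch-screen is first operated
    OOR = auto()      # finger not detected
    UP = auto()       # finger in detection range but not contacting the screen
    DOWN = auto()     # finger contacting the screen

class Event(Enum):
    MOVE = auto()  # position change while not contacting the screen
    DOWN = auto()  # finger makes contact with the screen
    DRAG = auto()  # position change while contacting the screen
    UP = auto()    # finger lifts off but stays within detection range
    OOR = auto()   # finger moves out of detection range

# Per-finger transitions corresponding to the labelled arrows of Figure 4.
TRANSITIONS = {
    (State.OOR, Event.MOVE): State.UP,
    (State.UP, Event.MOVE): State.UP,
    (State.UP, Event.DOWN): State.DOWN,
    (State.UP, Event.OOR): State.OOR,
    (State.DOWN, Event.DRAG): State.DOWN,
    (State.DOWN, Event.UP): State.UP,
    (State.DOWN, Event.OOR): State.OOR,
}

def next_state(current: State, event: Event) -> State:
    """Return the new per-finger state after an event; unknown pairs leave the state unchanged."""
    return TRANSITIONS.get((current, event), current)
```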
According to the above described operation of the smartphone 2 and the touch screen 6 of this embodiment, the OS 122 (via the application processor 108) maintains an up-to-date state model for each finger (or other interface device) detected by the touch-screen 6. Each time a multipoint signal is received from the touch-screen 6 by the smartphone 2 of this embodiment, the state model(s) relating to each of the pointing signals of the multipoint signal are updated appropriately.
As mentioned previously, if a multipoint signal from the touch-screen 6 of this embodiment is provided to software of the smartphone 2 which is only designed to operate with a single pointing signal (i.e. a signal from a single-pointer device) then the logic of the software may cause the smartphone 2 to malfunction or fail to function at all.
In the embodiment of Figure 3, emulation of a single pointing signal from multiple pointing signals is performed by the operating system 122.
Advantageously, where the operating system is directly involved in the emulation, standards to determine the emulation may be prescribed which may suit certain hardware and/or software arrangements, providing more efficient emulation.
In the embodiment depicted in Figure 5 an emulator 124 is connected in between the OS 121 and the various memory elements 112 to 120. In operation, the emulator 124 of this embodiment receives data from the OS 121 and provides data to the various memory elements 112 to 120. More specifically, an input of the emulator 124 of this embodiment receives data from the OS 121 in the form of a multipoint signal which the OS 121 receives via the application processor 108 from the touch-screen 6. An emulated signal generator of the emulator 124 of this embodiment is then capable of generating a corresponding single pointing signal (hereinafter called an 'emulated pointing signal') in dependence on the received multipoint signal. The emulated pointing signal of this embodiment provides instructions which correspond to the instructions provided by the received multipoint signal. The emulated pointing signal can then be accessed from the emulator 124 by software running on any memory element of the smartphone 2 of this embodiment. The emulated pointing signal would be used by single pointer software of the smartphone 2 of this embodiment. In such cases, the single pointer software would disregard multipoint signals provided by the application processor 108 and instead only concern itself with the emulated pointing signal provided by the emulator 124. In the embodiment of Figure 3, the operating system 122 would provide similar functionality.
In example embodiments, the received pointing signals relate to at least one instruction and the emulated pointing signal relates to the same at least one instruction. It is an advantage of these embodiments that the emulated pointing signal provides the same essential information as the plurality of received pointing signals so that the emulated pointing signal can be used to replace the plurality of received pointing signals effectively.
In view of the fact that a multipoint signal can comprise multiple pointing signals and the emulated pointing signal of embodiments of the invention comprises a single pointing signal, the multipoint signal provides a richer information bearer than the emulated pointing signal. Accordingly, the information carried by the multipoint signal must be interpreted by the emulator 124 of the embodiment of Figure 5 so that the corresponding emulated pointing signal contains at least the essential information of the multipoint signal. Some of the information carried by the multipoint signal will not be contained in the emulated pointing signal.
Assessing which information of the multipoint signal is essential and should be incorporated into the emulated pointing signal is one of the functions of the emulator 124 of this embodiment or the operating system 122 of the embodiment of Figure 3. To perform this function, the emulator 124 of this embodiment interprets the multipoint signal according to a set of predefined rules called 'gesture interpretations'. More specifically, the gesture interpretations define how to interpret the user's gestures as defined by the multipoint signal in order to establish which pointing signal or pointing signals of the multipoint signal represent the user's intended instruction. Accordingly, the gesture interpretations define how to interpret the pointing signals of the multipoint signal to help identify which one of the pointing signals to use as the emulated pointing signal. According to this example embodiment, the emulator 124 applies the following gesture interpretations. The first gesture interpretation relates to 'touching' events, i.e. when a user moves a finger from the UP state 204 to the DOWN state 206, and the second gesture interpretation relates to 'lifting' events, i.e. when a user moves a finger from the DOWN state 206 to the UP state 204. In this embodiment, the following rules are applied:
Touching: whenever the user touches the screen with a new finger, the focus is moved to the signal generated by this finger. The action performed with the previous finger should be cancelled or ignored. The action performed with the new finger should be performed.
Lifting: if the user lifts the current finger and there are other fingers touching the screen, the focus should be moved to one of the other touching fingers (as considered below).
According to the above gesture interpretations the emulator 124 of this embodiment is able to receive a multipoint signal comprising multiple pointing signals and interpret that multipoint signal to identify which pointing signal to use as the emulated pointing signal.
However, the gesture interpretations alone do not always produce a signal which can be considered the emulated pointing signal. Therefore, a second set of predefined rules, called 'emulation rules', is created based on the gesture interpretations and it is the emulation rules which are applied directly to a multipoint signal in order to generate the emulated pointing signal in this embodiment.
Figure 6 represents the emulation rules of this embodiment in a flow diagram. Before describing the emulation rules with reference to Figure 6, it is important to note the following operation of the emulator 124 of this embodiment. When a multipoint signal is provided by the touch-screen 6, one of the pointing signals of the multipoint signal may be designated as the 'primary' pointing signal. The primary pointing signal then provides the emulated pointing signal. All events from pointing signals of the multipoint signal, other than the primary pointing signal, are not included in the emulated pointing signal. The only exception to this is for some transition events from non-primary pointing signals which are included in the emulated pointing signal when the primary pointing signal is changed. The flow diagram of Figure 6 defines this operation in more detail.
The flow diagram of Figure 6 begins at block 302 wherein the emulator 124 of this embodiment identifies if a primary pointing signal of the multipoint signal is assigned. If a primary pointing signal is not assigned then processing flows to block 304; alternatively, if a primary pointing signal is assigned then processing flows to block 308, which will be discussed later. At block 304, the emulator 124 of this embodiment determines if a multipoint signal comprising at least one pointing signal is present. Practically, in this block the emulator 124 of this embodiment determines if one or more fingers are detected by the touch-screen 6. If a multipoint signal comprising at least one pointing signal is detected (i.e. at least one finger is detected) then processing flows to block 306 where the first pointing signal of the multipoint signal detected is designated as the primary pointing signal. Alternatively, if a multipoint signal comprising at least one pointing signal is not detected at block 304 of this embodiment (i.e. no fingers are detected by the touch-screen 6) then processing waits at block 304 until one is detected, after which processing flows to block 306. In either case, once a pointing signal of the multipoint signal has been designated primary at block 306, processing flows to block 308 of this embodiment.
It is an advantage of this embodiment that the received pointer signal which provides the emulated pointer signal is switched when it becomes less suitable for providing the emulated pointing signal than a different received pointing signal. This dynamic operation ensures that the emulated pointer signal always provides the most relevant information from the plurality of received pointing signals.
According to the above, if processing of this embodiment reaches block 308 then a pointing signal of the multipoint signal is designated as the primary pointing signal. Accordingly, the emulated pointing signal will be set equal to the primary pointing signal. Whether or not the pointing signal designated as primary is maintained as the primary pointing signal in this embodiment will depend on the behaviour of the finger relating to the primary pointing signal and the behaviour of any other fingers which are detected by the touch-screen 6. More specifically, at block 308 the emulator 124 of this embodiment detects if any finger moves into the DOWN state, other than the finger corresponding to the primary pointing signal. In order to perform this operation, the emulator 124 of this embodiment detects if any pointing signal issues a DOWN event, other than the primary pointing signal.
If the emulator 124 does not detect a DOWN event from a non-primary pointing signal at block 308, processing flows to block 314, which will be discussed later. Alternatively, if the emulator 124 of this embodiment does detect a DOWN event from a non-primary pointing signal at block 308 then processing flows to block 310. At block 310 of this embodiment the emulator detects the current state of the finger represented by the primary pointing signal. If the finger represented by the primary pointing signal is in the DOWN state then processing flows from block 310 back to block 308. Alternatively, if the finger represented by the primary pointing signal is in any state other than the DOWN state, processing of this embodiment flows from block 310 to block 312. At block 312, the emulator 124 of the embodiment demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal which issued the DOWN event at block 308 to primary status (in other words, the "focus" is changed). In addition to changing which pointing signal is designated as primary, the emulator 124 of this embodiment also includes, in the emulated signal, the DOWN event that was issued by the new primary pointing signal prior to it becoming the primary pointing signal. Thereafter, the emulated pointing signal matches the new primary pointing signal and processing of this embodiment flows from block 312 back to block 308.
As mentioned above, if the emulator 124 does not detect a DOWN event from a non-primary pointing signal at block 308, processing of this embodiment flows from block 308 to block 314. At block 314, the emulator 124 of this embodiment detects if the finger represented by the primary pointing signal moves to an OOR state based on whether an OOR event is issued by the primary pointing signal. If the emulator 124 does not detect an OOR event in the primary pointing signal, processing flows back to block 308. Alternatively, if the emulator 124 of this embodiment detects an OOR event in the primary pointing signal, processing flows to block 316. At block 316, the emulator 124 includes the OOR event issued at block 314 in the emulated pointing signal, then demotes the primary pointing signal from its primary status and leaves the primary status unassigned. Processing then flows from block 316 back to block 302, as discussed above.

The operation of this example embodiment, as described above with respect to Figure 6, can be modified as follows in a manner which is not shown by Figure 6. Operation of this embodiment is identical to the above embodiment with the following alterations. After processing flows from block 316 to block 302 and then to block 304, no pointing signal which was detected by the emulator 124 when the primary status was unassigned at block 316 is promoted to primary status. Instead, at blocks 304 and 306, only a new pointing signal which starts to be detected after processing at block 316 has finished is promoted to primary status. For example, suppose a pointing signal which is detected by the emulator 124 when the primary status is unassigned at block 316 moves out of range and therefore ceases to be detected. Very soon after processing at block 316 has finished, the same pointing signal of this example is then detected again by the emulator 124 as it moves back into range. In this case, the pointing signal qualifies as a new pointing signal when it moves back into range, and after that time it is eligible for promotion to primary status.
Based on the above description of the flow diagram of Figure 6, it can be seen that some blocks of this embodiment can be grouped together based on the operation they are associated with. More specifically, those blocks in a rectangular box, i.e. blocks 306, 312 and 316, indicate instructions which are performed by the emulator 124 of this embodiment. Alternatively, blocks 302, 304 and 310 indicate tests wherein the emulator examines the current state of an element, for example, the current state of the finger represented by the primary pointing signal in block 310. Alternatively, blocks 308 and 314 indicate tests wherein the emulator reacts to an event, for example, the receipt of a DOWN event from a non-primary pointing signal in block 308. Accordingly, while the primary pointer signal is assigned and unchanged, processing of the emulator 124 with respect to Figure 6 flows in a loop consisting of blocks 308 and 314. Processing only breaks from this loop when an event that is tested for by either block 308 or 314 occurs.
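As an informal summary of the emulation rules of Figure 6, the following Python sketch shows how a primary pointing signal might be assigned, switched and cleared, and which events reach the emulated pointing signal. It is a simplified illustration under assumed names (Figure6Emulator, on_event, the event strings and the states mapping); it is not the implementation of the described embodiment.

```python
from typing import Dict, List, Optional, Tuple

class Figure6Emulator:
    """Sketch of the Figure 6 emulation rules: one pointing signal is designated
    'primary' and, apart from a few transition events, only its events reach the
    emulated pointing signal."""

    def __init__(self) -> None:
        self.primary_id: Optional[int] = None  # id of the pointing signal currently designated primary
        self.emulated: List[dict] = []         # events forwarded as the emulated pointing signal

    def on_event(self, pointer_id: int, event: str,
                 location: Tuple[int, int], states: Dict[int, str]) -> None:
        # Blocks 302 to 306: with no primary assigned, the first detected pointing signal becomes primary.
        if self.primary_id is None:
            self.primary_id = pointer_id

        if pointer_id == self.primary_id:
            # Blocks 314 and 316: an OOR event from the primary is forwarded and the primary status cleared.
            if event == "OOR":
                self.emulated.append({"event": "OOR", "location": location})
                self.primary_id = None
            else:
                # Otherwise the primary pointing signal simply provides the emulated pointing signal.
                self.emulated.append({"event": event, "location": location})
            return

        # Blocks 308 to 312: a DOWN event from a non-primary signal moves the focus,
        # but only if the finger behind the current primary signal is not itself DOWN.
        if event == "DOWN" and states.get(self.primary_id) != "DOWN":
            self.primary_id = pointer_id
            self.emulated.append({"event": "DOWN", "location": location})
        # All other events from non-primary pointing signals are not included.
```

For example, feeding this sketch a DOWN event from a second finger while the first (primary) finger is still in the DOWN state leaves the focus unchanged, consistent with the test at block 310.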
According to the above described operation of the emulator 124, the emulated pointing signal of this embodiment is defined as one of the pointing signals of the multipoint signal received by the emulator 124 from the touch-screen 6. Accordingly, the software of the smartphone 2, in particular the single pointer software of the smartphone 2, can receive the emulated pointing signal in preference to the multipoint signal which may cause the software and smartphone 2 to malfunction. It is an advantage of certain embodiments that legacy and new single pointer software can operate on the smartphone 2. Additionally, it is an advantage that the same emulated pointing signal can be used by each software component of the smartphone 2, irrespective of where each software component is stored in the memory of the smartphone 2. This is particularly beneficial as it means that the same approach to emulation may be adopted for all software applications which use the emulated pointing signal. A further benefit is that the development of new single pointer software is simplified because software developers do not need to consider how to handle multipoint signals. Instead, they can simply build new single pointer software to use the emulated pointer signal.
Various modifications can be made to the example embodiments described above to provide alternative embodiments which are also covered by the scope of the appended claims. For example, the emulation rules applied by the emulator of the present invention can be modified. Further embodiments of the present invention will now be described with reference to Figures 7 to 12.
Figure 7 shows an embodiment incorporating a first set of alternative emulation rules which have been created based on the same gesture interpretations as described above. A difference between the emulation rules of Figure 7 and those of the above embodiments is the addition of a block 400 in between the block 310 and the block 308. The following describes how this new block is integrated into the flow diagram of the embodiment of Figure 7. At block 310, the emulator 124 of this embodiment detects the current state of the finger represented by the primary pointing signal. If the finger is in a DOWN state, processing flows from block 310 to block 400 and not back to block 308. At block 400, the emulator 124 demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal which issued the DOWN event at block 308 to primary status. The emulator 124 of this embodiment also includes in the emulated pointing signal the first UP event sent by either the old primary signal or the new primary signal. However, in either case, the emulator of this embodiment includes this UP event as issued by the new primary pointing signal. The emulator then includes in the emulated pointing signal the DOWN event that was issued by the new primary pointing signal at block 308 of this embodiment. Thereafter, the emulated pointing signal matches the primary pointing signal and processing flows from block 400 back to block 308 of this embodiment.

Figure 8 shows a second set of alternative emulation rules which have been created based on the same lifting gesture interpretation as the above described embodiment and the following alternative touching gesture interpretation.
Touching: whenever the user touches the screen with a new finger, the focus is moved to the signal generated by this finger. The action performed with the previous finger should be continued at the location of the new finger.
A difference between the emulation rules of Figure 8 and those of the embodiment of Figure 6 is the addition of a block 402 in between the block 310 and the block 308. The following describes how this new block is integrated into the flow diagram of Figure 8. At block 310 of this embodiment, the emulator 124 detects the current state of the finger represented by the primary pointing signal. If the finger is in a DOWN state then processing of this embodiment flows from block 310 to block 402 and not back to block 308. At block 402, the emulator 124 demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal which issued the DOWN event at block 308 of this embodiment. The emulator 124 also includes in the emulated pointing signal a MOVE event specifying a move from the last position of the old primary pointing signal to the current position of the new primary pointing signal of this embodiment. Thereafter, the emulated pointing signal matches the new primary pointing signal and processing flows from block 402 back to block 308 in this embodiment.
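A minimal sketch of the block 402 behaviour, assuming the same illustrative event representation as above: the focus change is bridged by a single MOVE event from the old primary pointing signal's last position to the new primary pointing signal's current position. The function name and dictionary keys are assumptions.

```python
from typing import List, Tuple

def handover_with_move(emulated: List[dict],
                       old_last_position: Tuple[int, int],
                       new_position: Tuple[int, int]) -> None:
    """Block 402 (sketch): continue the previous finger's action at the new finger's
    location by including one bridging MOVE event in the emulated pointing signal."""
    emulated.append({
        "event": "MOVE",
        "from": old_last_position,  # last position of the old primary pointing signal
        "to": new_position,         # current position of the new primary pointing signal
    })
    # Thereafter the emulated pointing signal simply follows the new primary pointing signal.
```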
Figure 9 shows an embodiment incorporating a third set of alternative emulation rules which have been created based on the same lifting gesture interpretation as the embodiment of Figure 6 and the following alternative touching gesture interpretation.
Touching: the user's focus is always on the first finger that touched the screen and does not change until this finger is lifted.
Differences between the emulation rules of Figure 9 and those of the embodiment of Figure 6 are the substitution of blocks 308 to 312 with new blocks 404 and 406. The following describes how these new blocks are integrated into the flow diagram of Figure 9. At block 302, if a primary pointing signal is assigned then processing flows to block 404 and not to block 308. At block 404, the emulator 124 of this embodiment detects if each of the fingers represented by all of the detected pointing signals is in the UP state and one of the non-primary pointing signals issues a DOWN event. If the conditions of block 404 are not true then processing of this embodiment flows to block 314, which was discussed above with reference to the embodiment of Figure 6 (a difference being that processing flows to block 404 rather than block 308 if the test is false). Alternatively, if the conditions of block 404 are true then processing of this embodiment flows from block 404 to block 406. At block 406, the emulator 124 of this embodiment demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal which issued the DOWN event at block 404. The emulator 124 of this embodiment also includes in the emulated pointing signal the DOWN event that was issued by the new primary pointing signal at block 404. Thereafter, the emulated pointing signal matches the new primary pointing signal and processing of this embodiment flows from block 406 back to block 404.
Figure 10 shows an embodiment incorporating a fourth set of alternative emulation rules which have been created based on the same gesture interpretations as the embodiment of Figure 6. Differences between the emulation rules of Figure 10 and those of the embodiment of Figure 6 are the addition of new blocks 408 to 424.
The following describes how blocks 408 to 412 of this embodiment are integrated into the flow diagram of Figure 10. At block 308, if the emulator 124 of this embodiment does not detect a DOWN event from a non-primary pointing signal then processing flows to block 408 and not block 314. At block 408, the emulator 124 of this embodiment detects if the primary pointing signal sends an UP event. If the primary pointing signal does not issue an UP event, processing flows to block 314 which is discussed above with reference to the embodiment of Figure 6. Alternatively, if the primary pointing signal does issue an UP event then processing flows to block 410. At block 410, the emulator 124 of this embodiment detects if each finger relating to each non-primary pointing signal is in the DOWN state. If any of such fingers are in a DOWN state then processing flows from block 410 to block 412, alternatively, processing flows from block 410 back to block 308, which is discussed above with reference to the preferred embodiment. At block 412, the emulator 124 of this embodiment demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal relating to the closest finger in a DOWN state. More specifically, the closest finger in this embodiment is taken to be the finger in closest proximity to the touch-screen 6 or, if more than one finger is contacting the touch-screen 6, the finger contacting with the greatest pressure. The emulator 124 then includes in the emulated pointing signal a MOVE event specifying a move from the last recorded position of the old primary pointing signal to the current position of the new primary pointing signal in this embodiment. Thereafter, the emulated pointing signal of this embodiment matches the new primary pointing signal and processing flows from block 412 back to block 308.
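The notion of the 'closest' finger used at block 412 can be sketched as follows, assuming each detected pointing signal carries its state, a contact pressure while DOWN and a sensed distance from the screen while hovering (the 'distance' field is an assumption; the described touch-screen reports proximity but the specification does not name such a value).

```python
from typing import Iterable, Optional

def closest_finger(pointers: Iterable[dict]) -> Optional[dict]:
    """Sketch of the block 412 selection: prefer fingers contacting the screen, ranked by
    pressure; otherwise pick the hovering finger in closest proximity to the screen."""
    pointers = list(pointers)
    contacting = [p for p in pointers if p.get("state") == "DOWN"]
    if contacting:
        # If more than one finger is contacting the touch-screen, the one applying
        # the greatest pressure is taken to be the closest.
        return max(contacting, key=lambda p: p.get("pressure", 0.0))
    hovering = [p for p in pointers if p.get("state") == "UP"]
    if hovering:
        # Otherwise the finger in closest proximity to the touch-screen is the closest.
        return min(hovering, key=lambda p: p.get("distance", float("inf")))
    return None
```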
The following describes how the new blocks 414 to 424 are integrated into the flow diagram of Figure 10. At block 314 of this embodiment, if the primary pointing signal issues an OOR event, processing flows to block 414 and not block 316. At block 414, the emulator 124 of this embodiment detects if any non-primary pointing signals are present. If no non-primary pointing signals are present then processing of this embodiment flows back to block 316, which is discussed above with reference to the embodiment of Figure 6. Alternatively, if at least one non-primary pointing signal is present, processing of this embodiment flows from block 414 to block 416. Block 416 is a test block wherein the emulator 124 of this embodiment examines if the state of the finger represented by the primary pointing signal was in a DOWN state before the OOR event was sent at block 314. Also at block 416, the emulator of this embodiment examines if the state of the closest finger is in the DOWN state. If neither condition is true, processing flows from block 416 to block 420, which is discussed below. Alternatively, if both conditions are true, processing in this embodiment flows from block 416 to block 418. At block 418, the emulator 124 of this embodiment demotes the current primary pointing signal from its primary status and promotes to primary status the non-primary pointing signal representing the closest finger. The emulator 124 of this embodiment also includes in the emulated pointing signal a DRAG event specifying a drag from the last recorded position of the old primary pointing signal to the current position of the new primary pointing signal. Thereafter, the emulated pointing signal of this embodiment matches the new primary pointing signal and processing flows from block 418 back to block 308.
As mentioned above processing flows from block 416 to block 420 if neither condition specified in block 416 is true. Block 420 of this embodiment provides another test block wherein the emulator 124 examines if the state of the finger represented by the primary pointing signal was in a DOWN state before the OOR event was sent at block 314. Also at block 420, the emulator 124 of this embodiment examines if the state of the closest finger is in the UP state. If both conditions are true then processing in this embodiment flows to block 422, otherwise processing flows to block 424. At block 422, an UP event is included in the emulated pointing signal from the current primary pointing signal. The emulator 124 of this embodiment then demotes the current primary pointing signal from its primary status and promotes to primary status the non-primary pointing signal representing the closest finger. The emulator 124 of this embodiment then includes a MOVE event in the emulated pointing signal specifying a move from the last position of the old primary pointing signal to the current position of the new primary pointing signal of this embodiment. Thereafter, the emulated pointing signal of this embodiment matches the new primary pointing signal and processing flows from block 422 back to block 308. At block 424, the emulator 124 of this embodiment demotes the current primary pointing signal from its primary status and promotes to primary status the non-primary pointing signal representing the closest finger. The emulator 124 then includes a MOVE event in the emulated pointing signal of this embodiment specifying a move from the last position of the old primary pointing signal to the current position of the new primary pointing signal. Thereafter, the emulated pointing signal of this embodiment matches the new primary pointing signal and processing flows from block 424 back to block 308.
Figure 11 shows an embodiment incorporating a fifth set of alternative emulation rules which have been created based on the same touching gesture interpretation as the embodiment of Figure 6 and the following alternative lifting gesture interpretation.
[Table: alternative lifting gesture interpretation]
Differences between the emulation rules of Figure 11 and those of Figure 10 are that blocks 412 and 418 are replaced by blocks 426 and 428, respectively. At block 426, the emulator 124 of this embodiment demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal relating to the closest finger in the DOWN state. The emulator 124 of this embodiment then includes in the emulated pointing signal the first UP event sent by either the old primary or the new primary signals. However, in either case, the emulator of this embodiment includes this UP event as issued by the new primary pointing signal. The emulator 124 of this embodiment then generates and includes in the emulated pointing signal a DOWN event from the new primary pointing signal. Thereafter, the emulated pointing signal of this embodiment matches the primary pointing signal and processing flows from block 426 back to block 308.
At block 428, the emulator 124 of this embodiment demotes the current primary pointing signal from its primary status and promotes to primary status the non-primary pointing signal representing the closest finger. The emulator 124 of this embodiment also includes in the emulated pointing signal an UP event from the new primary pointing signal. The emulator 124 then generates and includes in the emulated signal of this embodiment a DOWN event from the new primary pointing signal. Thereafter, the emulated pointing signal of this embodiment matches the new primary pointing signal and processing flows from block 428 back to block 308.
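By contrast with the MOVE-based handover of Figure 8, the Figure 11 rules express the focus change as a lift followed by a fresh touch. The sketch below illustrates the block 428 case under the same assumed event representation; block 426 differs in that it waits for the first UP event from either the old or the new primary signal before the generated DOWN event, which the sketch does not attempt to model.

```python
from typing import List, Tuple

def handover_with_up_down(emulated: List[dict], position: Tuple[int, int]) -> None:
    """Block 428 (sketch): the focus change appears in the emulated pointing signal as an
    UP event followed by a generated DOWN event, both attributed to the new primary signal."""
    emulated.append({"event": "UP", "location": position})    # UP from the new primary pointing signal
    emulated.append({"event": "DOWN", "location": position})  # generated DOWN from the new primary signal
```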
Figure 12 shows an embodiment incorporating a sixth set of alternative emulation rules which have been created based on the same touching gesture interpretation as the embodiment of Figure 6 and the following alternative lifting gesture interpretation.
[Table: alternative lifting gesture interpretation]
The differences between the emulation rules of Figure 12 and those of Figure 10 are that blocks 412 and 418 are replaced by blocks 430 and 432, respectively. At block 430, the emulator 124 of this embodiment includes in the emulated pointing signal the first UP event sent by either the old primary signal or the new primary signal. However, in either case, the emulator of this embodiment includes this UP event in the new primary signal. The emulator 124 of this embodiment then demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal relating to the closest finger in the DOWN state. The emulator 124 of this embodiment then generates and includes in the emulated pointing signal a DOWN event from the new primary pointing signal. Thereafter, the emulated pointing signal of this embodiment matches the primary pointing signal and processing flows from block 430 back to block 308. At block 432, the emulator 124 of this embodiment includes in the emulated pointing signal an UP event from the current primary pointing signal. The emulator 124 of this embodiment then demotes the current primary pointing signal from its primary status and promotes the non-primary pointing signal relating to the closest finger in the DOWN state. The emulator 124 of this embodiment then includes in the emulated pointing signal a MOVE event specifying a move from the last recorded position of the old primary pointing signal to the current position of the new primary pointing signal. The emulator 124 of this embodiment then generates and includes in the emulated signal a DOWN event from the new primary pointing signal. Thereafter, the emulated pointing signal of this embodiment matches the primary pointing signal and processing flows from block 432 back to block 308.
In the preferred embodiments of Figures 6 to 12, the emulator 124 is shown as being distinct from the OS 122. However, in further embodiments such as that illustrated in Figures 1 to 5 the emulator 124 and the OS 122 are part of the same element of the smartphone 2 (there the OS). Regardless of the precise arrangement, in embodiments of the present invention the emulated pointer signal is formed such that it is available to all software components of the computing device which require access thereto. In other words, the emulated pointing signal of certain embodiments acts as a 'global' variable within the context of the computing device. This will help to minimise the chances that separate software components of the computing device will need to form their own individual emulated pointing signal. This is advantageous since it is likely that the presence of more than one emulated pointer signal will detract from the user experience. More specifically, it is possible that different emulated signals will use different approaches to emulation and in this case, the user may be uncertain of how their gestures will be interpreted into instructions.
It is an advantage of example embodiments of the present invention that single pointer software running on the computing device can ignore multipoint signals which may cause the computing device to malfunction and instead choose to receive the emulated pointer signal. Further, it is an advantage of example embodiments of the invention that the emulated pointer signal corresponds to the received pointing signals such that essential information included in the received pointing signals is also included in the emulated pointing signal. Accordingly, it is an advantage in example embodiments that the emulated pointer signal may be used as an approximation of, and an equivalent to, the plurality of received pointing signals.

For the purposes of this specification and the example embodiment described below, the pointing device as seen by a user, i.e. the physical mouse or touchpad connected to the computing device that is held or touched by a user, is referred to as the 'physical pointer'. On the other hand, the pointing device as seen by the computing device, i.e. the pointing device as defined by the pointing signal sent to the computing device, is referred to as the 'logical pointer'.
The example embodiments of the present invention have been described with reference to a computing device and a single pointing device being a touch-screen having both proximity and multi-touch capabilities. However, further embodiments of the invention are applied to a wide range of different computing devices and a wide range of related pointing devices. For example, instead of a smartphone, a PDA, a desktop computer or a laptop could be used.
Further, instead of a touch-screen, a touch-pad could be used. Further still, instead of a single pointing device, multiple pointing devices could be used, including at least one mouse, joystick, touch-screen or touch-pad.
The example embodiments of the present invention have been described with reference to a pointing device whose operation can be described with reference to the state diagram of Figure 4. However, further embodiments of the invention operate with a pointing device whose operation is defined by a different state model than that of Figure 4. For example, the state model may comprise only a subset of the states defined with respect to Figure 4. More specifically, an alternative state model comprises only the UP and DOWN states and not the OOR state. Such a state model can be used to describe a mouse which in operation cannot reach an OOR state. Another alternative state model comprises only the DOWN and OOR states and not the UP state. Such a state model can be used to describe a touch-screen without a proximity sensing capability. Such touch-screens are only able to detect pointing instruments when they actually contact the screen and therefore these devices cannot detect an instrument which is in close proximity but is not contacting the screen.
In an example embodiment, the application logic, software or an instruction set is maintained on any one of various conventional computer-readable media. In the context of this document, a "computer-readable medium" may be any media or means that can contain, store, communicate, propagate or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer, with one example of a computer described and depicted in Figure 1. A computer-readable medium may comprise a computer-readable storage medium that may be any media or means that can contain or store the instructions for use by or in connection with an instruction execution system, apparatus, or device, such as a computer.
If desired, the different functions discussed herein may be performed in a different order and/or concurrently with each other. Furthermore, if desired, one or more of the above-described functions may be optional or may be combined.
Although various aspects of the invention are set out in the independent claims, other aspects of the invention comprise other combinations of features from the described embodiments and/or the dependent claims with the features of the independent claims, and not solely the combinations explicitly set out in the claims.
It is also noted herein that while the above describes example embodiments of the invention, these descriptions should not be viewed in a limiting sense. Rather, there are several variations and modifications which may be made without departing from the scope of the present invention as defined in the appended claims.

Claims

1. A method comprising: receiving concurrently a plurality of pointing signals in a computing device; and, generating an emulated pointing signal in dependence on the received pointing signals according to a predefined set of emulation rules, the emulated pointing signal imitating a single pointing signal from a single pointing device, wherein: one of the received pointing signals is selected to provide the emulated pointing signal and said selection is performed in dependence on the received pointing signals and according to the predefined set of emulation rules.
2. The method of claim 1 wherein the received pointing signal selected to provide the emulated pointing signal is changed in dependence on the received pointing signals and according to the predefined set of emulation rules while concurrent receipt of said pointing signals continues.
3. The method of any preceding claim further comprising controlling the computing device using the emulated pointing signal.
4. The method as claimed in any preceding claim wherein the received pointing signals relate to at least one instruction and the emulated pointing signal relates to the same at least one instruction.
5. The method of any preceding claim wherein the emulated pointing signal relates to data of at least two of the received pointing signals.
6. The method of claim 5 wherein data of each pointing signal comprises data relating to gestures of the user and the emulated pointing signal relates to gestures of at least two of the received pointing signals.
7. The method of any preceding claim wherein data of each pointing signal comprises data relating to gestures of a user, and wherein the set of predefined emulation rules are created according to a predefined set of interpretation rules which define how gestures are interpreted as instructions.
8. The method of any preceding claim further comprising providing the emulated pointing signal to a software application of the computing device, said software application being operable only with a single pointing signal from a single pointing device.
9. The method of any preceding claim wherein the computing device receives the plurality of pointing signals from a touch-screen or touch-pad having a multi-touch capability.
10. The method of any of claims 1 to 8 wherein the computing device receives a single pointing signal from each of a plurality of pointing devices including at least one of the following pointing devices: a mouse, a joystick and a touch-screen or touch-pad.
11. The method of any preceding claim wherein an operating system of the computing device generates the emulated pointing signal.
12. Apparatus comprising: a processor, memory including computer program code, the memory and the computer program code configured to, with the processor cause the apparatus at least to perform: receiving concurrently a plurality of pointing signals; and, generating an emulated pointing signal in dependence on the received pointing signals according to a predefined set of emulation rules, the emulated pointing signal imitating a single pointing signal from a single pointing device, wherein: one of the received pointing signals is selected to provide the emulated pointing signal and said selection is performed in dependence on the received pointing signals and according to the predefined set of emulation rules.
13. The apparatus of claim 12 wherein the received pointing signal selected to provide the emulated pointing signal is changed in dependence on the received pointing signals and according to the predefined set of emulation rules while concurrent receipt of said pointing signals continues.
14. The apparatus of claim 12 or claim 13 further comprising a controller configured to control the computing device using the emulated pointing signal.
15. The apparatus of any of claims 12 to 14 wherein the received pointing signals relate to at least one instruction and the emulated pointing signal relates to the same at least one instruction.
16. The apparatus of any of claims 12 to 15 wherein the emulated pointing signal relates to data of at least two of the received pointing signals.
17. The apparatus of claim 16 wherein data of each pointing signal comprises data relating to gestures of the user and the emulated pointing signal relates to gestures of at least two of the received pointing signals.
18. The apparatus of any of claims 12 to 17 wherein data of each pointing signal comprises data relating to gestures of a user, and wherein the set of predefined emulation rules are created according to a predefined set of interpretation rules which define how gestures are interpreted as instructions.
19. The apparatus of any of claims 12 to 18 wherein the emulated pointing signal is provided to a software application of the apparatus, said software application being operable only with a single pointing signal from a single pointing device.
20. The apparatus of any of claims 12 to 19 wherein the apparatus receives the plurality of pointing signals from a touch-screen or touch-pad having a multi-touch capability.
21. The apparatus of any of claims 12 to 19 wherein the apparatus receives a single pointing signal from each of a plurality of pointing devices including at least one of the following pointing devices: a mouse, a joystick and a touch-screen or touch-pad.
22. The apparatus of any of claims 12 to 21 wherein an operating system of the computing device generates the emulated pointing signal.
23. Apparatus comprising: one or more input devices configured to generate corresponding pointing signals, a processor, memory including computer program code, the memory and the computer program code configured to, with the processor cause the apparatus at least to perform: production of an emulated pointing signal by selection of one of said pointing signals as said emulated pointing signal.
24. The apparatus according to claim 23 wherein the emulated pointing signal is produced by first selecting a first pointing signal as said emulated pointing signal and thereafter selecting a second of said pointing signals as said emulated pointing signal.
25. A computer program comprising: code for concurrently processing a plurality of pointing signals; and code for producing an emulated pointing signal by selection of one of said pointing signals as said emulated pointing signal, when the computer program is run on a processor.
26. The computer program of claim 25 further comprising code for performing the method of any of claims 2 to 11.
27. The computer program according to claim 25 or claim 26, wherein the computer program is a computer program product comprising a computer-readable medium bearing computer program code embodied therein for use with a computer.
28. A computer readable medium encoded with instructions that, when executed by a computer, perform: receiving a plurality of pointing signals from one or more input devices; production of an emulated pointing signal by selection of one of said pointing signals as said emulated pointing signal.
29. The computer readable medium of claim 28 further encoded with instructions for performing the method of any of claims 2 to 11.
PCT/IB2009/055573 2008-12-15 2009-12-08 Method of and apparatus for emulating input Ceased WO2010070528A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0822845.4 2008-12-15
GB0822845A GB2466077A (en) 2008-12-15 2008-12-15 Emulator for multiple computing device inputs

Publications (1)

Publication Number Publication Date
WO2010070528A1 (en) 2010-06-24

Family

ID=40326139

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2009/055573 Ceased WO2010070528A1 (en) 2008-12-15 2009-12-08 Method of and apparatus for emulating input

Country Status (2)

Country Link
GB (1) GB2466077A (en)
WO (1) WO2010070528A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9665205B1 (en) * 2014-01-22 2017-05-30 Evernote Corporation Programmable touch emulating device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2826288C (en) 2012-01-06 2019-06-04 Microsoft Corporation Supporting different event models using a single input source
US10255101B2 * 2014-12-11 2019-04-09 SAP SE Device emulator

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005029460A1 (en) * 2003-08-21 2005-03-31 Microsoft Corporation Focus management using in-air points
US20060012580A1 (en) * 2004-07-15 2006-01-19 N-Trig Ltd. Automatic switching for a dual mode digitizer
WO2006094308A2 (en) * 2005-03-04 2006-09-08 Apple Computer, Inc. Multi-functional hand-held device
WO2007089766A2 (en) * 2006-01-30 2007-08-09 Apple Inc. Gesturing with a multipoint sensing device
WO2008020446A1 (en) * 2006-08-15 2008-02-21 N-Trig Ltd. Gesture detection for a digitizer
WO2008085418A2 (en) * 2007-01-03 2008-07-17 Apple Inc. Proximity and multi-touch sensor detection and demodulation
EP2098948A1 (en) * 2008-03-04 2009-09-09 Apple Inc. Touch event model

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6100875A (en) * 1992-09-03 2000-08-08 Ast Research, Inc. Keyboard pointing device
US6895589B2 (en) * 2000-06-12 2005-05-17 Microsoft Corporation Manager component for managing input from existing serial devices and added serial and non-serial devices in a similar manner
US20070061126A1 (en) * 2005-09-01 2007-03-15 Anthony Russo System for and method of emulating electronic input devices
US20070257891A1 (en) * 2006-05-03 2007-11-08 Esenther Alan W Method and system for emulating a mouse on a multi-touch sensitive surface

Also Published As

Publication number Publication date
GB2466077A (en) 2010-06-16
GB0822845D0 (en) 2009-01-21

Similar Documents

Publication Publication Date Title
US12307084B2 (en) Single contact scaling gesture
US9250783B2 (en) Toggle gesture during drag gesture
CN110663018B (en) App launch in multi-monitor devices
US9110587B2 (en) Method for transmitting and receiving data between memo layer and application and electronic device using the same
US10203815B2 (en) Application-based touch sensitivity
CN110362414B (en) Proxy gesture recognizer
RU2675153C2 (en) Method for providing feedback in response to user input and terminal implementing same
US20190302984A1 (en) Method and device for controlling a flexible display device
US20140344765A1 (en) Touch Sensitive UI Pinch and Flick Techniques for Managing Active Applications
US9501175B2 (en) Techniques and apparatus for managing touch interface
CN104903835A (en) Apparatus, method and graphical user interface for forgoing generating haptic output for multi-touch gestures
KR20130111615A (en) Event recognition
US20140145945A1 (en) Touch-based input control method
CN109891374B (en) Method and computing device for force-based interaction with digital agents
WO2012015647A1 (en) Mapping trackpad operations to touchscreen events
US8842088B2 (en) Touch gesture with visible point of interaction on a touch screen
CN109491562A (en) Interface display method of voice assistant application program and terminal equipment
CN102609083A (en) Global settings for the enablement of culture-based gestures
CN104346077B (en) Application triggers method and apparatus
WO2010070528A1 (en) Method of and apparatus for emulating input
US9026691B2 (en) Semi-autonomous touch I/O device controller operation under control of host
EP2956839A1 (en) Methods and systems for multimodal interaction
CN105393214B (en) Self-revealing symbolic gestures
CN109359187A (en) Sentence entry exchange method and device, electronic equipment, storage medium
WO2014016845A1 (en) Computer device and method for converting gesture

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 09833040

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 09833040

Country of ref document: EP

Kind code of ref document: A1