
WO2014179948A1 - Method and apparatus for distinguishing swipe gesture from character input provided via touch-sensitive keyboard - Google Patents

Info

Publication number
WO2014179948A1
WO2014179948A1 (PCT application PCT/CN2013/075329; CN application CN2013075329W)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
swipe gesture
swipe
sensitive keyboard
gestures
Prior art date
Application number
PCT/CN2013/075329
Other languages
French (fr)
Inventor
Christian Rossing Kraft
Shijun Yuan
Sun XIE
Jianqiu FENG
Bokai ZENG
Original Assignee
Nokia Corporation
Nokia (China) Investment Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Nokia Corporation, Nokia (China) Investment Co., Ltd. filed Critical Nokia Corporation
Priority to PCT/CN2013/075329 priority Critical patent/WO2014179948A1/en
Publication of WO2014179948A1 publication Critical patent/WO2014179948A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the method of an example embodiment may also include determining the first and second swipe gestures to change the input mode in an instance in which the first and second swipe gestures are performed within a predefined time period.
  • the method of an example embodiment may also include determining whether one or more parameters associated with the second swipe gesture is within a predefined range of one or more corresponding parameters associated with the first swipe gesture.
  • the method may also determine the first and second swipe gestures to change the input mode in an instance in which the one or more parameters of the first and second swipe gestures are within a predefined range of one another.
  • the method may also determine the first and second swipe gestures to change the input mode in an instance in which the first and second swipe gestures extend in opposite directions.
  • the at least one memory and the computer program code are further configured to, with the processor, cause the apparatus of this example embodiment to cause a change in input mode resulting in movement of a display element presented upon a display associated with the touch-sensitive keyboard in response to receiving the indications of the first and second swipe gestures.
  • a computer program product includes at least one non-transitory computer-readable storage medium having computer-executable program code portions stored therein with the computer-executable program code portions including program code instructions for receiving an indication of a first swipe gesture across a touch-sensitive keyboard that includes a plurality of keys configured to be responsive to user action to provide character input and that is also responsive to touch gestures including swipe gestures.
  • the computer-executable program code portions of this example embodiment also include program code instructions for receiving an indication of a second swipe gesture, following the first swipe gesture, across the touch-sensitive keyboard.
  • an apparatus in yet another example embodiment, includes means for receiving an indication of a first swipe gesture across a touch-sensitive keyboard that includes a plurality of keys configured to be responsive to user action to provide character input and that is also responsive to touch gestures including swipe gestures.
  • the apparatus of this example embodiment also includes means for receiving an indication of a second swipe gesture, following the first swipe gesture, across the touch-sensitive keyboard.
  • the apparatus of this example embodiment also includes means for causing a change in input mode resulting in movement of a display element presented upon a display associated with the touch-sensitive keyboard in response to receiving the indications of the first and second swipe gestures.
  • touch-sensitive keyboards that are configured to receive both character input and gestures
  • a user in the process of providing character input by actuation of the keys of the touch-sensitive keyboard may drag their finger from one key to another key.
  • the user, in this instance, may be intending to provide character input, but the path traced by the user's finger as the user slides their finger across the touch-sensitive keyboard from one key to another key may also appear to represent a gesture.
  • an inability to consistently properly distinguish user input intended to provide character input from user input desired to represent a gesture may create issues including the display of content that is different from that intended by the user which may, at a minimum, frustrate the user or at least create inefficiencies for the user.
  • the apparatus may therefore, in some cases, be configured to implement an example embodiment of the present invention on a single chip or as a single "system on a chip."
  • a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
  • the processor 22 may be embodied in a number of different ways.
  • the processor may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to an example embodiment of the present invention while configured accordingly.
  • the processor when the processor is embodied as an ASIC, FPGA or the like, the processor may be specifically configured hardware for conducting the operations described herein.
  • the processor when the processor is embodied as an executor of software instructions, the instructions may specifically configure the processor to perform the algorithms and/or operations described herein when the instructions are executed.
  • the processor may be a processor of a specific device (e.g., a mobile terminal or a fixed computing device) configured to employ an example embodiment of the present invention by further configuration of the processor by instructions for performing the algorithms and/or operations described herein.
  • the processor may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor.
  • a swipe gesture may be a gesture in which a user initially makes contact at a point of origin and then moves their finger, a pointer or the like across at least a portion of the touch-sensitive keyboard while maintaining contact with the touch-sensitive keyboard prior to lifting their finger, pointer or the like from the touch-sensitive keyboard.
  • the indication of the first swipe gesture may include information describing the swipe gesture such as the point of origin, the point at which the user lifted their finger, pointer or the like from the touch-sensitive keyboard and, in some example embodiments, other parameters related to the swipe gesture, such as the speed with which the swipe gesture was made, the direction of the swipe gesture, etc.
  • the first and second swipe gestures are determined to not represent a change in input mode and, instead, the prior first swipe gesture is ignored and the second swipe gesture is, in effect, treated as a new first swipe gesture with the apparatus then awaiting receipt of another swipe gesture within a predefined time period in order to interpret the swipe gestures as a change in input mode.
  • the cursor may be modified in different manners depending upon the direction of movement that will be provided in response to the second swipe gesture, such as a cursor having a rightwardly pointing arrow in an instance in which the content and/or cursor will be moved rightwardly, a cursor having an upwardly pointing arrow in an instance in which the content and/or cursor will be moved upwardly, and a cursor having a downwardly pointing arrow in an instance in which the content and/or cursor will be moved downwardly in response to the receipt of the second swipe gesture.
  • the apparatus 20, such as the processor 22 or the like, may be configured to monitor the time that has elapsed following the input of the most recent character.
  • the apparatus, such as the processor, the user interface 28 or the like, may cause a change in the input mode resulting in movement of the display element in response to the first and second swipe gestures.
  • the receipt of the indication of the first swipe gesture may cause the scrolling of the display element without awaiting receipt of a second swipe gesture.
  • the apparatus 20 may move a display element, such as a cursor and/or content, based upon the first swipe input. See block 88.
  • the apparatus, such as the processor, then determines if there has been character input during the period of time in which the first swipe gesture was evaluated. See block 90. If so, the apparatus, such as the processor, causes the timer to be restarted prior to returning to monitor for another touch move. See blocks 92 and 70.
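The timer-based flow described in the bullets above (monitoring the time since the most recent character input, and restarting the timer when a character is entered) can be sketched as follows. This is an illustrative sketch only; the class name, method names and the two-second threshold are assumptions for illustration, not part of any claimed embodiment.

```python
import time

class SwipeTimer:
    """Tracks time elapsed since the most recent character input."""

    def __init__(self, idle_threshold=2.0):
        # idle_threshold: assumed "predetermined time period" in seconds.
        self.idle_threshold = idle_threshold
        self.last_character_time = None

    def on_character_input(self):
        # Restart the timer whenever a character is input
        # (compare blocks 92 and 70 in the flow chart).
        self.last_character_time = time.monotonic()

    def typing_recently(self):
        # True if a character was input within the predetermined period,
        # in which case a lone swipe should not yet move the display
        # element and a second confirming swipe would be awaited.
        if self.last_character_time is None:
            return False
        return time.monotonic() - self.last_character_time < self.idle_threshold
```

Under this sketch, a single swipe received while `typing_recently()` is false could move the display element immediately, whereas a swipe received during active typing would require a second swipe to confirm the change in input mode.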

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method, apparatus and computer program product are provided to distinguish user input intended to provide character input from user input desired to represent a gesture in a consistent and accurate manner. In the context of a method, an indication is received of a first swipe gesture across a touch-sensitive keyboard that includes a plurality of keys configured to be responsive to user actuation to provide character input and that is also responsive to touch gestures including swipe gestures. The method may also receive an indication of a second swipe gesture, following the first swipe gesture, across the touch-sensitive keyboard. The method may also include causing a change in input mode resulting in movement of a display element presented upon a display associated with the touch-sensitive keyboard in response to receiving the indications of the first and second swipe gestures.

Description

METHOD AND APPARATUS FOR DISTINGUISHING SWIPE GESTURE FROM CHARACTER INPUT PROVIDED VIA TOUCH-SENSITIVE KEYBOARD
TECHNOLOGICAL FIELD
[0001] An example embodiment of the present invention relates generally to the recognition of a swipe gesture that may cause scrolling of a display element and more particularly, to distinguishing a swipe gesture from character input provided via a touch-sensitive keyboard.
BACKGROUND
[0002] A number of mobile terminals have included both a keyboard including one or more keys that may be actuated in order to provide character input and a touch screen display, distinct from the keyboard. The touch screen display may be configured to receive user input in the form of one or more gestures, such as a swipe gesture. Since the character input and the gestures are provided via separate and distinct interfaces, e.g., a keyboard for character input and a touch screen display for gestures, mobile terminals could readily distinguish character input from gestures and could respond appropriately to the different forms of user input.
[0003] Mobile terminals have been proposed to include a touch-sensitive keyboard. A touch-sensitive keyboard may include a plurality of keys configured to be responsive to user actuation to provide character input. However, a touch-sensitive keyboard is also configured to receive input in the form of gestures.
BRIEF SUMMARY
[0004] A method, apparatus and computer program product of an example embodiment are therefore provided in order to facilitate efforts to distinguish user input intended to provide character input from user input desired to represent a gesture in a consistent and accurate manner. Thus, a touch-sensitive keyboard may be utilized in order to receive both character input and input in the form of gestures and to respond in the manner intended by the user. The method, apparatus and computer program product of an example embodiment may improve the user experience associated with a touch-sensitive keyboard.
[0005] In one example embodiment, a method is provided that includes receiving an indication of a first swipe gesture across a touch-sensitive keyboard that includes a plurality of keys configured to be responsive to user actuation to provide character input and that is also responsive to touch gestures including swipe gestures. The method of this example embodiment also receives an indication of a second swipe gesture, following the first swipe gesture, across the touch-sensitive keyboard. The method of this example embodiment also includes causing a change in input mode resulting in movement of a display element presented upon a display associated with the touch-sensitive keyboard in response to receiving the indications of the first and second swipe gestures.
[0006] The method of an example embodiment may also include determining the first and second swipe gestures to change the input mode in an instance in which the first and second swipe gestures are performed within a predefined time period. The method of an example embodiment may also include determining whether one or more parameters associated with the second swipe gesture is within a predefined range of one or more corresponding parameters associated with the first swipe gesture. In this example embodiment, the method may also determine the first and second swipe gestures to change the input mode in an instance in which the one or more parameters of the first and second swipe gestures are within a predefined range of one another. In another example embodiment, the method may also determine the first and second swipe gestures to change the input mode in an instance in which the first and second swipe gestures extend in opposite directions.
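The determination described in the paragraph above, in which two swipe gestures change the input mode only when performed within a predefined time period, with one or more parameters within a predefined range of one another, and extending in opposite directions, can be sketched as follows. The threshold values, the dictionary field names and the treatment of speed as the compared parameter are illustrative assumptions, not the claimed implementation.

```python
import math

def is_mode_change(swipe1, swipe2,
                   max_interval=1.0,      # assumed predefined time period (s)
                   speed_tolerance=0.5):  # assumed predefined parameter range
    # Each swipe is assumed to be a dict with 'time' (s), 'speed'
    # (arbitrary units) and 'direction' (radians).
    # Check 1: the gestures are performed within a predefined time period.
    if swipe2['time'] - swipe1['time'] > max_interval:
        return False
    # Check 2: a parameter of the second swipe (here, speed) is within a
    # predefined range of the corresponding parameter of the first swipe.
    if abs(swipe2['speed'] - swipe1['speed']) > speed_tolerance * swipe1['speed']:
        return False
    # Check 3: the gestures extend in opposite directions, i.e. the
    # angular difference is close to pi.
    diff = abs(swipe2['direction'] - swipe1['direction']) % (2 * math.pi)
    return abs(diff - math.pi) < math.pi / 6
```

A forward swipe followed promptly by a similar-speed swipe back in the opposite direction would thus be interpreted as a deliberate request to change the input mode, while a slow drag between keys during typing would fail one or more of the checks.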
[0007] The method of an example embodiment may also include continuing to operate in the input mode until an indication of a key press is received. The display element that is moved may be a cursor presented upon the display and/or content presented upon the display. In an example embodiment, the method may also include modifying a cursor in response to the first swipe gesture. For example, the cursor may be modified to indicate a direction of the first swipe gesture. The method of another example embodiment may also cause the display element to be moved in response to the first swipe gesture. In this example embodiment, the movement provided in response to the first swipe gesture may be less than the movement provided in response to the first and second swipe gestures. The method of this example embodiment may also include repositioning the display element to a position in which the display element was presented prior to the movement in response to the first swipe gesture such that the movement in response to the first swipe gesture is temporary. The method of an example embodiment may also include causing movement of the display element in response to receiving the indication of the first swipe gesture in an instance in which a user has not input a character for at least a predetermined time period. In this example embodiment, the method may also cause movement of the display element in response to receiving the indications of the first and second swipe gestures to be performed in an instance in which the user has input the character within the predetermined time period.
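The modification of the cursor to indicate the direction of the first swipe gesture, as described above, can be illustrated by a simple direction-to-glyph mapping. The glyphs and direction labels are assumptions for illustration; a real implementation would instead swap cursor bitmaps or styles.

```python
# Hypothetical mapping from the direction of the first swipe gesture to a
# cursor representation indicating the direction in which the display
# element would move upon receipt of the second swipe gesture.
ARROW_CURSORS = {
    'left': '\u2190',   # leftwardly pointing arrow
    'right': '\u2192',  # rightwardly pointing arrow
    'up': '\u2191',     # upwardly pointing arrow
    'down': '\u2193',   # downwardly pointing arrow
}

def cursor_for_first_swipe(direction):
    # Fall back to the default cursor for unrecognized directions.
    return ARROW_CURSORS.get(direction, 'default')
```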
[0008] In another example embodiment, an apparatus is provided that includes at least one processor and at least one memory including computer program code with the at least one memory and the computer program code configured to, with the processor, cause the apparatus to at least receive an indication of a first swipe gesture across a touch-sensitive keyboard that includes a plurality of keys configured to be responsive to user action to provide character input and that is also responsive to touch gestures including swipe gestures. The at least one memory and the computer program code are also configured to, with the processor, cause the apparatus of this example embodiment to receive an indication of a second swipe gesture, following the first swipe gesture, across the touch-sensitive keyboard. The at least one memory and the computer program code are further configured to, with the processor, cause the apparatus of this example embodiment to cause a change in input mode resulting in movement of a display element presented upon a display associated with the touch-sensitive keyboard in response to receiving the indications of the first and second swipe gestures.
[0009] The at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus of an example embodiment to determine the first and second swipe gestures to change the input mode in an instance in which the first and second swipe gestures are performed within a predefined time period. The at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus of this example embodiment to determine whether one or more parameters associated with the second swipe gesture is within a predefined range of one or more corresponding parameters associated with the first swipe gesture. In this example embodiment, the at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus to determine the first and second swipe gestures to change the input mode in an instance in which the one or more parameters of the first and second swipe gestures are within a predefined range of one another. In another example embodiment, the at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus to determine the first and second swipe gestures to change the input mode in an instance in which the first and second swipe gestures extend in opposite directions.
[0010] The at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus of an example embodiment to continue to operate in the input mode until an indication of a key press is received. The display element that is moved may be a cursor presented upon the display and/or content presented upon the display. In an example embodiment, the at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus to modify a cursor in response to the first swipe gesture. For example, the cursor may be modified to indicate a direction of the first swipe gesture. The at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus of another example embodiment to cause the display element to be moved in response to the first swipe gesture. In this example embodiment, the movement provided in response to the first swipe gesture may be less than the movement provided in response to the first and second swipe gestures. The at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus of this example embodiment to reposition the display element to a position in which the display element was presented prior to the movement in response to the first swipe gesture such that the movement in response to the first swipe gesture is temporary. The at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus of an example embodiment to cause movement of the display element in response to receiving the indication of the first swipe gesture in an instance in which a user has not input a character for at least a predetermined time period. 
In this example embodiment, the at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus to cause scrolling of the display element in response to receiving the indications of the first and second swipe gestures to be performed in an instance in which the user has input the character within the predetermined time period.
[0011] In a further example embodiment, a computer program product is provided that includes at least one non-transitory computer-readable storage medium having computer-executable program code portions stored therein with the computer-executable program code portions including program code instructions for receiving an indication of a first swipe gesture across a touch-sensitive keyboard that includes a plurality of keys configured to be responsive to user action to provide character input and that is also responsive to touch gestures including swipe gestures. The computer-executable program code portions of this example embodiment also include program code instructions for receiving an indication of a second swipe gesture, following the first swipe gesture, across the touch-sensitive keyboard. The computer-executable program code portions of this example embodiment also include program code instructions for causing a change in input mode resulting in movement of a display element presented upon a display associated with the touch-sensitive keyboard in response to receiving the indications of the first and second swipe gestures.
[0012] The computer-executable program code portions of an example embodiment may also include program code instructions for determining the first and second swipe gestures to change the input mode in an instance in which the first and second swipe gestures are performed within a predefined time period. The computer-executable program code portions of an example embodiment may also include program code instructions for determining whether one or more parameters associated with the second swipe gesture are within a predefined range of one or more corresponding parameters associated with the first swipe gesture. In this example embodiment, the computer-executable program code portions may also include program code instructions for determining the first and second swipe gestures to change the input mode in an instance in which the one or more parameters of the first and second swipe gestures are within a predefined range of one another. In another example embodiment, the computer-executable program code portions may also include program code instructions for determining the first and second swipe gestures to change the input mode in an instance in which the first and second swipe gestures extend in opposite directions.
[0013] The computer-executable program code portions of an example embodiment may also include program code instructions for continuing to operate in the input mode until an indication of a key press is received. The display element that is moved may be a cursor presented upon the display and/or content presented upon the display. In an example embodiment, the computer-executable program code portions may also include program code instructions for modifying a cursor in response to the first swipe gesture. For example, the cursor may be modified to indicate a direction of the first swipe gesture. The computer-executable program code portions of another example embodiment may also include program code instructions for causing the display element to be moved in response to the first swipe gesture. In this example embodiment, the movement provided in response to the first swipe gesture may be less than the movement provided in response to the first and second swipe gestures. The computer-executable program code portions of this example embodiment may also include program code instructions for repositioning the display element to a position in which the display element was presented prior to the movement in response to the first swipe gesture such that the movement in response to the first swipe gesture is temporary. The computer-executable program code portions of an example embodiment may also include program code instructions for causing movement of the display element in response to receiving the indication of the first swipe gesture in an instance in which a user has not input a character for at least a predetermined time period.
In this example embodiment, the computer-executable program code portions may also include program code instructions for causing movement of the display element in response to receiving the indications of the first and second swipe gestures to be performed in an instance in which the user has input the character within the predetermined time period.
[0014] In yet another example embodiment, an apparatus is provided that includes means for receiving an indication of a first swipe gesture across a touch-sensitive keyboard that includes a plurality of keys configured to be responsive to user action to provide character input and that is also responsive to touch gestures including swipe gestures. The apparatus of this example embodiment also includes means for receiving an indication of a second swipe gesture, following the first swipe gesture, across the touch-sensitive keyboard. The apparatus of this example embodiment also includes means for causing a change in input mode resulting in movement of a display element presented upon a display associated with the touch-sensitive keyboard in response to receiving the indications of the first and second swipe gestures.
[0015] In one example embodiment, a method is provided that includes receiving an indication of a first swipe gesture across a predefined portion of a touch-sensitive keyboard. The touch-sensitive keyboard includes a plurality of keys configured to be responsive to user actuation. The touch-sensitive keyboard is also responsive to touch gestures including swipe gestures. In this example embodiment, the predefined portion of the touch-sensitive keyboard includes less than all of the touch-sensitive keyboard. The method of this example embodiment also causes a change in input mode resulting in movement of a display element presented upon a display associated with the touch-sensitive keyboard in response to receiving the indication of the first swipe gesture.
[0016] In an example embodiment, the predefined portion of the touch-sensitive keyboard includes one or more rows or columns of the touch-sensitive keyboard, such as one or more rows or columns located proximate an edge of the touch-sensitive keyboard. The method of an example embodiment may continue to operate in the input mode until an indication of a key press is received. The display element that is scrolled may include a cursor and/or content presented upon the display.
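The check that a swipe falls within a predefined portion of the touch-sensitive keyboard, such as a row proximate an edge, can be sketched as follows. The grid dimensions, the choice of the bottom row as the predefined portion, and the representation of a swipe as a list of touched key cells are all assumptions for illustration.

```python
KEYBOARD_ROWS = 4                 # assumed number of key rows
EDGE_ROWS = {KEYBOARD_ROWS - 1}   # predefined portion: bottom row only

def swipe_in_predefined_portion(path):
    # path: list of (row, col) key cells touched during the swipe.
    # The swipe qualifies only if every touched cell lies within the
    # predefined portion of the keyboard.
    return all(row in EDGE_ROWS for row, _ in path)
```

Under this sketch, a single swipe confined to the edge row could immediately cause the change in input mode, without awaiting a second swipe, since a deliberate swipe along the keyboard edge is unlikely to be an accidental drag between keys during typing.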
[0017] In another example embodiment, an apparatus is provided that includes at least one processor and at least one memory including computer program code with the at least one memory and the computer program code configured to, with the processor, cause the apparatus to at least receive an indication of a first swipe gesture across a predefined portion of a touch-sensitive keyboard. The touch-sensitive keyboard includes a plurality of keys configured to be responsive to user actuation. The touch-sensitive keyboard is also responsive to touch gestures including swipe gestures. In this example embodiment, the predefined portion of the touch-sensitive keyboard includes less than all of the touch-sensitive keyboard. The at least one memory and the computer program code of this example embodiment are also configured to, with the processor, cause the apparatus to cause a change in input mode resulting in movement of a display element, such as a cursor and/or content, presented upon a display associated with the touch-sensitive keyboard in response to receiving the indication of the first swipe gesture.
[0018] The predefined portion of the touch-sensitive keyboard of an example embodiment may include one or more rows or columns of the touch-sensitive keyboard, such as one or more rows or columns located proximate an edge of the touch-sensitive keyboard. The at least one memory and the computer program code of an example embodiment may be further configured to, with the processor, cause the apparatus to continue to operate in the input mode until an indication of the key press is received.
[0019] In a further example embodiment, a computer program product is provided that includes at least one non-transitory computer-readable storage medium having computer-executable program code portions stored therein. The computer-executable program code portions include program code instructions for receiving an indication of a first swipe gesture across a predefined portion of a touch-sensitive keyboard. The touch-sensitive keyboard includes a plurality of keys configured to be responsive to user actuation. The touch-sensitive keyboard is also responsive to touch gestures including swipe gestures. In this example embodiment, the predefined portions of the touch-sensitive keyboard include less than all of the touch-sensitive keyboard. The computer-executable program code portions of this example embodiment also include program code instructions for causing a change in input mode resulting in movement of a display element, such as a cursor and/or content, presented upon a display associated with the touch-sensitive keyboard in response to receiving the indication of the first swipe gesture. The predefined portions of the touch-sensitive keyboard of an example embodiment may include one or more rows or columns of the touch-sensitive keyboard, such as one or more rows or columns located proximate an edge of the touch-sensitive keyboard. The computer-executable program code portions of an example embodiment may also include program code portions for causing a continuation of the input mode until an indication of a key press is received.
[0020] In yet another example embodiment, an apparatus is provided that includes means for receiving an indication of a first swipe gesture across a predefined portion of a touch-sensitive keyboard. The touch-sensitive keyboard includes a plurality of keys configured to be responsive to user actuation. The touch-sensitive keyboard is also responsive to touch gestures including swipe gestures. The predefined portion of the touch-sensitive keyboard of this example embodiment includes less than all of the touch-sensitive keyboard. The apparatus of this example embodiment also includes means for causing a change in input mode resulting in movement of a display element presented upon a display associated with the touch-sensitive keyboard in response to receiving the indication of the first swipe gesture.
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] Having thus described example embodiments of the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
[0022] Figure 1 illustrates a mobile terminal including a touch-sensitive keyboard that may be configured in accordance with an example embodiment of the present invention;
[0023] Figure 2 is a block diagram of an apparatus that may be embodied or otherwise associated with a computing device, such as the mobile terminal of Figure 1, and that may be specifically configured in accordance with an example embodiment of the present invention;
[0024] Figure 3 is a flow chart illustrating the operations performed, such as by the apparatus of Figure 2, in accordance with an example embodiment of the present invention;
[0025] Figure 4 is an example of the modification of a cursor in accordance with an example embodiment of the present invention;
[0026] Figure 5 is a more specific flow chart illustrating the operations performed, such as by the apparatus of Figure 2, in accordance with another example embodiment of the present invention;
[0027] Figure 6 is a more specific flow chart illustrating the operations performed, such as by the apparatus of Figure 2, in accordance with a further example embodiment of the present invention; and
[0028] Figure 7 is a flow chart of the operations performed, such as by the apparatus of Figure 2, in accordance with yet another example embodiment of the present invention.
DETAILED DESCRIPTION
[0029] Some example embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, example embodiments of the invention are shown. Indeed, various example embodiments of the invention may be embodied in many different forms and should not be construed as limited to the example embodiments set forth herein; rather, these example embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout. As used herein, the terms "data," "content," "information," and similar terms may be used interchangeably to refer to data capable of being transmitted, received and/or stored in accordance with example embodiments of the present invention. Thus, use of any such terms should not be taken to limit the spirit and scope of example embodiments of the present invention.
[0030] Additionally, as used herein, the term 'circuitry' refers to (a) hardware-only circuit implementations (e.g., implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present. This definition of 'circuitry' applies to all uses of this term herein, including in any claims. As a further example, as used herein, the term 'circuitry' also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware. As another example, the term 'circuitry' as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device.
[0031] As defined herein, a "computer-readable storage medium," which refers to a non-transitory physical storage medium (e.g., volatile or non-volatile memory device), can be differentiated from a "computer-readable transmission medium," which refers to an electromagnetic signal.
[0032] For touch-sensitive keyboards that are configured to receive both character input and gestures, it may be difficult to distinguish user input that is intended to provide character input from user input that is intended to represent a swipe gesture. By way of example, a user in the process of providing character input by actuation of the keys of the touch-sensitive keyboard may drag their finger from one key to another key. The user, in this instance, may be intending to provide character input, but the path traced by the user's finger as the user slides their finger across the touch-sensitive keyboard from one key to another key may also appear to represent a gesture. Thus, an inability to consistently distinguish user input intended to provide character input from user input intended to represent a gesture may cause content to be displayed that differs from that intended by the user, which may, at a minimum, frustrate the user or create inefficiencies for the user.
[0033] A method, apparatus and computer program product are therefore provided in accordance with an example embodiment in order to distinguish character input from input in the form of a gesture that is provided via a touch-sensitive keyboard. A variety of different types of computing devices may include or otherwise be associated with touch-sensitive keyboards including, for example, various types of mobile terminals, such as a portable digital assistant (PDA), mobile telephone, smartphone, pager, mobile television, gaming device, laptop computer, camera, tablet computer, headset, touch surface, video recorder, audio/video player, radio, electronic book, positioning device (e.g., global positioning system (GPS) device), or any combination of the aforementioned, and other types of voice and text communications systems. Additionally or alternatively, the computing device that is associated with a touch-sensitive keyboard may be embodied by a fixed computing device, such as a personal computer, a workstation or the like.
[0034] Referring, by way of example, to Figure 1, a mobile terminal 10 is illustrated that includes a touch-sensitive keyboard 12 having a plurality of keys 14 configured to be responsive to user actuation to provide character input. As described below, the touch-sensitive keyboard is also responsive to touch gestures including swipe gestures. The mobile terminal of the example embodiment also includes a display 16 for presenting a variety of content. The touch-sensitive keyboard may include keys having a physical dome associated therewith to provide tactile feedback, or may be a flat touch-sensitive keyboard, such as in an instance in which the outline of the keys is painted or presented onto a touch-sensitive surface. In either instance, the touch-sensitive keyboard is responsive both to character input and to touch gestures including swipe gestures. Indeed, in an example embodiment in which the touch-sensitive keyboard includes keys having a physical dome associated therewith, the surface of the keys also provides a touch-sensitive surface.
[0035] The user actuation of the keys 14 to provide character input may be performed in various manners. In an example embodiment, such as in an instance in which the touch-sensitive keyboard includes keys having a physical dome, the user actuation may include a variety of different types of actuation other than actuation that relies upon touch-sensitive technology, such as tapping upon a touch-sensitive key. The actuation may therefore be a non-touch actuation. For example, the user actuation of the keys of this example embodiment may include the physical depression of the keys by the user. In another example embodiment, such as in an instance in which the touch-sensitive keyboard includes touch-sensitive keys, the user actuation may rely upon touch-sensitive technology and may, for example, include tapping a touch-sensitive key.
[0036] Not only does the recognition of the type of user input become more difficult with a touch-sensitive keyboard, but the use of a touch-sensitive keyboard may also prove more challenging in an instance in which the mobile terminal is configured to select matches during text entry of Latin characters that are then matched with corresponding Chinese characters. Further, the use of a touch-sensitive keyboard may also pose challenges in an instance in which the user input is provided to complete various parts of a form with the user also required to scroll between each field of the form during character entry. In order to facilitate both character entry and touch gestures including swipe gestures via the touch-sensitive keyboard, a method, apparatus and computer program product of an example embodiment are provided in order to distinguish between character input and touch gestures including swipe gestures.
[0037] In this regard, the apparatus 20 of an example embodiment for distinguishing between character input and touch gestures including swipe gestures is illustrated in Figure 2. The apparatus may be embodied by or otherwise associated with the computing device that includes or is otherwise associated with the touch-sensitive keyboard 12, such as the mobile terminal 10 of Figure 1. The apparatus may include or otherwise be in communication with a processor 22, a memory device 24, a communication interface 26 and a user interface 28. In some example embodiments, the processor (and/or co-processors or any other processing circuitry assisting or otherwise associated with the processor) may be in communication with the memory device via a bus for passing information among components of the apparatus. The memory device may be non-transitory and may include, for example, one or more volatile and/or non-volatile memories. In other words, for example, the memory device may be an electronic storage device (e.g., a computer readable storage medium) comprising gates configured to store data (e.g., bits) that may be retrievable by a machine (e.g., a computing device like the processor). The memory device may be configured to store information, data, content, applications, instructions, or the like for enabling the apparatus to carry out various functions in accordance with an example embodiment of the present invention. For example, the memory device could be configured to buffer input data for processing by the processor. Additionally or alternatively, the memory device could be configured to store instructions for execution by the processor.
[0038] As noted above, the apparatus 20 may be embodied by a computing device, such as a mobile terminal 10 or a fixed computing device. However, in some example embodiments, the apparatus may be embodied as a chip or chip set. In other words, the apparatus may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard). The structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon. The apparatus may therefore, in some cases, be configured to implement an example embodiment of the present invention on a single chip or as a single "system on a chip." As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
[0039] The processor 22 may be embodied in a number of different ways. For example, the processor may be embodied as one or more of various hardware processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing circuitry including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. As such, in some example embodiments, the processor may include one or more processing cores configured to perform independently. A multi-core processor may enable multiprocessing within a single physical package. Additionally or alternatively, the processor may include one or more processors configured in tandem via the bus to enable independent execution of instructions, pipelining and/or multithreading.
[0040] In an example embodiment, the processor 22 may be configured to execute instructions stored in the memory device 24 or otherwise accessible to the processor. Alternatively or additionally, the processor may be configured to execute hard-coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to an example embodiment of the present invention while configured accordingly. Thus, for example, when the processor is embodied as an ASIC, FPGA or the like, the processor may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor is embodied as an executor of software instructions, the instructions may specifically configure the processor to perform the algorithms and/or operations described herein when the instructions are executed. However, in some cases, the processor may be a processor of a specific device (e.g., a mobile terminal or a fixed computing device) configured to employ an example embodiment of the present invention by further configuration of the processor by instructions for performing the algorithms and/or operations described herein. The processor may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor.
[0041] Meanwhile, the communication interface 26 may be any means such as a device or circuitry embodied in either hardware or a combination of hardware and software that is configured to receive and/or transmit data from/to a computing device, such as the mobile terminal 10, that embodies or is otherwise associated with the apparatus 20. In this regard, the communication interface may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network. Additionally or alternatively, the communication interface may include the circuitry for interacting with the antenna(s) to cause transmission of signals via the antenna(s) or to handle receipt of signals received via the antenna(s). In some environments, the communication interface may alternatively or also support wired communication. As such, for example, the communication interface may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
[0042] The apparatus 20 also includes a user interface 28 that may, in turn, be in communication with the processor 22 to provide output to the user and, in some example embodiments, to receive an indication of a user input. As such, the user interface may include a display 16 and a touch-sensitive keyboard 12. In some example embodiments, the user interface may also include a mouse, a joystick, touch areas, soft keys, one or more microphones, a speaker, or other input/output mechanisms. Alternatively or additionally, the processor may comprise user interface circuitry configured to control at least some functions of one or more user interface elements such as a display and, in some example embodiments, a speaker, ringer, one or more microphones and/or the like. The processor and/or user interface circuitry comprising the processor may be configured to control one or more functions of one or more user interface elements through computer program instructions (e.g., software and/or firmware) stored on a memory accessible to the processor (e.g., memory device 24, and/or the like).
[0043] In order to distinguish between character input and input in the form of a gesture that is provided via a touch-sensitive keyboard 12, the operations performed, such as by the apparatus 20 of Figure 2 embodied by or otherwise associated with a computing device, in accordance with an example embodiment are illustrated in Figure 3. Referring now to block 30 of Figure 3, the apparatus 20 may include means, such as the processor 22, the user interface 28 or the like, for receiving an indication of a first swipe gesture across a touch-sensitive keyboard. In this regard, a swipe gesture may be a gesture in which a user initially makes contact at a point of origin and then moves their finger, a pointer or the like across at least a portion of the touch-sensitive keyboard while maintaining contact with the touch-sensitive keyboard prior to lifting their finger, pointer or the like from the touch-sensitive keyboard. The indication of the first swipe gesture may include information describing the swipe gesture such as the point of origin, the point at which the user lifted their finger, pointer or the like from the touch-sensitive keyboard and, in some example embodiments, other parameters related to the swipe gesture, such as the speed with which the swipe gesture was made, the direction of the swipe gesture, etc.
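To make the foregoing concrete, the "indication" of a swipe gesture described above might be represented as a small record carrying the point of origin, the lift-off point and derived parameters such as direction, length and speed. The following Python sketch is illustrative only; none of these names or fields appears in the application itself.

```python
import math
from dataclasses import dataclass

@dataclass
class SwipeIndication:
    """Hypothetical record of one swipe gesture across the keyboard."""
    origin: tuple        # (x, y) where contact with the keyboard began
    end: tuple           # (x, y) where the finger/pointer was lifted
    duration_ms: float   # time between touch press and touch release

    @property
    def direction(self) -> float:
        """Direction of the swipe in degrees (0 = rightward)."""
        dx = self.end[0] - self.origin[0]
        dy = self.end[1] - self.origin[1]
        return math.degrees(math.atan2(dy, dx))

    @property
    def length(self) -> float:
        """Straight-line length of the swipe in pixels."""
        dx = self.end[0] - self.origin[0]
        dy = self.end[1] - self.origin[1]
        return math.hypot(dx, dy)

    @property
    def speed(self) -> float:
        """Average speed of the swipe in pixels per millisecond."""
        return self.length / self.duration_ms if self.duration_ms else 0.0
```

For instance, a swipe from (100, 50) to (10, 50) lasting 200 ms would have a direction of 180 degrees (leftward) and a speed of 0.45 pixels per millisecond.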
[0044] Following the first swipe gesture and as shown in block 34 of Figure 3, the apparatus 20 may also include means, such as the processor 22, the user interface 28 or the like, for receiving an indication of a second swipe gesture across the touch-sensitive keyboard 12. As illustrated in block 38 of Figure 3, the apparatus may also include means, such as the processor, the user interface or the like, for causing a change in input mode, such as, for example, a change from character input mode to scrolling mode, resulting in movement of a display element presented upon the display 16 associated with the touch-sensitive keyboard in response to the indications of the first and second swipe gestures. The display element may be a cursor, such that the cursor is moved in the direction indicated by the first and second swipe gestures in an instance in which the swipe gestures are performed in the same direction or in a direction defined by a predefined one of the first and second swipe gestures in an instance in which the first and second swipe gestures extend in different directions. In this example embodiment, the movement of the cursor may facilitate selection of one or more candidates from a plurality of candidates, such as by selecting a word utilizing an auto-correction application or selecting a matching Chinese word, or the selection of a field to be completed within a form. In an alternative example embodiment, the display element may be the content presented upon the display such that scrolling of the content is provided in a direction defined by the first and second swipe gesture(s) as described above in conjunction with the scrolling of the cursor.
[0045] By requiring both the first and the second swipe gestures to be provided, the method, apparatus 20 and computer program product of an example embodiment may distinguish an inadvertent swipe gesture made by a user in conjunction with the entry of character input, such as in an instance in which a user drags their finger from one key to another key during the course of character input, from user input in the form of a swipe gesture. In other words, by only causing the scrolling of a display element in response to the provision of both first and second swipe gestures, the method, apparatus and computer program product of an example embodiment may avoid an inadvertent classification of a first swipe gesture as representative of character input and, instead, appropriately respond to the swipe gesture by causing scrolling of a display element. Moreover, the requirement of a second swipe gesture to cause the scrolling of the display element in this example embodiment is also intuitive, as a user who is unsure as to why the display element did not scroll in response to the first swipe gesture would likely repeat the swipe gesture in an effort to scroll the display element, which would have the intended effect as the repeated swipe gesture may be treated as the second swipe gesture.
[0046] In an example embodiment, the apparatus 20 may include means, such as the processor 22, the user interface 28 or the like, for continuing to operate in the same input mode, e.g., a scrolling mode, until an indication of a key press is received. See block 40 of Figure 3. In this regard, the apparatus, such as the processor, the user interface or the like, may continue to recognize the user input as scroll input, such as provided by a swipe gesture, until a key press is received, which is then interpreted as character input and causes the apparatus to switch from scrolling mode to character input mode.
[0047] As shown in block 36 of Figure 3, following receipt of the first and second swipe gestures, the apparatus 20 of an example embodiment may also include means, such as the processor 22 or the like, for determining whether the first and second swipe gestures represent a change in input mode. The apparatus, such as the processor, may employ various definitions of instances in which the first and second swipe gestures are or are not to represent a change in input mode. For example, the apparatus, such as the processor, may determine the first and second swipe gestures to represent a change in input mode in an instance in which the first and second swipe gestures are performed within a predefined time period, such as 1 second. In an instance in which the first and second swipe gestures are not performed within a predefined time period, the first and second swipe gestures are determined to not represent a change in input mode and, instead, the prior first swipe gesture is ignored and the second swipe gesture is, in effect, treated as a new first swipe gesture with the apparatus then awaiting receipt of another swipe gesture within a predefined time period in order to interpret the swipe gestures as a change in input mode.
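The predefined-time-period test of paragraph [0047] reduces to a simple timestamp comparison: a pair of swipes within the window represents a mode change, while a late second swipe becomes the new first swipe. The sketch below is illustrative only; the 1-second window comes from the example above, while the function and parameter names are assumptions.

```python
PREDEFINED_WINDOW_MS = 1000  # example value from the text: 1 second

def classify_second_swipe(first_ts_ms, second_ts_ms,
                          window_ms=PREDEFINED_WINDOW_MS):
    """Return 'mode_change' if the two swipes arrive within the
    predefined time period; otherwise the second swipe is, in effect,
    treated as a new first swipe ('new_first')."""
    if second_ts_ms - first_ts_ms <= window_ms:
        return "mode_change"
    return "new_first"
```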
[0048] In another example embodiment, the apparatus 20, such as the processor 22, may determine the first and second swipe gestures to represent a change in input mode in an instance in which the first and second swipe gestures extend in opposite directions. However, in an instance in which the first and second swipe gestures extend in the same direction, the first and second swipe gestures will not be interpreted as a change in input mode. Instead, the second swipe gesture will, in effect, be treated as a new first swipe gesture, and the apparatus, such as the processor, the user interface 28 or the like, will await receipt of another swipe gesture that extends in a direction opposite to that of the second swipe gesture, that is, the new first swipe gesture, in order to recognize the swipe gestures as a change in input mode.
[0049] In yet another example embodiment, the apparatus 20, such as the processor 22, may be configured to determine the first and second swipe gestures to represent a change in input mode in an instance in which one or more parameters associated with the second swipe gesture are within a predefined range of one or more corresponding parameters associated with the first swipe gesture. The apparatus, such as the processor, of this example embodiment may evaluate various parameters associated with the swipe gestures in order to determine if the corresponding parameters of the first and second swipe gestures are within a predefined range of one another so as to cause the first and second swipe gestures to be considered representative of a change in input mode. As described below by way of example but not of limitation, the parameter may be the direction of the swipe gesture, such that first and second swipe gestures that extend in the same direction may be considered representative of a change in input mode. Additionally or alternatively, the parameter evaluated with respect to the swipe gestures may include the length of the swipe gestures, the position of the swipe gestures, the speed of the swipe gestures, etc.
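The parameter-range comparison of paragraph [0049] might be sketched as one tolerance per evaluated parameter, as below. The tolerance values are invented for illustration, and the direction comparison here deliberately ignores wrap-around at 360 degrees for simplicity.

```python
# Hypothetical tolerances: direction in degrees, length in pixels,
# speed in pixels per millisecond. None of these values is from the text.
TOLERANCES = {"direction": 20.0, "length": 50.0, "speed": 0.3}

def swipes_match(first: dict, second: dict, tolerances=TOLERANCES) -> bool:
    """True when every evaluated parameter of the second swipe falls
    within the predefined range of the first swipe's parameter."""
    return all(abs(first[p] - second[p]) <= tol
               for p, tol in tolerances.items())
```

A pair of swipes differing by 15 degrees in direction, 20 pixels in length and 0.2 px/ms in speed would match under these assumed tolerances, whereas a near-opposite pair would not.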
[0050] As illustrated in block 32 of Figure 3, the apparatus 20 of an example embodiment may include means, such as the processor 22, the user interface 28 or the like, for providing an indication, following receipt of the first swipe gesture and prior to receipt of the second swipe gesture, of the movement that may be performed in response to a second swipe gesture that has not yet been entered. Thus, the provision of an indication of the movement may be performed in response to a first swipe gesture and may provide a user with a preview of the direction, the magnitude or the like of the movement that may subsequently be performed in response to a second swipe gesture.
[0051] As shown in the upper portion of Figure 4, a vertical line representing the cursor during a character input mode of operation is depicted. In response to a first swipe gesture that extends from the right side of the touch-sensitive keyboard to the left side of the touch-sensitive keyboard, and in an instance in which the direction of movement is dictated by the first swipe gesture, the cursor may be modified to indicate the direction in which the content and/or cursor presented upon the display will be moved in an instance in which a second swipe gesture is entered, as shown in the middle portion of Figure 4. In the illustrated example embodiment, the cursor may be modified to have a leftwardly facing arrow in an instance in which the content and/or cursor will be moved leftwardly in response to the receipt of a second swipe gesture. The cursor may be modified in different manners depending upon the direction of movement that will be provided in response to the second swipe gesture. For example, the cursor may have a rightwardly pointing arrow in an instance in which the content and/or cursor will be moved rightwardly, an upwardly pointing arrow in an instance in which the content and/or cursor will be moved upwardly, and a downwardly pointing arrow in an instance in which the content and/or cursor will be moved downwardly in response to the receipt of a second swipe gesture. In an instance in which the second swipe gesture is entered and the new input mode, e.g., a scrolling mode, is entered, the cursor may again be modified to be indicative of the scrolling mode of operation, as shown in the lower portion of Figure 4. In instances in which the second swipe gesture is not received, or is not received in a manner that satisfies the requirements for the first and second swipe gestures to represent a change in input mode, such as in an instance in which the time lapse between the first and second swipe gestures does not satisfy a predefined time period, the cursor may revert from the modified form indicative of the direction of movement to the cursor indicative of the character input mode of operation, as shown in the upper portion of Figure 4.
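The cursor modifications of Figure 4 amount to a mapping from the pending scroll direction to a cursor glyph. The sketch below is purely illustrative; the actual glyphs, the `None` convention for character input mode, and the function name are all assumptions rather than anything specified in the application.

```python
# Hypothetical mapping from pending scroll direction to cursor glyph.
# None means no swipe is pending, i.e. ordinary character input mode.
CURSOR_GLYPHS = {
    None: "|",          # plain vertical-bar cursor (upper Figure 4)
    "left": "\u2190",   # leftward arrow after a leftward first swipe
    "right": "\u2192",
    "up": "\u2191",
    "down": "\u2193",
}

def cursor_for(pending_direction):
    """Glyph shown while awaiting (or not awaiting) a second swipe."""
    return CURSOR_GLYPHS[pending_direction]
```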
[0052] In another example embodiment, the indication of the movement that may be performed in response to a second swipe gesture that has not yet been received may include causing the display element, such as the cursor and/or the content, to be moved in response to the first swipe gesture prior to receipt of a second swipe gesture. In this example embodiment, the movement provided by the apparatus 20, such as the processor 22, the user interface 28 or the like, in response to the first swipe gesture may be in the same direction as, but of a lesser magnitude than, the movement to be provided in response to the first and second swipe gestures. As such, the movement provided in response to the first swipe gesture may be a preview of the direction or other attributes of the movement that will be provided in response to the first and second swipe gestures, but does not cause the content and/or cursor to move to such a degree that the user will be distracted during the entry of the second swipe gesture and prior to the actual movement of the content and/or cursor. In a further example embodiment, the apparatus may include means, such as the processor, the user interface or the like, for repositioning the display element to the same position in which the display element was presented prior to the movement in response to the first swipe gesture such that the movement in response to the first swipe gesture is temporary. Thus, the method, apparatus and computer program product of this example embodiment may provide a temporary preview or indication of the movement that will be provided in response to the receipt of both the first and second swipe gestures.
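The temporary, reduced-magnitude preview of paragraph [0052] might be sketched as follows: the element moves a fraction of the full scroll distance after the first swipe, then is repositioned to its starting point if no second swipe arrives. The fraction and all names are assumed values for illustration.

```python
PREVIEW_FRACTION = 0.2  # assumed: preview covers 20% of the full move

def preview_position(start, full_delta, fraction=PREVIEW_FRACTION):
    """Position of the display element while the preview is shown:
    same direction as the full movement, but lesser in magnitude."""
    return (start[0] + full_delta[0] * fraction,
            start[1] + full_delta[1] * fraction)

def reverted_position(start):
    """Position after the preview ends without a second swipe: the
    element is repositioned to where it was, so the preview is temporary."""
    return start
```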
[0053] In one example embodiment, the apparatus 20, such as the processor 22 or the like, may be configured to monitor the time that has lapsed following the input of the most recent character. In an instance in which the first swipe gesture is performed within a predetermined time period following the entry of the most recent character input, the apparatus, such as the processor, the user interface 28 or the like, may cause a change in the input mode resulting in movement of the display element in response to the first and second swipe gestures. However, in an instance in which character input has not been provided for at least a predetermined time period, the receipt of the indication of the first swipe gesture may cause the scrolling of the display element without awaiting receipt of a second swipe gesture.
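The character-entry timer of paragraph [0053] determines how many swipe gestures are required before scrolling begins. A minimal sketch, with an assumed predetermined period and hypothetical names:

```python
PREDETERMINED_PERIOD_MS = 2000  # assumed value, not from the text

def swipes_required(last_char_ts_ms, swipe_ts_ms,
                    period_ms=PREDETERMINED_PERIOD_MS) -> int:
    """Number of swipe gestures needed before the display element scrolls."""
    if swipe_ts_ms - last_char_ts_ms < period_ms:
        return 2   # swipe soon after typing: demand a confirming swipe
    return 1       # typing has paused: a single swipe scrolls at once
```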
[0054] By way of further explanation, the operations performed in order to detect and respond to first and second swipe gestures in an example embodiment that does not incorporate a timer for limiting the time period between receipt of the first and second swipe gestures are described in conjunction with Figure 5. In this regard, the process primarily includes a touch move process 44 and a touch release process 60 and makes use of a first drag flag that is set in response to a first swipe gesture and a first drag direction value indicative of the direction of the first swipe gesture. In response to a touch press 42 and the subsequent movement of that touch, the apparatus 20, such as the processor 22, may determine if the first drag flag associated with an immediately prior swipe gesture is true, indicative of there having been an immediately prior swipe gesture. See block 46. In an instance in which the first drag flag is false so as not to indicate that there has been a prior swipe gesture, the first drag flag is set to true as shown in block 48 and the process advances to the touch release process. Following a determination that the first drag flag is true as shown in block 62, the current drag direction, that is, the direction of the swipe gesture, is obtained and the first drag direction value is set to the current drag direction as shown in blocks 64 and 66 prior to returning to monitor for an additional touch press at block 42.
[0055] Thereafter, in response to the receipt of a second swipe gesture, the apparatus 20, such as the processor 22, may determine the first drag flag to be true at block 46 (as a result of the prior first swipe gesture) and may determine the direction of the second swipe gesture (the current drag direction) and may compare the direction of the second swipe gesture with the first drag direction, that is, the direction of the first swipe gesture, to determine if the first and second swipe gestures extend in the same direction. See blocks 50 and 52. With regard to the first and second swipe gestures extending in the same direction, the apparatus, such as the processor, may consider the first and second swipe gestures to extend in the same direction so long as the first and second swipe gestures are within a predefined tolerance of one another. The predefined tolerance may be defined in various manners. For example, with respect to swipe gestures that trace predominantly straight lines, the apparatus, such as the processor, may determine the first and second swipe gestures to extend in the same direction in an instance in which the angle defined between the first and second swipe gestures is less than a predefined angle. As another example, the apparatus, such as the processor, may determine the first and second swipe gestures (either linear or non-linear swipe gestures) to extend in the same direction in an instance in which the second swipe gesture follows the path defined by the first swipe gesture and deviates therefrom by no more than a predefined amount.
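One way to implement the predefined angular tolerance described above for predominantly straight swipes is to compare the smallest angle between the two swipe directions against a threshold. The 30-degree threshold below is an assumption for illustration, not a value taken from the application.

```python
PREDEFINED_ANGLE_DEG = 30.0  # assumed tolerance

def same_direction(dir1_deg, dir2_deg, tol=PREDEFINED_ANGLE_DEG) -> bool:
    """True when the angle defined between the two swipe directions
    is less than the predefined angle, handling wrap-around at 360."""
    diff = abs(dir1_deg - dir2_deg) % 360.0
    diff = min(diff, 360.0 - diff)  # smallest angle between the swipes
    return diff < tol
```

Under this check, swipes at 355 and 5 degrees count as the same direction (they differ by only 10 degrees across the wrap-around), while swipes at 0 and 180 degrees do not.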
[0056] In the example embodiment that requires the first and second swipe gestures to extend in the same direction in order to represent a change in input mode, the apparatus 20, such as the processor 22, the user interface 28 or the like, may cause the cursor to be moved in accordance with the swipe direction in an instance in which the first and second swipe gestures extend in the same direction. See block 58. The first drag flag and the first drag direction may also be cleared by the apparatus, such as the processor, so that the receipt of a next swipe gesture will be treated as the first swipe gesture. See blocks 54 and 56. Since the first drag flag has already been cleared, the apparatus, such as the processor, may determine that the first drag flag has been cleared in block 62 and, as such, repeat the process by returning to monitor for a touch press 42.
[0057] In an instance in which the current drag direction is determined by the apparatus 20, such as the processor 22, in block 52 to not be the same direction as the first drag direction, such as in an instance in which the first and second swipe gestures extend in opposite directions, the apparatus, such as the processor or the like, may determine in block 62 that the first drag flag is true and may then obtain the current drag direction and set the first drag direction to the current drag direction as shown in blocks 64 and 66 prior to returning to monitor for a subsequent touch press at block 42. As such, in this instance, the second swipe gesture serves as the new first swipe gesture going forward since, in this example embodiment, the first and second swipe gestures must extend in the same direction to be recognized as a scroll input; when the first and second swipe gestures extend in opposite directions, the first swipe gesture is ignored and the second swipe gesture is treated as the new first swipe gesture going forward.
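The flow of blocks 42 through 66 described in paragraphs [0054] to [0057] may be summarized, purely as an illustrative sketch, by the following state machine; the class and method names are assumptions of the sketch, and the injected same_direction callable stands in for whatever predefined-tolerance comparison is employed:

```python
class SwipeModeDetector:
    """Minimal sketch of the Figure 5 flow: a change to the movement
    (e.g. scrolling) input mode is triggered only by two consecutive
    swipe gestures that extend in the same direction."""

    def __init__(self, same_direction):
        self.same_direction = same_direction  # tolerance comparison
        self.first_drag_flag = False
        self.first_drag_direction = None

    def on_swipe(self, direction):
        """Handle one completed swipe gesture; return True when the
        input mode should change (the display element is moved)."""
        if not self.first_drag_flag:
            # First swipe: set the flag, remember its direction, wait.
            self.first_drag_flag = True
            self.first_drag_direction = direction
            return False
        if self.same_direction(self.first_drag_direction, direction):
            # Matching second swipe: clear state and change input mode.
            self.first_drag_flag = False
            self.first_drag_direction = None
            return True
        # Mismatched direction: this swipe becomes the new first swipe.
        self.first_drag_direction = direction
        return False
```

For instance, two rightward swipes in a row would trigger the mode change, whereas a leftward swipe followed by a rightward swipe would merely replace the remembered first swipe.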
[0058] The foregoing example relied upon the direction of the first and second swipe gestures to determine if the first and second swipe gestures were the same or substantially similar so as to represent a change in input mode. As described above, however, the apparatus 20, such as the processor 22, may evaluate, in addition to or instead of direction, one or more other parameters associated with the second swipe gesture to determine if those parameters are within a predefined range of the corresponding parameters associated with the first swipe gesture and, thereby, whether the first and second swipe gestures are the same or substantially similar so as to represent a change in input mode. In addition to or instead of direction, the parameter(s) that are evaluated may include the length of the swipe gestures, the position of the swipe gestures, the speed of the swipe gestures, etc.
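As an illustrative sketch only, such a parameter comparison may be expressed as follows; the parameter names and tolerance values shown are assumptions, not features of the disclosed embodiment:

```python
def parameters_match(first, second, tolerances):
    """Return True if each evaluated parameter of the second swipe
    gesture (e.g. length, position, speed) is within a predefined range
    of the corresponding parameter of the first swipe gesture.

    first and second map parameter names to values; tolerances maps the
    evaluated parameter names to the predefined range for each."""
    return all(abs(second[name] - first[name]) <= tol
               for name, tol in tolerances.items())
```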
[0059] As noted above, in another example embodiment, both the first and second swipe gestures are required in order to cause a change in input mode in an instance in which character input has been provided within a predefined time period in advance of the receipt of an indication of the first swipe gesture, while only the first swipe gesture is needed to initiate a movement in an instance in which the indication of the first swipe gesture is received following a lapse of more than the predefined time period since the most recent character input. In this example embodiment as shown in Figure 6, in response to a touch move as shown in block 70, such as a user dragging their finger across the touch-sensitive keyboard 12, the apparatus 20, such as the processor 22 or the like, may determine if the timer is active. See block 72. In an instance in which the timer is active, the most recent character input was provided within the predefined time period from the time at which the indication of the first swipe gesture was received. Conversely, in an instance in which the timer is inactive, more than the predefined time period has lapsed since the most recent character input.
[0060] As such, in an instance in which the timer is inactive, the apparatus 20, such as the processor 22, the user interface 28 or the like, may move a display element, such as a cursor and/or content, based upon the first swipe input. See block 88. The apparatus, such as the processor, then determines if there has been character input during the period of time in which the first swipe gesture was evaluated. See block 90. If so, the apparatus, such as the processor, causes the timer to be restarted prior to returning to monitor for another touch move. See blocks 92 and 70. However, in an instance in which character input has not been provided during the analysis of the first swipe gesture, the apparatus, such as the processor, evaluates the first drag flag at block 94, with the first drag flag having remained false in this instance, and, in response, may return to block 70, at which the input is monitored for a swipe gesture.
[0061] In an instance in which the timer is determined at block 72 to be active, the apparatus 20, such as the processor 22, determines if the first drag flag is true. In an instance in which the first drag flag is true, the swipe gesture will be treated as the second swipe gesture, while in an instance in which the first drag flag is false, the swipe gesture will be treated as the first swipe gesture. See block 74. In this regard, in an instance in which the first drag flag is true, the apparatus, such as the processor or the like, will determine the current drag direction and will determine if the current drag direction is the same as the first drag direction. See blocks 78 and 82. If so, the apparatus, such as the processor or the like, may clear the first drag flag and the first drag direction, such that the subsequent swipe gesture will be treated as the first swipe gesture, and then cause a change in the input mode resulting in movement of the display element, such as the cursor and/or content prior to performing the touch release process as described above. See blocks 84, 86 and 88. In an instance in which the first drag flag is determined to be false in block 94, the apparatus, such as the processor, may set the first drag flag to true and may then proceed with the touch release process so as to return to the block 70 representing the touch move in which the apparatus, such as the processor, user interface or the like, monitors user input for a swipe gesture.
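As an illustrative sketch of the Figure 6 behaviour, the timer-gated variant may be expressed as follows; the class name, the 1.5 second window and the injectable clock are assumptions of the sketch rather than features of the disclosed embodiment:

```python
import time

class TimedSwipeDetector:
    """While recent character input keeps a timer active, two swipe
    gestures in the same direction are required to move the display
    element; once more than the predefined period has lapsed since the
    most recent character input, a single swipe gesture suffices."""

    def __init__(self, window_s=1.5, clock=time.monotonic):
        self.window_s = window_s        # predefined time period
        self.clock = clock              # injectable for testing
        self.last_char_time = None
        self.first_drag_direction = None

    def on_character(self):
        """A key press (character input) restarts the timer and cancels
        any pending first swipe gesture."""
        self.last_char_time = self.clock()
        self.first_drag_direction = None

    def _timer_active(self):
        return (self.last_char_time is not None
                and self.clock() - self.last_char_time < self.window_s)

    def on_swipe(self, direction):
        """Return True when the swipe should move the display element."""
        if not self._timer_active():
            return True  # no recent typing: one swipe moves the element
        if self.first_drag_direction is None:
            self.first_drag_direction = direction  # remember first swipe
            return False
        if self.first_drag_direction == direction:
            self.first_drag_direction = None       # matching second swipe
            return True
        self.first_drag_direction = direction      # becomes new first swipe
        return False
```

In use, a swipe performed immediately after typing would only arm the detector, a matching second swipe would move the element, and a swipe performed after the window has lapsed would move the element directly.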
[0062] In another example embodiment depicted in Figure 7, a single swipe gesture within a predefined portion of a touch-sensitive keyboard 12 may cause scrolling of a display element. As shown in block 100 of Figure 7, the apparatus 20 may include means, such as the processor 22, the user interface 28 or the like, for receiving an indication of a first swipe gesture across a predefined portion of the touch-sensitive keyboard. The predefined portion of the touch-sensitive keyboard may include less than all of the touch-sensitive keyboard and, in one example embodiment, a minority of the touch-sensitive keyboard. For example, the predefined portion may include one or more rows or columns of the touch-sensitive keyboard, such as rows or columns located proximate an edge of the touch-sensitive keyboard. With respect to the example embodiment depicted in Figure 1, the predefined portion of the touch-sensitive keyboard may include the topmost row and/or the bottommost row with the three intermediate rows including keys generally associated with letters and numbers being outside of the predefined portion of the touch-sensitive keyboard and therefore not configured to receive a first swipe gesture.
[0063] In an instance in which an indication of a first swipe gesture across the predefined portion of the touch-sensitive keyboard 12 is received, the apparatus 20, such as the processor 22, the user interface 28 or the like, may cause a change in input mode resulting in movement of a display element, such as the cursor and/or content, presented upon the display associated with the touch-sensitive keyboard. See block 102. For example, the display element may be moved in the direction indicated by the first swipe gesture and/or to an extent, degree or magnitude that corresponds to the length and/or speed of the first swipe gesture. Conversely, a swipe gesture across other portions of the touch-sensitive keyboard that are not included within the predefined portion of the touch-sensitive keyboard will not cause the input mode to change and, as such, will not cause movement of a display element.
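The Figure 7 check, in which only a swipe gesture across a predefined portion of the keyboard changes the input mode, may be sketched as follows; modelling the predefined portion as a bounding rectangle, and the coordinate values used, are assumptions made for the sketch:

```python
def in_predefined_region(touch_path, region):
    """Return True if every sampled point (x, y) of a swipe gesture lies
    within the predefined portion of the touch-sensitive keyboard,
    modelled here as a rectangle (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = region
    return bool(touch_path) and all(
        x_min <= x <= x_max and y_min <= y <= y_max for x, y in touch_path)
```

Under this sketch, a swipe confined to, say, the bottommost row would change the input mode, while a swipe that strays into the intermediate letter rows would be handled as ordinary key input.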
[0064] As described above, the apparatus 20, such as the processor 22 or the like, may continue to operate in the new input mode, e.g., a scrolling mode, such that swipe gestures cause the display element to be moved, e.g., scrolled, until an indication of a key press, indicative of a character input, is received. See block 104 of Figure 7.
[0065] As described above, a method, apparatus and computer program product of an example embodiment are provided to distinguish user input intended to provide character input from user input intended to represent a gesture in a consistent and accurate manner. Thus, a touch-sensitive keyboard 12 may be utilized in order to receive both character input and input in the form of gestures and to respond in the manner intended by the user. The method, apparatus and computer program product of an example embodiment may therefore improve the user experience associated with a touch-sensitive keyboard.

[0066] As described above, Figures 3 and 5-7 illustrate flowcharts of an apparatus 20, method, and computer program product according to example embodiments of the invention. It will be understood that each block of the flowcharts, and combinations of blocks in the flowcharts, may be implemented by various means, such as hardware, firmware, processor, circuitry, and/or other communication devices associated with execution of software including one or more computer program instructions. For example, one or more of the procedures described above may be embodied by computer program instructions. In this regard, the computer program instructions which embody the procedures described above may be stored by a memory device 24 of an apparatus employing an embodiment of the present invention and executed by a processor 22 of the apparatus. As will be appreciated, any such computer program instructions may be loaded onto a computer or other programmable apparatus (e.g., hardware) to produce a machine, such that the resulting computer or other programmable apparatus implements the functions specified in the flowchart blocks. These computer program
instructions may also be stored in a computer-readable memory that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture the execution of which implements the function specified in the flowchart blocks. The computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide operations for implementing the functions specified in the flowchart blocks. The computer program product may be embodied as an application that is configured to implement, for example, at least certain ones of the operations of the flowcharts of Figures 3 and 5-7.
[0067] Accordingly, blocks of the flowcharts support combinations of means for performing the specified functions and combinations of operations for performing the specified functions. It will also be understood that one or more blocks of the flowcharts, and combinations of blocks in the flowcharts, can be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.

[0068] In some example embodiments, certain ones of the operations above may be modified or further amplified. Furthermore, in some example embodiments, additional optional operations may be included, such as illustrated by the blocks having dashed outlines in Figures 3 and 7. Modifications, additions, or amplifications to the operations above may be performed in any order and in any combination.
[0069] Many modifications and other example embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific example embodiments disclosed and that modifications and other example embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative example
embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims

WHAT IS CLAIMED IS:
1. A method comprising:
receiving an indication of a first swipe gesture across a touch-sensitive keyboard that includes a plurality of keys configured to be responsive to user actuation to provide character input and that is also responsive to touch gestures including swipe gestures;
receiving an indication of a second swipe gesture, following the first swipe gesture, across the touch-sensitive keyboard; and
causing a change in input mode resulting in movement of a display element presented upon a display associated with the touch-sensitive keyboard in response to receiving the indications of the first and second swipe gestures.
2. A method according to Claim 1 further comprising determining the first and second swipe gestures to change the input mode in an instance in which the first and second swipe gestures are performed within a predefined time period.
3. A method according to Claim 1 further comprising determining whether one or more parameters associated with the second swipe gesture are within a predefined range of one or more corresponding parameters associated with the first swipe gesture and determining the first and second swipe gestures to change the input mode in an instance in which the one or more parameters of the first and second swipe gestures are within a predefined range of one another.
4. A method according to any of Claims 1 to 3 further comprising continuing to operate in the input mode until an indication of a key press is received.
5. A method according to any of Claims 1 to 4 wherein the display element comprises a cursor presented upon the display.
6. A method according to any of Claims 1 to 5 wherein the display element comprises content presented upon the display.
7. A method according to any of Claims 1 to 6 further comprising modifying a cursor in response to the first swipe gesture.
8. A method according to Claim 7 wherein modifying the cursor comprises modifying the cursor to indicate a direction of the first swipe gesture.
9. A method according to any of Claims 1 to 6 further comprising causing the display element to be moved in response to the first swipe gesture, wherein movement provided in response to the first swipe gesture is less than the movement provided in response to the first and second swipe gestures.
10. A method according to Claim 9 further comprising repositioning the display element to a position in which the display element was presented prior to the movement in response to the first swipe gesture such that the movement in response to the first swipe gesture is temporary.
11. A method according to any one of Claims 1 to 10 further comprising causing movement of the display element in response to receiving the indication of the first swipe gesture in an instance in which a user has not input a character for at least a predetermined time period, wherein causing movement of the display element in response to receiving the indications of the first and second swipe gestures is performed in an instance in which the user has input the character within the predetermined time period.
12. A computer program product comprising at least one non-transitory computer-readable storage medium having computer-executable program code portions stored therein, the computer-executable program code portions comprising program code instructions for causing the method of any one of Claims 1 to 11 to be performed.
13. An apparatus comprising:
at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the processor, cause the apparatus to at least:
receive an indication of a first swipe gesture across a touch-sensitive keyboard that includes a plurality of keys configured to be responsive to user actuation to provide character input and that is also responsive to touch gestures including swipe gestures; receive an indication of a second swipe gesture, following the first swipe gesture, across the touch-sensitive keyboard; and
cause a change in input mode resulting in movement of a display element presented upon a display associated with the touch-sensitive keyboard in response to receiving the indications of the first and second swipe gestures.
14. An apparatus according to Claim 13 wherein the at least one memory and the computer program code are further configured to, with the processor, cause the apparatus to determine the first and second swipe gestures to change the input mode in an instance in which the first and second swipe gestures are performed within a predefined time period.
15. An apparatus according to Claim 13 wherein the at least one memory and the computer program code are further configured to, with the processor, cause the apparatus to determine whether one or more parameters associated with the second swipe gesture are within a predefined range of one or more corresponding parameters associated with the first swipe gesture and to determine the first and second swipe gestures to change the input mode in an instance in which the one or more parameters of the first and second swipe gestures are within a predefined range of one another.
16. An apparatus according to any of Claims 13 to 15 wherein the at least one memory and the computer program code are further configured to, with the processor, cause the apparatus to continue to operate in the input mode until an indication of a key press is received.
17. An apparatus according to any of Claims 13 to 16 wherein the display element comprises a cursor presented upon the display.
18. An apparatus according to any of Claims 13 to 17 wherein the display element comprises content presented upon the display.
19. An apparatus according to any of Claims 13 to 18 wherein the at least one memory and the computer program code are further configured to, with the processor, cause the apparatus to modify a cursor in response to the first swipe gesture.
20. An apparatus according to Claim 19 wherein the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to modify the cursor by modifying the cursor to indicate a direction of the first swipe gesture.
21. An apparatus according to any of Claims 13 to 20 wherein the at least one memory and the computer program code are further configured to, with the processor, cause the apparatus to cause the display element to be moved in response to the first swipe gesture, wherein movement provided in response to the first swipe gesture is less than the movement provided in response to the first and second swipe gestures.
22. An apparatus according to Claim 21 wherein the at least one memory and the computer program code are further configured to, with the processor, cause the apparatus to reposition the display element to a position in which the display element was presented prior to the movement in response to the first swipe gesture such that the movement in response to the first swipe gesture is temporary.
23. An apparatus according to any one of Claims 13 to 22 wherein the at least one memory and the computer program code are further configured to, with the processor, cause the apparatus to cause movement of the display element in response to receiving the indication of the first swipe gesture in an instance in which a user has not input a character for at least a predetermined time period, wherein the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to cause movement of the display element in response to receiving the indications of the first and second swipe gestures in an instance in which the user has input the character within the predetermined time period.
24. A method comprising:
receiving an indication of a first swipe gesture across a predefined portion of a touch-sensitive keyboard that includes a plurality of keys configured to be responsive to user actuation and that is also responsive to touch gestures including swipe gestures, wherein the predefined portion of the touch-sensitive keyboard comprises less than all of the touch-sensitive keyboard; and causing a change in input mode resulting in movement of a display element presented upon a display associated with the touch-sensitive keyboard in response to receiving the indication of the first swipe gesture.
25. A method according to Claim 24 wherein the predefined portion of the touch-sensitive keyboard comprises one or more rows or columns of the touch-sensitive keyboard.
26. A method according to Claim 25 wherein the one or more rows or columns of the touch-sensitive keyboard comprise one or more rows or columns located proximate an edge of the touch-sensitive keyboard.
27. A method according to any of Claims 24 to 26 further comprising continuing to operate in the input mode until an indication of a key press is received.
28. A method according to any of Claims 24 to 27 wherein the display element comprises a cursor presented upon the display.
29. A method according to any of Claims 24 to 28 wherein the display element comprises content presented upon the display.
30. A computer program product comprising at least one non-transitory computer-readable storage medium having computer-executable program code portions stored therein, the computer-executable program code portions comprising program code instructions for causing the method of any one of Claims 24 to 29 to be performed.
31. An apparatus comprising:
at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the processor, cause the apparatus to at least:
receive an indication of a first swipe gesture across a predefined portion of a touch-sensitive keyboard that includes a plurality of keys configured to be responsive to user actuation and that is also responsive to touch gestures including swipe gestures, wherein the predefined portion of the touch-sensitive keyboard comprises less than all of the touch-sensitive keyboard; and cause a change in input mode resulting in movement of a display element presented upon a display associated with the touch-sensitive keyboard in response to receiving the indication of the first swipe gesture.
32. An apparatus according to Claim 31 wherein the predefined portion of the touch-sensitive keyboard comprises one or more rows or columns of the touch-sensitive keyboard.
33. An apparatus according to Claim 32 wherein the one or more rows or columns of the touch-sensitive keyboard comprise one or more rows or columns located proximate an edge of the touch-sensitive keyboard.
34. An apparatus according to any of Claims 31 to 33 wherein the at least one memory and the computer program code are further configured to, with the processor, cause the apparatus to continue to operate in the input mode until an indication of a key press is received.
35. An apparatus according to any of Claims 31 to 34 wherein the display element comprises a cursor presented upon the display.
36. An apparatus according to any of Claims 31 to 35 wherein the display element comprises content presented upon the display.
PCT/CN2013/075329 2013-05-08 2013-05-08 Method and apparatus for distinguishing swipe gesture from character input provided via touch-sensitive keyboard WO2014179948A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2013/075329 WO2014179948A1 (en) 2013-05-08 2013-05-08 Method and apparatus for distinguishing swipe gesture from character input provided via touch-sensitive keyboard

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2013/075329 WO2014179948A1 (en) 2013-05-08 2013-05-08 Method and apparatus for distinguishing swipe gesture from character input provided via touch-sensitive keyboard

Publications (1)

Publication Number Publication Date
WO2014179948A1 true WO2014179948A1 (en) 2014-11-13

Family

ID=51866616

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/075329 WO2014179948A1 (en) 2013-05-08 2013-05-08 Method and apparatus for distinguishing swipe gesture from character input provided via touch-sensitive keyboard

Country Status (1)

Country Link
WO (1) WO2014179948A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116654730A (en) * 2023-04-10 2023-08-29 三菱电机上海机电电梯有限公司 Elevator destination floor false registration interception system and elevator control system

Citations (2)

Publication number Priority date Publication date Assignee Title
US20020015064A1 (en) * 2000-08-07 2002-02-07 Robotham John S. Gesture-based user interface to multi-level and multi-modal sets of bit-maps
US20120011462A1 (en) * 2007-06-22 2012-01-12 Wayne Carl Westerman Swipe Gestures for Touch Screen Keyboards

Similar Documents

Publication Publication Date Title
US20250265409A1 (en) Device, method, and graphical user interface for annotating text
TWI617953B (en) Multi-task switching method, system and electronic device for touching interface
US10831337B2 (en) Device, method, and graphical user interface for a radial menu system
US9785335B2 (en) Systems and methods for adaptive gesture recognition
US10331313B2 (en) Method and apparatus for text selection
US8907900B2 (en) Touch-control module
US9354805B2 (en) Method and apparatus for text selection
US10025487B2 (en) Method and apparatus for text selection
US20120102401A1 (en) Method and apparatus for providing text selection
EP2660727B1 (en) Method and apparatus for text selection
CA2821814C (en) Method and apparatus for text selection
EP2660696A1 (en) Method and apparatus for text selection
EP2660697A1 (en) Method and apparatus for text selection
US20200097164A1 (en) Linking Multiple Windows in a User Interface Display
CN106933481B (en) Screen scrolling method and device
US20130044061A1 (en) Method and apparatus for providing a no-tap zone for touch screen displays
US10613732B2 (en) Selecting content items in a user interface display
US20150153925A1 (en) Method for operating gestures and method for calling cursor
US20140267030A1 (en) Computer and mouse cursor control method
WO2014179948A1 (en) Method and apparatus for distinguishing swipe gesture from character input provided via touch-sensitive keyboard
CA2821772C (en) Method and apparatus for text selection
CN101799727A (en) Signal processing device and method of multipoint touch interface and selecting method of user interface image
CN107025054A (en) Gesture recognition method of touch pad
CN104978102A (en) Electronic device and user interface control method
KR20160072446A (en) Method for inputting execute command by pointer and multimedia apparatus using the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13883891

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13883891

Country of ref document: EP

Kind code of ref document: A1