
WO2013161170A1 - Input device, program, and input support method


Info

Publication number
WO2013161170A1
WO2013161170A1 (PCT/JP2013/001799, JP2013001799W)
Authority
WO
WIPO (PCT)
Prior art keywords
finger
screen
touch
hover
proximity
Prior art date
Legal status (assumed; not a legal conclusion)
Ceased
Application number
PCT/JP2013/001799
Other languages
English (en)
Japanese (ja)
Inventor
雅俊 中尾
Current Assignee (listing may be inaccurate)
Panasonic Corp
Original Assignee
Panasonic Corp
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Panasonic Corp filed Critical Panasonic Corp

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485: Scrolling or panning
    • G06F3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886: Interaction techniques using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • The present invention relates to an input device that accepts input operations via a touch panel, to an input support method, and to a program.
  • Touch panels, which users can operate intuitively, are widely used to receive input operations in electronic devices, including mobile phones.
  • The touch panel accepts input operations on the screen of a display unit (for example, an LCD (Liquid Crystal Display) or organic EL (Electroluminescence) display) provided in the electronic device, and displays the device's processing results on the same screen.
  • In addition, a touch panel that can detect the proximity of a finger is known (see, for example, Patent Document 1).
  • Such a touch panel can detect a state in which a finger is held at a position a predetermined height above the panel, that is, a proximity state between the finger and the touch panel. Based on the capacitance, which is determined by the distance between the finger and the touch panel, it can also detect the finger being slid substantially parallel to the panel, just as when the finger is slid directly on the panel surface. For this reason, touch panels capable of detecting finger proximity are expected to become established as a new user interface.
  • The non-contact user input device of Patent Document 1 includes a plurality of linear transmission electrodes, a transmitter that supplies an alternating transmission current to each transmission electrode, and a plurality of reception electrodes arranged so as not to contact the transmission electrodes.
  • A capacitor is formed at each intersection of a transmission electrode and a reception electrode, and its capacitance changes according to the proximity of the user's fingertip.
  • The non-contact user input device can therefore recognize the distance between the touch panel and the finger from the change in capacitance.
  • However, the conventional input device has the following problems. To detect a slide operation performed with the finger held away from, and substantially parallel to, a touch panel capable of detecting finger proximity, the conventional input device first had to enter a hover mode and then switch to a mode for detecting finger movement while the proximity state continued.
  • It therefore seems difficult for the conventional input device to give the user operability comparable to that of a touch operation.
  • Moreover, in a hover operation there is nothing to support the finger, so the finger is less stable than in a touch operation and erroneous operations are more likely.
  • Further, when the finger is slid repeatedly at a position separated from the touch panel as a hover operation, the finger must be returned to its original position before a second slide operation can follow the first. If the finger is not moved far enough from the touch panel on the way back, the return movement may itself be detected as a slide, making continuous slide operations difficult to detect.
  • The present invention has been made in view of the above circumstances, and an object thereof is to provide an input device, an input support method, and a program that efficiently select the control content applied to content according to the user's input operation on the touch panel and that provide comfortable operability.
  • To this end, the present invention provides an input device comprising: a display unit that displays data on a screen; a proximity detection unit that detects the proximity of a first finger to the screen; a contact detection unit that detects contact of a second finger with the screen after the proximity of the first finger has been detected; and an operation execution unit that, in response to an operation combining the proximity of the first finger and the contact of the second finger, executes an operation corresponding to the combined operation.
  • The present invention also provides an input support method for an input device including a display unit that displays data on a screen, the method comprising: detecting the proximity of a first finger to the screen; detecting contact of a second finger with the screen after the proximity of the first finger has been detected; accepting an operation combining the proximity of the first finger and the contact of the second finger; and executing an operation corresponding to the combined operation.
  • The present invention further provides a program for causing a computer serving as an input device including a display unit that displays data on a screen to execute: a step of detecting the proximity of a first finger to the screen; a step of detecting contact of a second finger with the screen after the proximity of the first finger has been detected; a step of accepting an operation combining the proximity of the first finger and the contact of the second finger; and a step of executing an operation corresponding to the combined operation.
  • According to the present invention, the control content applied to content can be selected efficiently according to the user's input operation on the touch panel, and comfortable operability can be provided.
  • The drawings are as follows: a block diagram showing the hardware configuration of the mobile terminal in each embodiment; a block diagram showing the functional configuration of the mobile terminal in each of the first to fifth embodiments; and figures showing (A) a hover slide operation performed on its own and (B) a combination of a hover slide operation and a touch slide operation.
  • Figures showing the combination of a hover slide operation and a touch slide operation in the third embodiment: (A) the state at the start of the combination operation, and (B) the state in which a hover slide operation is performed after the start of the combination operation.
  • A figure showing the touch-acceptable range 45a; figures showing the combination of a hover slide operation and a touch hold operation in the fourth embodiment: (A) a hover slide operation performed on its own, and (B) the combination of a hover slide operation and a touch hold operation; and a flowchart explaining the operation.
  • Figures showing the combination of a hover slide operation and a touch hold operation in the fifth embodiment: (A) the combination of a hover slide operation and a touch hold operation being performed, and (B) the state after the combination operation.
  • The input device of each embodiment is an electronic device including a display unit that displays data on a screen; examples include a mobile phone, a smartphone, a tablet terminal, a digital still camera, a PDA (personal digital assistant), and an electronic book terminal.
  • In the following, a mobile terminal (for example, a smartphone) is described as an example of the input device of each embodiment.
  • The present invention can also be expressed as an input device as an apparatus, as a program for operating a computer as the input device, or as an input support method including the operations (steps) executed by the input device. That is, the present invention can be expressed in any of the categories of apparatus, method, and program.
  • The predetermined process is a process that acts on content related to the content currently displayed by the application (for example, a process of reproducing video data).
  • The "button" may be a hyperlinked character string (for example, a news headline), an image that prompts the user to perform a selection operation (for example, an icon or a software key of a keyboard), or a combination of a character string and an image.
  • In accordance with the user's input operation, the input device can accept, as an operation on a button, for example the selection of the news headline corresponding to that button, and can display the details of the news corresponding to the selected button.
  • What constitutes a "button" is determined by the application running on the input device.
  • In the following description, the two axes representing the horizontal plane of the touch panel are the x-axis and the y-axis, and the axis representing the vertical (height) direction of the touch panel is the z-axis.
  • A "coordinate" is either a position on the horizontal plane of the touch panel, that is, a coordinate (x, y) determined by the combination of an x coordinate and a y coordinate, or a coordinate (x, y, z) that combines (x, y) with the vertical distance between the touch panel and the finger, that is, the height z of the finger above the touch panel.
  • In the following description, the instruction medium for the touch panel is a user's finger by way of example, but it is not limited to a finger; a conductive stylus held in the user's hand may also be used.
  • The instruction medium is not particularly limited as long as its proximity to and touch on the touch panel can be detected, in accordance with the structure and detection method of the touch panel.
  • Here, the operation of holding a finger at a position in space separated from the surface of the touch panel is defined as a "hover operation", and the operation of sliding (moving) the finger from that position substantially parallel to the touch panel surface is defined as a "hover slide operation". An operation in which the finger directly touches the surface of the touch panel is therefore a "touch operation", not a "hover operation".
  • The operation of sliding (moving) the finger while it is in contact with the surface of the touch panel is defined as a "touch slide operation".
  • The operation of maintaining the touch at a position on the touch panel surface without sliding the finger is defined as a "touch hold operation".
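The four operation types defined above can be sketched as a simple classifier over successive finger samples. This is an interpretive sketch, not the patent's implementation; the threshold names and values are assumptions.

```python
from dataclasses import dataclass

# Illustrative thresholds (assumptions, not from the patent)
Z_TH = 20.0     # maximum height still detected as "proximity"
SLIDE_TH = 5.0  # minimum horizontal movement that counts as a slide


@dataclass
class Sample:
    x: float
    y: float
    z: float  # height above the panel; 0 means contact


def classify(prev: Sample, cur: Sample) -> str:
    """Classify a pair of successive samples into the operation names above."""
    moved = ((cur.x - prev.x) ** 2 + (cur.y - prev.y) ** 2) ** 0.5 >= SLIDE_TH
    if cur.z == 0:                      # finger touches the panel surface
        return "touch slide" if moved else "touch hold"
    if cur.z <= Z_TH:                   # finger hovering within detection range
        return "hover slide" if moved else "hover"
    return "out of range"
```

A real driver would integrate movement over many samples rather than classify a single pair, but the mapping from (height, movement) to operation name is the same.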
  • Since the distance between the finger and the surface of the touch panel is inversely proportional to the capacitance detected by the touch panel, the height at which a hover operation can be detected preferably corresponds to the range of capacitance that the touch panel can detect.
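Because capacitance varies inversely with the finger height, the height can in principle be recovered from a measured capacitance. The sketch below assumes an idealized model C = k / z with a made-up sensor constant k; a real panel would use a calibrated curve.

```python
def height_from_capacitance(c: float, k: float = 100.0) -> float:
    """Estimate finger height z from measured capacitance, assuming C = k / z.

    k is a hypothetical sensor constant, not a value from the patent.
    """
    if c <= 0:
        raise ValueError("capacitance must be positive")
    return k / c
```

Under this model, halving the measured capacitance doubles the estimated height.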
  • FIG. 1 is a block diagram showing a hardware configuration of the mobile terminals 1 and 1A in each embodiment.
  • The mobile terminal 1 shown in FIG. 1 includes a processor 11, a display unit 13, a touch panel driver 14, a touch panel 15, a power supply control unit 16, a communication control unit 17 to which an antenna 17a is connected, a ROM (Read Only Memory) 21, a RAM (Random Access Memory) 22, and a storage unit 23.
  • the processor 11, the display unit 13, the touch panel driver 14, the power supply control unit 16, the communication control unit 17, the ROM 21, the RAM 22, and the storage unit 23 are connected to each other via a bus 19 so as to be able to input and output data.
  • The processor 11 is configured using, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor), and performs overall control of the mobile terminals 1, 1A, 1B, and 1C as well as various other arithmetic and control processes.
  • the processor 11 reads the program and data stored in the ROM 21 and performs various processes in each embodiment described later.
  • the ROM 21 stores an application 65 (see FIG. 2) installed in the mobile terminal 1, and a program and data for the processor 11 to execute processing in each unit shown in FIG.
  • the RAM 22 operates as a work memory in the operation of the processor 11, the touch panel driver 14, or the communication control unit 17.
  • the storage unit 23 is configured by using a hard disk or a flash memory built in the mobile terminal 1, and stores data acquired or generated by the mobile terminals 1, 1A, 1B, and 1C.
  • the application 65 (see FIG. 2) is stored in the storage unit 23.
  • the storage unit 23 may be configured using, for example, an external storage medium (for example, a USB memory) connected via a USB (Universal Serial Bus) terminal instead of a hard disk or a flash memory.
  • the display unit 13 has a function of displaying a screen, and is configured using, for example, an LCD or an organic EL display, and displays data output from the processor 11 or the touch panel driver 14 on the screen.
  • The touch panel driver 14 controls the operation of the touch panel 15 and monitors user input operations on the touch panel 15. For example, when the touch panel driver 14 detects contact by a touch operation or touch slide operation of the user's finger 68 (see FIG. 3(A)), or proximity by a hover operation or hover slide operation, it obtains the contact coordinates (x, y) or proximity coordinates (x, y, z) and outputs this coordinate information to the processor 11, the RAM 22, or the storage unit 23.
  • the contact coordinates (x, y) are referred to as “touch coordinates (x, y)”.
  • the touch panel 15 is mounted on the screen 45 (see FIG. 3A) of the display unit 13, and detects that the user's finger 68 performs a touch operation or a touch slide operation on the horizontal surface of the touch panel 15. Further, the touch panel 15 detects that the user's finger 68 has approached the touch panel 15 by a hover operation or a hover slide operation.
  • The touch panel 15 detects that the finger 68 is close to the touch panel 15 when the finger height z in a hover operation is equal to or less than a predetermined value zth, or when the capacitance determined by the finger height z is equal to or greater than a predetermined value.
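The two equivalent proximity criteria just described (height at or below zth, or capacitance at or above a threshold) can be written as a small check; the threshold values here are hypothetical, since the patent does not give concrete numbers.

```python
from typing import Optional

Z_TH = 20.0   # hypothetical predetermined height zth
C_TH = 5.0    # hypothetical predetermined capacitance threshold

def is_proximate(z: Optional[float] = None,
                 capacitance: Optional[float] = None) -> bool:
    """Detect proximity by either criterion: height z at or below zth,
    or capacitance at or above a threshold. Either measurement may be given."""
    if z is not None and z <= Z_TH:
        return True
    if capacitance is not None and capacitance >= C_TH:
        return True
    return False
```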
  • the power supply control unit 16 is configured using a power supply source (for example, a battery) of the mobile terminal 1, and switches the power supply state of the mobile terminal 1 to an on state or an off state according to an input operation to the touch panel 15. When the power supply is on, the power supply control unit 16 supplies power from the power supply source to each unit shown in FIG. 1 so that the mobile terminal 1 can operate.
  • The communication control unit 17 is configured using a wireless communication circuit; it transmits data resulting from processing by the processor 11 via the transmission/reception antenna 17a, and receives data sent from a base station (not shown) or another communication terminal.
  • FIG. 1 shows only the configuration necessary for describing the embodiments; the mobile terminals 1, 1A, 1B, and 1C in each embodiment may further include components not shown, such as a voice control unit for controlling call voice.
  • FIG. 2 is a block diagram showing a functional configuration of the mobile terminal 1 in each of the first to fifth embodiments.
  • The mobile terminal 1 includes a proximity detection unit 5, a touch detection unit 10, a screen display unit 30, a memory 40, a proximity coordinate extraction unit 51, a touch coordinate extraction unit 52, a state management unit 54, an image button management unit 55, an operation determination unit 56, a display image generation unit 58, an application screen generation unit 59, an image composition unit 60, and an application 65.
  • Application 65 includes a control extension unit 64.
  • the control extension unit 64 includes a state change amount change unit 61, a change target change unit 62, and a state change continuation unit 63.
  • the proximity detector 5 detects a state in which the user's finger is close to the touch panel 15 by a hover operation or a hover slide operation.
  • the proximity detection unit 5 outputs a proximity notification indicating that the finger has approached the touch panel 15 to the proximity coordinate extraction unit 51.
  • the touch detection unit 10 as a contact detection unit detects an operation in which a finger touches the touch panel 15 by a touch operation or a touch slide operation.
  • the touch detection unit 10 outputs a contact notification that a finger has touched the touch panel 15 to the touch coordinate extraction unit 52.
  • The proximity detection unit 5 and the touch detection unit 10 can both be implemented using the touch panel 15; although they are shown as separate blocks in FIG. 2, they may be configured as a single unit.
  • the screen display unit 30 corresponds to the display unit 13 shown in FIG. 1, has a function of displaying data on the screen 45, and displays composite image data output from the image composition unit 60 described later on the screen 45.
  • the composite image data is data obtained by combining the data of the screen of the application 65 (hereinafter simply referred to as “application screen”) and the image data generated by the display image generating unit 58 by the image combining unit 60.
  • The memory 40 corresponds to the RAM 22 or the storage unit 23 shown in FIG. 1 and stores at least an image button database 55a.
  • The image button database 55a stores, for example, screen data and image data used in the application 65, image data generated by the application 65, image data received from a base station (not shown) or another communication terminal, coordinate information of the buttons used in the application 65, and the operation information of the application 65 assigned to each button.
  • The memory 40 may temporarily store the information on the proximity coordinates (x, y, z) extracted by the proximity coordinate extraction unit 51 or the touch coordinates (x, y) extracted by the touch coordinate extraction unit 52. In FIG. 2, the arrows from the proximity coordinate extraction unit 51 and the touch coordinate extraction unit 52 to the memory 40 are omitted to keep the drawing uncluttered.
  • the proximity coordinate extraction unit 51 calculates and extracts proximity coordinates (x, y, z) of the finger with respect to the touch panel 15 based on the proximity notification output from the proximity detection unit 5.
  • The x and y components are coordinate values representing a position on the touch panel 15, and the z component is a coordinate value representing the vertical distance between the finger and the touch panel 15, that is, the height z of the finger above the touch panel 15.
  • the proximity coordinate extraction unit 51 outputs information on the extracted proximity coordinates (x, y, z) to the operation determination unit 56.
  • the touch coordinate extraction unit 52 calculates and extracts touch coordinates (x, y) when a finger touches the touch panel 15 based on the contact notification output from the touch detection unit 10.
  • the touch coordinate extraction unit 52 outputs information on the extracted touch coordinates (x, y) to the operation determination unit 56.
  • The user input operations determined by the operation determination unit 56 include a hover operation, a hover slide operation, a touch operation, a touch slide operation, and combinations of these operations, but are not limited to them.
  • The state management unit 54 receives the operation determination result information (described later) output from the operation determination unit 56. Based on this information, that is, on whether the content of the user's finger input operation is a hover operation, hover slide operation, touch operation, or touch slide operation, the state management unit 54 determines whether the mobile terminal 1 is in the proximity state, the contact state, or a state in which the proximity state and the contact state coexist.
  • Assume that a touch slide operation is performed with a second finger (for example, the thumb 68b; the same applies below) while a hover slide operation is being performed with the user's first finger (for example, the index finger 68a; the same applies below), and that the state management unit 54 acquires from the operation determination unit 56 operation determination result information indicating that a combination of a hover slide operation and a touch slide operation has been performed.
  • The operation of the state management unit 54 when a combination of a hover slide operation and a touch slide operation is used as the user input operation will now be described.
  • In this case, the state management unit 54 determines that the state of the mobile terminal 1 has shifted from the proximity state produced by the hover slide operation to the state in which the proximity state produced by the hover slide operation and the contact state produced by the touch slide operation coexist. The state management unit 54 also notifies the application 65 that the combination of the hover slide operation and the touch operation has been performed.
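The three terminal states tracked by the state management unit 54 (proximity, contact, and the coexistence of the two) can be modeled as a small mapping from which fingers are currently detected; this is an interpretive sketch, and the state names are taken from the text.

```python
def terminal_state(hovering: bool, touching: bool) -> str:
    """Map detected finger conditions to the terminal state named in the text.

    hovering: a finger is detected in the proximity (hover) range.
    touching: a finger is detected in contact with the panel.
    """
    if hovering and touching:
        return "coexistence"   # e.g. hover slide + touch slide combination
    if touching:
        return "contact"
    if hovering:
        return "proximity"
    return "idle"              # hypothetical name for "no finger detected"
```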
  • Together with the information on the slide amount (movement amount) of the combination operation, the state management unit 54 outputs to the display image generation unit 58 a display image generation instruction for generating the display image to be shown on the screen 45. The slide amount (movement amount) of the combination operation is calculated by the operation determination unit 56 and is included in the operation determination result information.
  • The image button management unit 55 reads from or writes to the memory 40 information indicating the display coordinates on the screen 45 of each button constituting the application screen used by the application 65, as well as image data used by the application 65.
  • The operation determination unit 56 receives the proximity coordinate (x, y, z) information output from the proximity coordinate extraction unit 51 or the touch coordinate (x, y) information output from the touch coordinate extraction unit 52.
  • Based on the received proximity coordinate (x, y, z) or touch coordinate (x, y) information, the operation determination unit 56 determines the content of the user's finger input operation, that is, which of a hover operation, hover slide operation, touch operation, or touch slide operation was performed, or which operations were combined.
  • The operation determination unit 56 determines whether a combination of a hover slide operation with the user's first finger (index finger 68a) and a touch slide operation with the user's second finger (thumb 68b) has been performed in the same direction and over the same distance. It also calculates the movement amount (slide amount) of the first and second fingers in the combination operation, and likewise calculates the finger movement amount (slide amount) when a hover slide operation or touch slide operation is performed on its own.
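The check that the two fingers moved "in the same direction and over the same distance" can be sketched with simple vector comparisons. The tolerances below are assumptions; the patent only states the same-direction, same-distance condition, not how strictly it is evaluated.

```python
import math

def is_combination_slide(hover_vec, touch_vec,
                         angle_tol_deg=15.0, dist_tol_ratio=0.2):
    """Return True when the two slide vectors roughly match in direction and length.

    hover_vec / touch_vec: (dx, dy) movement of the hovering and touching finger.
    angle_tol_deg and dist_tol_ratio are hypothetical tolerances.
    """
    hx, hy = hover_vec
    tx, ty = touch_vec
    h_len = math.hypot(hx, hy)
    t_len = math.hypot(tx, ty)
    if h_len == 0 or t_len == 0:
        return False
    # Angle between the two movement vectors, from the normalized dot product
    cos_a = (hx * tx + hy * ty) / (h_len * t_len)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    same_dir = angle <= angle_tol_deg
    # Relative difference in slide distance
    same_dist = abs(h_len - t_len) / max(h_len, t_len) <= dist_tol_ratio
    return same_dir and same_dist
```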
  • When the operation determination unit 56 determines that the hover slide operation by the user's first finger (index finger 68a) and the touch slide operation by the same user's second finger (thumb 68b) have been performed on the same screen 45 in the same direction, it outputs to the state management unit 54, as operation determination result information, information indicating that the combination of the hover slide operation and the touch slide operation has been performed in the same direction, together with the slide amount (movement amount).
  • Based on the display image generation instruction output from the state management unit 54 and the information on the display range of the screen 45 output from the application screen generation unit 59, the display image generation unit 58 acquires the image data of the application 65 via the image button management unit 55. It generates the display image to be shown on the screen 45 by cutting out, from the acquired image data, the image range corresponding to the display image generation instruction, and outputs the generated display image to the image composition unit 60.
  • the application screen generation unit 59 acquires various data necessary for generating a screen in the application 65 from the memory 40 via the image button management unit 55 based on the screen generation instruction output from the application 65.
  • the application screen generation unit 59 generates screen data of the application screen in the application 65 using the acquired various data.
  • the application screen generation unit 59 outputs the generated image data to the image composition unit 60.
  • The application screen generation unit 59 and the application 65 are shown as separate blocks; however, since the application 65 can provide the function of the application screen generation unit 59, the two may be configured as a single application 65.
  • the image composition unit 60 synthesizes the display image data output from the display image generation unit 58 and the screen data of the application screen output from the application screen generation unit 59 and causes the screen display unit 30 to display them.
  • When the state change amount changing unit 61 acquires from the state management unit 54 the information that the combination of the hover slide operation and the touch slide operation has been performed in the same direction and over the same distance, it changes the state change amount of the operation that was being executed in the application 65 immediately before the combination operation.
  • As an example of changing the state change amount of an operation, the state change amount changing unit 61 increases or decreases the movement amount (slide amount) or movement speed (slide speed) of the map image 47 displayed on the screen 45.
  • when the change target changing unit 62 acquires, from the state management unit 54, information that a combination operation of a hover slide operation and a touch slide operation has been performed in the same direction and over the same distance, it changes the change target of the operation (for example, the output sound of a video) that was being executed in the application 65 immediately before the combination operation was performed to another change target.
  • as an example of changing the change target of an operation, the change target changing unit 62 changes the output sound of a video to the playback speed of the video.
  • when the state change continuation unit 63 acquires, from the state management unit 54, information that a combination operation of a hover slide operation and a touch hold operation has been performed, it maintains the change state of the operation that was being executed in the application 65 immediately before the combination operation was performed, and continues that maintained state.
  • as an example of continuing a maintained change state, suppose that a hover slide operation moves the map image 47 within the screen 45 by twice the slide amount of the finger. In the combination operation of the hover slide operation and the touch hold operation, the hover slide operation settles into a hover operation, but as long as the touch hold operation continues, the same change as during the hover slide operation, i.e., movement of the map image 47 within the screen 45 by twice the finger slide amount, is continued.
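The continuation behaviour of the state change continuation unit 63 can be sketched as follows (a minimal illustration with assumed names and units; the 2x factor mirrors the hover-slide example above):

```python
# Sketch of the "continue the change" behaviour: while the touch hold
# lasts, the map keeps advancing by twice the remembered per-tick hover
# slide amount, just as it did during the hover slide itself.
# All names, units, and the tick model are illustrative assumptions.

def continued_position(start: float, slide_per_tick: float,
                       held_ticks: int) -> float:
    """Advance the map position for each tick the touch hold is maintained."""
    return start + 2.0 * slide_per_tick * held_ticks
```

For example, a remembered slide of 5 px per tick held for 3 ticks moves the map a further 30 px, the same rate as the preceding hover slide.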
  • FIG. 3A is a diagram illustrating a state where the hover slide operation is performed independently.
  • FIG. 3B is a diagram illustrating a state in which a combination operation of a hover slide operation and a touch slide operation is performed.
  • after the user's finger 68 shown in FIG. 3A starts a hover slide operation on the touch panel 15 (screen 45), another finger (thumb 68b) touches the touch panel 15 as shown in FIG. 3B and performs a touch slide operation over the same distance and in the same direction as the hover slide operation of the index finger 68a.
  • the mobile terminal 1 changes the state change amount of the operation of the application 65 according to the hover slide operation immediately before the combination operation of the hover slide operation and the touch slide operation is performed.
  • the finger 68 shown in FIG. 3A is the same as the index finger 68a shown in FIG. 3B, and is different from the thumb 68b shown in FIG. 3B. In the following description, the finger that performs a hover operation or hover slide operation is described as the index finger 68a, and the finger that performs a touch operation or touch slide operation is described as the thumb 68b; however, the present invention is not particularly limited to this assignment.
  • in FIG. 3A, the hover slide operation is performed with the finger 68 alone.
  • the map image 47 displayed on the screen 45 moves (also referred to as “scrolls”) within the screen 45; by this movement, the content of the image displayed in the screen 45 is scrolled and switched.
  • by the hover slide operation of the finger 68, the map image 47 displayed on the screen 45 is moved (scrolled) within the screen 45 in the same direction as the hover slide operation (the direction of the arrow b) by a distance (the length of the arrow b) that is, for example, twice the movement amount of the finger 68 (the hover slide amount, the length of the arrow a).
  • by the combination operation of the hover slide operation of the index finger 68a and the touch slide operation of the thumb 68b, the map image 47 displayed in the screen 45 is moved in the direction of the arrow b by the same distance as the hover slide amount of the index finger 68a.
  • the position touched by the thumb 68b is represented by a circle 50 (see dotted line), and the same applies to the following embodiments.
  • FIG. 4 is a flowchart for explaining the operation procedure of the mobile terminal 1 in accordance with the combination operation of the hover slide operation and the touch slide operation in the first embodiment.
  • the flowchart shown in FIG. 4 represents the operation procedure of the mobile terminal 1 in response to an input operation on the screen 45 (touch panel 15) shown in FIGS. 3 (A) and 3 (B).
  • the state management unit 54 determines whether or not a hover slide operation is being performed with the user's index finger 68 based on the operation determination result information from the operation determination unit 56 (S1). When it is determined that the hover slide operation is being performed, the operation of the mobile terminal 1 proceeds to step S2.
  • the state management unit 54 determines that the hover slide operation is being performed with the user's index finger 68a (S1, YES), based on the operation determination result information from the operation determination unit 56, the user's thumb It is determined whether a touch slide operation is performed by 68b (S2).
  • in step S2, the state management unit 54 determines, based on the operation determination result information of the operation determination unit 56, whether the hover slide operation by the user's index finger 68a and the touch slide operation by the thumb 68b are in the same direction and performed over the same distance.
  • when the state management unit 54 determines that a touch slide operation is not being performed with the user's thumb 68b (S2, NO), it outputs (notifies) to the application 65 that the combination operation of the hover slide operation and the touch slide operation has not been performed.
  • the state change amount changing unit 61 initializes the state change amount of the operation of the application 65 corresponding to the hover slide operation determined to be executed in step S1 (S3).
  • here, the map image 47 is displayed in the screen 45 of the display unit 13 of the mobile terminal 1, the state change amount in step S3 is the movement amount (slide amount or scroll amount) of the map image 47, and its initial value is twice the hover slide amount.
  • that is, the state change amount changing unit 61 initializes the movement amount (state change amount) by which the map image 47 is moved within the screen 45 according to the movement amount (hover slide amount) of the finger 68 when the hover slide operation (see FIG. 3A) is performed alone.
  • when the state management unit 54 determines that a touch slide operation is being performed with the user's thumb 68b (S2, YES), it outputs (notifies) to the application 65 that the combination operation of the hover slide operation and the touch slide operation has been performed.
  • the state change amount changing unit 61 changes the state change amount of the operation of the application 65 corresponding to the hover slide operation determined in step S1 to, for example, the same magnitude (1x) as the hover slide amount (S4).
  • as a result, the movement amount of the index finger 68a becomes equal to the movement amount of the map image 47 displayed in the screen 45. With the hover slide operation alone, the map image 47 is scrolled by twice the movement amount of the finger, but with the combination operation of the hover slide operation and the touch slide operation, the map image can be scrolled by the same amount as the movement of the index finger 68a performing the hover slide operation, so fine adjustment of the scrolling of the map image 47 can be performed easily.
  • after step S3 or S4, the operation of the mobile terminal 1 proceeds to step S5.
  • the operation determination unit 56 calculates the slide amount of the hover slide operation determined to be executed in step S1 based on the operation determination instruction output from the state management unit 54 (S5).
  • the operation determination unit 56 outputs information on the slide amount as a calculation result to the state management unit 54.
  • the state management unit 54 multiplies the slide amount output from the operation determination unit 56 by the change amount obtained in step S3 or S4, and calculates the movement amount (scroll amount) of the map image 47 in the screen 45 corresponding to the slide amount calculated in step S5 (S6).
  • the state management unit 54 outputs a display image generation instruction including information on the calculated movement amount (scroll amount) of the map image to the display image generation unit 58.
  • based on the display image generation instruction output from the state management unit 54, the display image generation unit 58 generates a display image of the map image 47 after it has moved within the screen 45 by the movement amount (scroll amount) calculated in step S6.
  • the image composition unit 60 synthesizes the display image data output from the display image generation unit 58 and the screen data of the application screen output from the application screen generation unit 59, and causes the screen display unit 30 to display them (S8). After step S8, the operation of the portable terminal 1 returns to the process of step S1.
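The S1–S8 loop of FIG. 4 can be condensed into a small sketch. The booleans stand in for the determination results of the operation determination unit 56, and the function name, the 2x/1x factors as concrete values, and the returned scroll amount are assumptions for illustration:

```python
# Sketch of the FIG. 4 flow: S1 checks for a hover slide; S2 checks for a
# simultaneous touch slide; S3 initializes the factor (2x for a lone hover
# slide), S4 sets it to 1x for the combination operation; S5-S6 multiply
# the hover slide amount by the factor to get the scroll amount.

def process_input(hover_sliding: bool, touch_sliding: bool,
                  hover_slide_amount: float) -> float:
    """Return the scroll amount of the map image for one pass of the loop."""
    if not hover_sliding:          # S1 NO: nothing to scroll
        return 0.0
    if touch_sliding:              # S2 YES -> S4: combination, factor 1x
        factor = 1.0
    else:                          # S2 NO -> S3: lone hover slide, factor 2x
        factor = 2.0
    return hover_slide_amount * factor   # S5-S6: slide amount x factor
```

With an 8 px hover slide, the map scrolls 16 px alone and 8 px during the combination operation, which is the fine-adjustment behaviour described above.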
  • as described above, in the mobile terminal 1 of the present embodiment, when the hover slide operation performed alone moves the map image 47 within the screen 45 by twice the slide amount, performing the combination operation of a hover slide operation and a touch slide operation with two fingers makes the movement (scrolling) of the map image 47 within the screen 45 equivalent to the slide amount of the hover slide operation.
  • the mobile terminal 1 can thereby finely adjust the movement (scrolling) of the map image 47 in the screen 45 from twice the hover slide amount down to the same amount; according to the user's input operation on the touch panel, the control content for the content (map image) can be selected efficiently, giving the user comfortable operability.
  • the movement amount in the screen 45 of the map image 47 when the hover slide operation is performed alone has been described as being twice the hover slide amount of the hover slide operation, but is not particularly limited to twice.
  • as the state change amount, the movement amount of the map image in the screen 45 has been described, but other state change amounts can be applied similarly.
  • in the second embodiment, when the combination operation is performed, the change target of an operation in the application 65 is changed to another change target.
  • examples of changing the change target of an operation in the application 65 include changing the luminance, which was changed according to the hover slide amount of the hover slide operation, to saturation, or, during playback of video data, changing the volume (output sound), which was changed according to the hover slide amount of the hover slide operation, to the playback speed.
  • switching of the change target of an operation in the application 65 is not limited to these examples.
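A minimal sketch of this change-target switching; the dictionary and the target names are illustrative stand-ins, with the pairs mirroring the examples above (luminance to saturation, volume to playback speed):

```python
# Each current change target maps to the alternate target that the
# combination operation switches to; unknown targets are left unchanged.
# The table and target names are assumptions for the example.

CHANGE_TARGET_ALTERNATES = {
    "luminance": "saturation",
    "volume": "playback_speed",
}

def switch_change_target(current: str) -> str:
    """Return the alternate change target for the combination operation."""
    return CHANGE_TARGET_ALTERNATES.get(current, current)
```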
  • since the portable terminal of the second embodiment has a structure similar to that of the portable terminal 1 of the first embodiment, the same reference numerals are used for the same components, their description is omitted, and only the differing contents are described.
  • FIG. 5 is a flowchart for explaining the operation procedure of the mobile terminal 1 according to the combination operation of the hover slide operation and the touch slide operation in the second embodiment.
  • description of the same operations as in the first embodiment is omitted.
  • when the state management unit 54 determines that a touch slide operation is not being performed with the user's thumb 68b (S2, NO), it outputs (notifies) to the application 65 that the combination operation of the hover slide operation and the touch slide operation has not been performed.
  • the change target changing unit 62 initializes the change target of the operation of the application 65 corresponding to the hover slide operation determined to be executed in step S1 (S3A).
  • in step S3A, the change target changing unit 62 initializes the change target according to the hover slide operation of the finger 68 performed alone (see FIG. 3A).
  • for example, the change target changing unit 62 initializes the change target according to the hover slide operation of the finger 68 from the current “selection processing of the video data list to be played back” to “volume adjustment processing during playback”. Note that the initialized processing content is preferably determined according to the application 65; “volume adjustment processing during playback” is merely an example.
  • when the state management unit 54 determines that a touch slide operation is being performed with the user's thumb 68b (S2, YES), it outputs (notifies) to the application 65 that the combination operation of the hover slide operation and the touch slide operation has been performed.
  • the change target changing unit 62 changes the change target of the operation of the application 65 corresponding to the hover slide operation determined to be executed in step S1 to another change target (S4A).
  • in step S4A, the change target changing unit 62 changes the change target according to the hover slide operation of the finger 68 to another change target when a combination operation of the hover slide operation and the touch slide operation is performed.
  • for example, the change target changing unit 62 changes the change target according to the hover slide operation of the finger 68 from the current “selection processing of the video data to be played back” to “double-speed playback processing of the video data”.
  • after step S3A or S4A, the operation of the mobile terminal 1 proceeds to step S5.
  • the operation determination unit 56 calculates the slide amount of the hover slide operation determined to be executed in step S1 based on the operation determination instruction output from the state management unit 54 (S5).
  • the operation determination unit 56 outputs information on the slide amount as a calculation result to the state management unit 54.
  • based on the information on the slide amount output from the operation determination unit 56, the state management unit 54 outputs, to the change target changing unit 62, a change target reflection instruction for reflecting, in the operation of the application 65, the change according to the slide amount for the change target obtained in step S3A or S4A.
  • the change target changing unit 62 Based on the change target reflection instruction output from the state management unit 54, the change target changing unit 62 reflects the change according to the slide amount with respect to the change target obtained in step S3A or S4A to the operation of the application 65. (S8A). After step S8A, the operation of the mobile terminal 1 returns to step S1.
  • as described above, in the mobile terminal 1, when, for example, the list of video data to be played back is being selected by a hover slide operation alone, performing the combination operation of a hover slide operation and a touch slide operation with two fingers changes the selection processing of the video data list to the volume adjustment processing for the video data currently being played back.
  • the portable terminal 1 can easily change the change target of an operation in the application 65 by a combination operation of the hover slide operation and the touch slide operation, and can give comfortable operability to the user.
  • the portable terminal of the third embodiment has the same configuration as that of the first embodiment, the same reference numerals are used for the same components as those of the first embodiment, and description thereof is omitted and different. The contents will be described.
  • FIG. 6 is a diagram illustrating a combination operation of a hover slide operation and a touch slide operation in the third embodiment.
  • FIG. 6A is a diagram illustrating a state at the start of the combination operation.
  • FIG. 6B is a diagram illustrating a state in which the inter-finger spacing between the finger performing the hover slide operation (index finger 68a) and the finger performing the touch slide operation (thumb 68b) is shortened after the combination operation is started.
  • FIG. 6C is a diagram illustrating a state where the locus of the combination operation is a circle. Also in this embodiment, the state change amount of the operation in the application 65 is the amount of movement in the screen 45 of the map image 47.
  • in FIG. 6A, assume a state in which the index finger 68a is in proximity to the touch panel 15 and the thumb 68b is touching (in contact with) it.
  • assume that the combination operation of the hover slide operation and the touch slide operation is performed in the direction of the arrow a1 using the index finger 68a and the thumb 68b, while the distance m between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the position of the thumb 68b on the touch panel 15 (screen 45) is maintained.
  • the movement amount (slide amount or scroll amount) by which the map image 47 moves within the screen 45 is maintained at a constant movement amount (see arrow b1). That is, the amount of movement of the map image 47 within the screen 45 is the amount of movement equivalent to the hover slide amount of the hover slide operation of the index finger 68a.
  • when the inter-finger spacing is shortened as shown in FIG. 6B, the movement amount (slide amount) by which the map image 47 moves within the screen 45 increases, unlike in the first embodiment (see arrow b2). That is, the movement amount of the map image 47 in the screen 45 becomes, for example, twice the hover slide amount of the hover slide operation of the index finger 68a.
  • the hover slide operation and the combination operation of the hover slide operation and the touch slide operation are not limited to operations that draw a linear locus (see FIGS. 6A and 6B); they may be operations that draw a circular locus (see FIG. 6C).
  • also in this case, the distance between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the position of the thumb 68b on the touch panel 15 (screen 45) may be maintained, widened, or narrowed, and the state change amount of the operation in the application 65 is changed according to that interval.
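One way the spacing-dependent state change amount could be realized is a mapping from inter-finger spacing to a scroll multiplier. The linear mapping and its constants below are assumptions; the embodiment only requires that the amount follow the spacing (here, a shorter spacing gives a larger factor, matching FIG. 6B):

```python
# Sketch of the third embodiment's scaling rule: maintaining the spacing
# keeps the factor at 1x, and shortening the spacing raises it toward an
# assumed 2x cap. BASE_SPACING and MAX_FACTOR are illustrative values.

BASE_SPACING = 100.0  # assumed spacing (px) at which the factor is 1x
MAX_FACTOR = 2.0      # assumed cap on the multiplier

def scroll_factor(spacing_px: float) -> float:
    """Map inter-finger spacing to a scroll multiplier in [1.0, MAX_FACTOR]."""
    if spacing_px <= 0:
        return MAX_FACTOR
    return min(MAX_FACTOR, max(1.0, BASE_SPACING / spacing_px))
```

Halving the spacing from 100 px to 50 px doubles the factor; widening it beyond 100 px leaves the factor at 1x. An inversely proportional mapping like this is only one option, as the text notes.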
  • by a combination operation that draws a circular locus as in FIG. 6C, the mobile terminal 1 can, for example, adjust the volume during playback of video data according to the rotation amount, or cyclically switch and display a plurality of thumbnails displayed in the screen 45.
  • FIG. 7 is a flowchart for explaining the operation procedure of the mobile terminal 1 according to the combination operation of the hover slide operation and the touch slide operation in the third embodiment.
  • the flowchart shown in FIG. 7 represents an operation procedure of the mobile terminal 1 in response to an input operation on the screen 45 (touch panel 15) shown in FIGS. 6 (A) and 6 (B), for example.
  • description of the same operations as in the first embodiment is omitted.
  • when the state management unit 54 determines that a touch slide operation is not being performed with the user's thumb 68b (S2, NO), it outputs (notifies) to the application 65 that the combination operation of the hover slide operation and the touch slide operation has not been performed.
  • the state change amount changing unit 61 initializes the state change amount of the operation of the application 65 according to the hover slide operation determined to be executed in step S1 (S3).
  • when the state management unit 54 determines that a touch slide operation is being performed with the user's thumb 68b (S2, YES), it determines whether the hover slide operation of the index finger 68a and the touch slide operation of the thumb 68b are in the same direction, and whether the interval between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the position of the thumb 68b on the touch panel 15 (screen 45) is within a predetermined specified value (S2A, S2B). In FIGS. 4 and 5, illustration of the operations of steps S2A and S2B is omitted, but the contents of steps S2A and S2B shown in FIG. 7 are included in step S2.
  • when the state management unit 54 determines that the hover slide operation of the index finger 68a and the touch slide operation of the thumb 68b are in the same direction, and that the distance between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the position of the thumb 68b on the touch panel 15 (screen 45) is within the specified value (S2A YES, S2B YES), it outputs (notifies) to the application 65 that the combination operation of the hover slide operation and the touch slide operation is in the same direction and the inter-finger interval is within the specified value.
  • the state change amount changing unit 61 changes the state change amount of the operation of the application 65 according to the hover slide operation determined in step S1 to a state change amount according to the inter-finger interval (S4B; see FIG. 6B).
  • the specified value used in the determination in step S2B defines a range in which the positions of the index finger 68a and the thumb 68b are considered appropriate (the touch acceptable range), and is determined by relative coordinates originating at the position on the touch panel 15 (screen 45) vertically below the index finger 68a (see FIG. 8).
  • FIG. 8 is a diagram showing a touch acceptable range 45a set in accordance with the position of the index finger 68a on the screen 45. As shown in FIG.
  • the touch acceptable range 45a is the range (for example, a circular or elliptical range) within which the distance (interval) between the position on the touch panel 15 (screen 45) vertically below the hovering index finger 68a and the position of the touching thumb 68b on the touch panel 15 (screen 45) is within the specified value.
  • the position of the index finger 68a, that is, the position on the touch panel 15 (screen 45) vertically below the hovering index finger 68a, is the position of the circle 50A shown in FIG. 8.
  • when the touch position of the thumb 68b is included in the touch acceptable range 45a (see the hatched portion in FIG. 8), it is determined that the inter-finger interval between the index finger 68a and the thumb 68b is within the specified value.
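The touch acceptable range test of FIG. 8 amounts to a distance check against the specified value, centred on the point vertically below the hovering finger. A hedged sketch, assuming a circular range and an illustrative radius:

```python
import math

# The thumb's touch is accepted only if it lands within a circle of
# radius SPECIFIED_VALUE centred on the point directly below the
# hovering index finger. The radius value is an assumption.

SPECIFIED_VALUE = 80.0  # assumed specified value (px)

def touch_within_range(hover_x: float, hover_y: float,
                       touch_x: float, touch_y: float) -> bool:
    """True if the inter-finger spacing is within the specified value."""
    return math.hypot(touch_x - hover_x, touch_y - hover_y) <= SPECIFIED_VALUE
```

An elliptical range, as the text also allows, would replace the single radius with separate x and y semi-axes.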
  • in step S2B, if the inter-finger spacing exceeds the specified value, the operation of the mobile terminal 1 proceeds to step S3 on the assumption that the user may have performed an erroneous operation. Since the processing from step S5 onward is the same as in the first embodiment, its description is omitted.
  • as described above, the mobile terminal 1 can increase the movement amount (state change amount) of the map image in the screen 45 when the inter-finger interval between the hovering index finger 68a and the touch-sliding thumb 68b is reduced, and can reduce it when the inter-finger interval is widened, so fine adjustment of the state change amount can be performed easily.
  • the inter-finger interval and the increase or decrease in the magnitude of the state change amount may be in a directly proportional or inversely proportional relationship.
  • for example, when the inter-finger interval is reduced, the movement amount (state change amount) of the map image in the screen 45 may instead be decreased.
  • the movement amount (state change amount) of the map image in the screen 45 may also be changed further from the time when the inter-finger interval is changed.
  • by requiring that the sliding directions of the index finger 68a performing the hover slide operation and the thumb 68b performing the touch slide operation be the same, or by providing the touch acceptable range 45a, the mobile terminal 1 can eliminate erroneous operations by the user and improve user operability.
  • the mobile terminal 1 may determine the touch acceptable range 45a of the touch operation after determining the user's dominant hand. For example, given touch coordinates (x1, y1) and proximity coordinates (x2, y2, z2), the user's dominant hand may be determined according to whether the slope of the straight line through the touch coordinates (x1, y1) and the projection (x2, y2) of the proximity coordinates onto the xy plane is positive or negative. In this case, the mobile terminal 1 determines that the user is right-handed if the slope of the straight line is in the rightward monotonically increasing direction (positive direction), and left-handed if it is in the leftward monotonically increasing direction (negative direction).
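The dominant-hand heuristic can be sketched as a sign test on the slope of the line through the touch point and the xy-projection of the proximity point. The coordinate convention and the returned labels are assumptions for illustration:

```python
# Touch at (x1, y1), hover projection at (x2, y2): a positive slope
# (up-and-to-the-right) is read as right-handed, a negative slope as
# left-handed. Degenerate lines give no usable sign.

def dominant_hand(x1: float, y1: float, x2: float, y2: float) -> str:
    """Classify the dominant hand from the sign of the line's slope."""
    if x2 == x1:
        return "unknown"  # vertical line: slope undefined
    slope = (y2 - y1) / (x2 - x1)
    if slope > 0:
        return "right"
    if slope < 0:
        return "left"
    return "unknown"      # horizontal line: no sign to read
```

Note that on displays whose y axis points downward, the sign convention would be inverted; this sketch assumes a mathematical (y-up) convention.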
  • the setting of the touch acceptable range 45a can also be applied to the first and second embodiments described above, and may likewise be applied in each of the following embodiments.
  • since the portable terminal of the fourth embodiment has a structure similar to that of the portable terminal 1 of the first embodiment, the same reference numerals are used for the same components, their description is omitted, and only the differing contents are described.
  • FIG. 9 is a diagram illustrating a combination operation of a hover slide operation and a touch hold operation in the fourth embodiment.
  • FIG. 9A is a diagram illustrating a state where the hover slide operation is performed independently.
  • FIG. 9B is a diagram illustrating a state where a combination operation of a hover slide operation and a touch hold operation is performed.
  • after the index finger 68a shown in FIG. 9A starts a hover slide operation on the touch panel 15 (screen 45), the thumb 68b touches the touch panel 15 as shown in FIG. 9B and the touch hold operation is continued.
  • then, even though the index finger 68a has settled into a hover operation, the map image 47 displayed in the screen 45 continues to move (scroll), scrolling in the direction in which the hover slide operation was performed (see reference sign d).
  • FIG. 10 is a flowchart for explaining the operation procedure of the mobile terminal 1 according to the combination operation of the hover slide operation and the touch hold operation in the fourth embodiment.
  • the flowchart shown in FIG. 10 represents an operation procedure of the mobile terminal 1 in response to an input operation on the screen 45 (touch panel 15) shown in FIGS. 9A and 9B.
  • the state management unit 54 determines whether or not a hover slide operation is being performed with the user's index finger 68 based on the operation determination result information from the operation determination unit 56 (S11). When it is determined that the hover slide operation is performed, the operation of the mobile terminal 1 proceeds to step S12.
  • the state management unit 54 outputs (notifies) that the hover slide operation is being performed to the application 65.
  • the state change amount changing unit 61 executes a change process in the operation of the application 65 according to the hover slide operation without changing the state change amount of the operation of the application 65 according to the hover slide operation (S12).
  • in step S12, as an example of the operation of the application 65 in response to the hover slide operation, the mobile terminal 1 scrolls the map image 47 displayed in the screen 45 based on the hover slide amount of the hover slide operation.
  • the state change amount changing unit 61 holds the state change amount (change condition) of the operation of the application 65 according to the hover slide operation in the memory 40 (S13).
  • the change condition may be a state change amount of the operation of the application 65 according to the hover slide operation or a change target of the operation of the application 65 according to the hover slide operation.
  • the state management unit 54 determines whether or not the hover slide operation has stopped and the state of the hover operation has been entered based on the operation determination result information from the operation determination unit 56 (S14). When it is determined that the hover slide operation is not stopped, the operation of the mobile terminal 1 returns to the operation of step S12.
  • the state management unit 54 determines whether the touch hold operation is performed simultaneously with the stop of the hover slide operation (S15).
  • the term “simultaneously” does not require that the stop of the hover slide operation and the detection of the touch hold operation coincide exactly; it may include, for example, a case where the touch hold operation is detected immediately before the hover slide operation stops.
  • when it is determined that the touch hold operation has not been performed simultaneously (S15, NO), the state management unit 54 stops the change processing of the operation of the application 65 according to the hover slide operation (S16); for example, the mobile terminal 1 stops the scrolling of the map image 47 in the screen 45 executed in step S12 (S16). After step S16, the operation of the portable terminal 1 returns to step S11.
  • when the state management unit 54 determines that the touch hold operation has been performed simultaneously (S15, YES), the change processing of the operation of the application 65 according to the hover slide operation is continued (S17).
  • that is, even when the hover slide operation has stopped and become a hover operation, if the touch hold operation is performed simultaneously with the stop of the hover slide operation, the mobile terminal 1 can automatically continue the screen scrolling that was performed while the hover slide operation was performed alone.
  • the state management unit 54 determines whether or not the touch hold operation is continued based on the information on the touch coordinates (x, y) output from the touch coordinate extraction unit 52 (S18). When it is determined that the touch hold operation is continued (S18, YES), the operation of the mobile terminal 1 returns to the operation of step S17.
  • when it is determined that the touch hold operation is not continued (S18, NO), the state management unit 54 stops the change processing of the operation of the application 65 according to the hover slide operation (S19); for example, the mobile terminal 1 stops the scrolling of the map image 47 in the screen 45 executed in step S12 (S19).
  • as described above, the mobile terminal 1 can easily scroll the map image 47 in the screen 45 using the combination operation of a hover slide operation and a touch hold operation. Further, since the user does not need to continue the hover slide operation during the touch hold operation and only needs to hold the hover operation, no malfunction occurs due to detection of the return motion of the finger that would arise if the hover slide operation were performed continuously.
  • the mobile terminal 1 of the present embodiment is not limited to continuing the movement state (screen scroll state) of the map image 47 within the screen 45; for example, the movement amount (state change amount, i.e., screen scroll amount) of the map image 47 within the screen 45 may also be continued (maintained). For example, if a fast hover slide operation is performed with the index finger 68a, the portable terminal 1 continues the change corresponding to the fast hover slide operation while the thumb 68b is touch-held, and if a slow hover slide operation is performed with the index finger 68a, it can continue the change corresponding to the slow hover slide operation during the touch hold by the thumb 68b.
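The S12–S19 decision of whether scrolling is active in FIG. 10 reduces to a small predicate; the function and parameter names are illustrative:

```python
# Sketch of the FIG. 10 decision: while the hover slide continues (S12)
# the scroll is driven directly; once it stops, the scroll survives only
# for as long as a touch hold is maintained (S15/S17-S18), and stops
# otherwise (S16/S19).

def scrolling_active(hover_sliding: bool, touch_held: bool) -> bool:
    """Return whether the map image should keep scrolling this tick."""
    if hover_sliding:      # S12: the hover slide itself drives the scroll
        return True
    return touch_held      # S15/S17-S18: hold sustains it; S16/S19: stop
```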
  • Since the mobile terminal of the fifth embodiment has the same configuration as the mobile terminal 1 of the fourth embodiment, the same reference numerals are used for the components identical to those of the mobile terminal 1 of the fourth embodiment, their description is omitted, and only the differing contents are described.
  • FIG. 11 is a diagram illustrating a combination operation of a hover slide operation and a touch hold operation according to the fifth embodiment.
  • FIG. 11A is a diagram illustrating a state where a combination operation of a hover slide operation and a touch hold operation is performed.
  • FIG. 11B is a diagram illustrating a state in which the inter-finger interval between the finger performing the hover operation and the finger performing the touch hold operation after the start of the combination operation is shortened.
  • Assume that, because the thumb 68b is touching the screen 45, the movement (scrolling) of the map image 47 within the screen 45 is continuing.
  • When the inter-finger spacing between the position on the touch panel 15 (screen 45) vertically below the index finger 68a and the touch position of the thumb 68b on the touch panel 15 (screen 45) changes, the state change amount of the operation of the application 65 increases or decreases accordingly (see FIG. 11B). As the state change amount, for example, the amount of movement of the map image 47 within the screen 45 increases.
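One way to realize "the state change amount increases or decreases with the inter-finger spacing" is to scale the per-step scroll amount by the ratio of the initial gap to the current gap, so that narrowing the gap (as in FIG. 11B) enlarges the movement. Both the mapping direction and the formula are assumptions for illustration; the patent only states that the amount increases or decreases.

```python
def scaled_scroll_amount(base_amount, initial_gap, current_gap):
    """Scale the scroll amount by the change in the gap between the
    hovering index finger and the touching thumb.

    Assumed mapping: narrower gap -> larger amount (consistent with
    FIG. 11(B)); wider gap -> smaller amount.
    """
    if current_gap <= 0:
        raise ValueError("finger gap must be positive")
    return base_amount * (initial_gap / current_gap)
```

For example, halving the gap doubles the movement amount, while doubling the gap halves it; a gap equal to the initial one leaves the amount unchanged.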
  • FIG. 12 is a flowchart illustrating an operation procedure of the mobile terminal 1 according to a combination operation of the hover slide operation and the touch hold operation in the fifth embodiment.
  • FIG. 12A shows the overall operation.
  • FIG. 12B is a diagram showing details of the change continuation process.
  • the flowchart shown in FIG. 12 represents an operation procedure of the mobile terminal 1 in response to an input operation on the screen 45 (touch panel 15) shown in FIGS. 11 (A) and 11 (B).
  • Description of operations identical to those of the fourth embodiment is omitted.
  • In step S17A, change continuation processing is performed by the state management unit 54 and the control extension unit 64 (S17A). Details of the change continuation processing will be described with reference to FIG. 12B.
  • In the change continuation processing shown in FIG. 12B, the state management unit 54 determines, based on the operation determination result information from the operation determination unit 56, whether or not the touch position of the thumb 68b has moved (S21).
  • When the touch position of the thumb 68b has moved so as to widen the inter-finger spacing (S21), the state management unit 54 outputs to the application 65 information indicating that the state change amount of the operation of the application 65 in response to the hover slide operation is to be reduced.
  • the state change amount changing unit 61 reduces the state change amount of the operation of the application 65 in response to the hover slide operation based on the information output from the state management unit 54 (S22). After step S22, the operation of the mobile terminal 1 proceeds to step S24.
  • When the touch position of the thumb 68b has moved so as to narrow the inter-finger spacing (S21), the state management unit 54 outputs to the application 65 information indicating that the state change amount of the operation of the application 65 in response to the hover slide operation is to be increased.
  • the state change amount changing unit 61 increases the state change amount of the operation of the application 65 in response to the hover slide operation based on the information output from the state management unit 54 (S23). After step S23, the operation of the mobile terminal 1 proceeds to step S24.
  • When there is no movement of the touch position, that is, when the touch hold operation is continued at the position touched by the thumb 68b (S21, none), the operation of the mobile terminal 1 proceeds to the process of step S24.
  • In step S24, the state management unit 54 continues the process of changing the operation of the application 65 in response to the hover slide operation, in accordance with the change condition held in step S13 (S24).
  • Even when the hover slide operation is stopped and only a stationary hover operation is performed, the mobile terminal 1 can automatically continue the screen scrolling that the hover slide operation alone had been performing, provided that the touch hold operation is performed simultaneously with the stop of the hover slide operation.
  • the state management unit 54 determines whether or not the touch hold operation is continued based on the operation determination result information from the operation determination unit 56 (S25). When it is determined that the touch hold operation is continued (S25, YES), the operation of the mobile terminal 1 returns to the operation of step S21. When it is determined that the touch hold operation is not continued (S25, NO), the change continuation process of the mobile terminal 1 is finished, and the operation of the mobile terminal 1 proceeds to step S19.
  • In step S19, the state management unit 54 stops the process of changing the operation of the application 65 in response to the hover slide operation (S19). For example, the mobile terminal 1 stops the scrolling of the map image 47 within the screen 45 that was started in step S12 (S19).
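The change-continuation loop of steps S21 to S25 can be sketched as follows. The event encoding ('narrow', 'widen', 'none', 'release') and the scaling factors are invented for illustration; the patent specifies only that the amount is increased, decreased, or kept while the touch hold continues.

```python
def change_continuation(events, amount):
    """Sketch of the change-continuation loop (S21-S25).

    events: sequence of thumb events, one per loop iteration:
      'narrow'  - thumb moved toward the index finger
      'widen'   - thumb moved away from it
      'none'    - touch held without moving
      'release' - touch hold ended
    Returns the scroll amount applied on each iteration (S24).
    """
    applied = []
    for ev in events:
        if ev == 'release':      # touch hold ended (S25, NO) -> stop (S19)
            break
        if ev == 'narrow':
            amount *= 1.25       # increase the state change amount (S23)
        elif ev == 'widen':
            amount *= 0.8        # decrease the state change amount (S22)
        applied.append(amount)   # continue the change at this amount (S24)
    return applied
```

Each loop pass mirrors one trip through the flowchart: adjust the amount if the thumb moved, apply it, and exit once the touch hold is released.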
  • Compared with the mobile terminal 1 of the fourth embodiment, the mobile terminal 1 of the present embodiment allows the user to easily adjust the state change amount of the operation of the application 65 by moving the thumb 68b, which continues the touch hold operation, toward or away from the index finger 68a performing the hover slide operation, thereby improving operability for the user.
  • The mobile terminal of the present embodiment designates an item to be selected by a hover operation of the index finger 68a, and confirms the item designated by a touch operation of the thumb 68b as the selection target.
  • FIG. 13 is a block diagram illustrating a functional configuration of the mobile terminal 1A according to the sixth embodiment.
  • The same components as those of the mobile terminal 1 shown in FIG. 2 are denoted by the same reference numerals, their description is omitted, and only the differing contents are described.
  • a mobile terminal 1A shown in FIG. 13 includes a proximity detection unit 5, a touch detection unit 10, a screen display unit 30, a memory 40, a proximity coordinate extraction unit 51, a touch coordinate extraction unit 52, a state management unit 54, an image button management unit 55, An operation determination unit 56, an indicator state management unit 67, an application screen generation unit 59, and an application 65 are included.
  • The state management unit 54 determines whether or not the user's index finger 68a is in the proximity state through a hover operation and whether or not the user's thumb 68b is in the contact state through a touch operation. Further, it determines whether or not the proximity state and the contact state coexist, that is, whether the thumb 68b touches the screen while the index finger 68a is in the proximity state through the hover operation.
  • When the state management unit 54 determines that the user's index finger 68a is hovering, it outputs information indicating that the index finger 68a is hovering to the application 65 and the indicator state management unit 67.
  • The indicator state management unit 67 outputs to the application screen generation unit 59 an indicator generation instruction for displaying a pointer, as an indicator showing the user's selection-target candidate, at the position on the touch panel 15 (screen 45) vertically below the finger (for example, the index finger 68a) performing the hover operation.
  • the indicator generation instruction includes a position (x coordinate and y coordinate) on the touch panel 15 where the indicator is displayed, and the type of the indicator.
  • As the type of indicator included in the indicator generation instruction, the indicator state management unit 67 may use a cursor in addition to the pointer, or may use a focus that explicitly identifies the selection target when a plurality of items are displayed on the screen 45.
  • Based on the indicator generation instruction output from the indicator state management unit 67, the application screen generation unit 59 generates an application screen in which the indicator is displayed at the position on the touch panel 15 (screen 45) vertically below the finger (for example, the index finger 68a) performing the hover operation, and displays it on the screen display unit 30. Note that, when the shape of the pointer is stored in the image button database 55a as an indicator, for example, the application screen generation unit 59 acquires the indicator via the image button management unit 55.
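The indicator generation instruction described above carries the on-screen position vertically below the hovering finger (the proximity (x, y) with the height z dropped) together with the indicator type. A minimal sketch, with invented dictionary keys, could look like this:

```python
def indicator_instruction(proximity_xyz, kind="pointer"):
    """Build an indicator-generation instruction.

    The indicator is drawn at the screen position vertically below the
    hovering finger, i.e. the (x, y) of the proximity coordinates with
    the height z discarded. Dict keys are invented for illustration;
    'kind' may be 'pointer', 'cursor', or 'focus' as the text suggests.
    """
    x, y, _z = proximity_xyz
    return {"x": x, "y": y, "type": kind}
```

The screen generation step would then place the named indicator at the returned (x, y) when composing the application screen.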
  • Although the configuration of the control extension unit 64 is not illustrated for the application 65, the application 65 may or may not include the control extension unit 64.
  • FIG. 14 is a diagram illustrating a combination operation of a hover operation and a touch operation in the sixth embodiment.
  • FIG. 14A is a diagram illustrating a state where the hover operation is performed independently.
  • FIG. 14B is a diagram illustrating a state where a combination operation of a hover operation and a touch operation is performed.
  • A plurality of buttons 81 as selectable items are displayed on the screen 45 (touch panel 15) shown in FIG. 14A. When a specific button (for example, the letter "c") among the plurality of buttons 81 is designated by the hover operation of the index finger 68a, a pointer 85 capable of identifying the specific button is displayed around the specific button or over part or all of the specific button.
  • When the thumb 68b performs a touch operation on the screen 45, the specific button (for example, the letter "c") designated as the selection target and surrounded by the pointer 85 is selected.
  • Alternatively, the selection of a plurality of buttons 81 designated in advance may be confirmed together by the touch operation of the thumb 68b on the screen 45.
  • FIG. 15 is a flowchart illustrating an operation procedure of the mobile terminal 1A according to a combination operation of a hover operation and a touch operation according to the sixth embodiment.
  • The state management unit 54 determines, based on the operation determination result information from the operation determination unit 56, whether or not a hover operation is being performed with the user's index finger 68a (S31). When it is determined that the hover operation is being performed, the operation of the mobile terminal 1A proceeds to step S32.
  • When the state management unit 54 determines that the hover operation is being performed with the user's index finger 68a (S31, YES), it determines, based on the operation determination result information from the operation determination unit 56, whether or not a touch operation has been performed with the user's thumb 68b (S32).
  • When it is determined in step S32 that the touch operation has not been performed, the operation of the mobile terminal 1A returns to step S31 if this determination is the first one, and returns to step S32 otherwise.
  • Note that the number of times it has been determined that the touch operation was not performed is temporarily stored in the RAM 22, and based on this count the operation determination unit 56 decides whether to make the hover operation determination of step S31 or the touch operation determination of step S32.
  • When the state management unit 54 determines that the touch operation has been performed with the user's thumb 68b (S32, YES), it determines whether or not there is a button at the position (proximity position) on the touch panel 15 (screen 45) vertically below the proximity coordinates (x, y, z) corresponding to the hover operation determined in step S31 (S33).
  • Specifically, the state management unit 54 searches the image button database 55a via the image button management unit 55 and determines whether or not a button exists at the proximity position.
  • When it is determined that there is no button at the position (proximity position) on the touch panel 15 (screen 45) vertically below the proximity coordinates (x, y, z) corresponding to the hover operation determined in step S31 (S33, NO), the operation of the mobile terminal 1A proceeds to step S35.
  • When the state management unit 54 determines that there is a button at the position (proximity position) on the touch panel 15 (screen 45) vertically below the proximity coordinates (x, y, z) corresponding to the hover operation determined in step S31 (S33, YES), it outputs information indicating that the type (information) of the button at the position designated by the hover operation is held to the indicator state management unit 67 (S34). As an example of the operation of step S34, the state management unit 54 stores the type (information) of the button at the position designated by the hover operation in the RAM 22 or the memory 40, and further displays a pointer (indicator) so that the user can explicitly recognize that the button is designated as the selection target.
  • That is, the state management unit 54 outputs to the indicator state management unit 67 information indicating that the type of the button at the position designated by the hover operation is retained. Based on the information output from the state management unit 54, the indicator state management unit 67 outputs to the application screen generation unit 59 an indicator generation instruction for displaying a pointer, as an indicator showing the user's selection-target candidate, at the position on the touch panel 15 (screen 45) vertically below the finger (index finger 68a) performing the hover operation.
  • Based on the indicator generation instruction, the application screen generation unit 59 generates an application screen in which the indicator corresponding to the button is displayed at the position on the touch panel 15 (screen 45) vertically below the finger (index finger 68a) performing the hover operation, and displays it on the screen display unit 30.
  • In step S35, the state management unit 54 determines, based on the information of the proximity coordinates (x, y, z) output from the proximity coordinate extraction unit 51, whether or not the hover operation is being continued with the user's index finger 68a (S35). When it is determined that the hover operation is continuing with the user's index finger 68a (S35, YES), the operation of the mobile terminal 1A returns to step S32.
  • When the state management unit 54 determines that the hover operation is not being performed with the user's index finger 68a (S35, NO), it causes the operation corresponding to the operation performed after the hover operation to be executed (S36). For example, when a touch operation is performed after the hover operation, the operation corresponding to the touched position is executed. If it was determined in step S33 that there is a button at the proximity position, as an example of the operation in step S36, the state management unit 54 executes the operation corresponding to the type (information) or function of the button designated by the hover operation (S36).
  • As described above, when a plurality of items (buttons) are displayed within the screen 45, the mobile terminal 1A can designate the button to be selected by the hover operation of the index finger 68a and confirm the selection of the designated button by the touch operation of the thumb 68b. As a result, the mobile terminal 1A can correctly select even a small button displayed within the screen 45 and start the processing corresponding to the button.
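The designate-then-confirm flow above amounts to a hit test under the hovering finger followed by a confirmation on touch. A minimal sketch, with an invented rectangle representation for the buttons, could look like this:

```python
def select_button(buttons, hover_xy, touch_occurred):
    """Hover designates a candidate button; a touch confirms it.

    buttons: dict mapping label -> (x0, y0, x1, y1) screen rectangle
             (representation invented for illustration).
    hover_xy: screen position vertically below the hovering finger.
    touch_occurred: whether the thumb touched the screen.
    Returns the confirmed label, or None if nothing was confirmed.
    """
    candidate = None
    hx, hy = hover_xy
    for label, (x0, y0, x1, y1) in buttons.items():
        if x0 <= hx <= x1 and y0 <= hy <= y1:
            candidate = label        # button under the hovering finger (S33-S34)
            break
    # Only a touch operation confirms the designated candidate (S36).
    return candidate if touch_occurred else None
```

Separating designation (hover) from confirmation (touch) is what lets a small button be picked reliably: the user can reposition the hover until the pointer marks the intended button, and only then commit with the thumb.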
  • The mobile terminal 1A is not limited to designating a button by a hover operation and confirming its selection by a touch operation; by the combination operation of a hover operation and a touch operation, it may initialize the operation state of the application 65, or change the change amount or the change target as in the above-described embodiments.
  • The mobile terminal 1A may determine that the combination operation of the hover operation and the touch operation is performed only when the touch operation is performed within the touch acceptable range 45a illustrated in FIG. Thereby, the mobile terminal 1A can eliminate erroneous detections of the combination operation of the hover operation and the touch operation.
  • In each of the above embodiments, the movement (scrolling) of the screen 45 on which the map image 47 is displayed has been shown as an example of the operation responding to the hover slide operation, but a photograph or a handwritten image may be used instead of the map image 47.
  • Further, scrolling (selection) through a list of items of video data or audio data, adjustment of the volume (output audio), or playback, stop, fast-forward, and rewind of content may be used.
  • Examples of the change target changed by the combination operation of the hover slide operation and the touch slide operation include volume, luminance, saturation, and transparency; in the playback of content (for example, video data), double-speed playback is another example.
  • Further, by the combination operation of the hover slide operation and the touch hold operation, the intermediate state of the change in the operation of the application 65 according to the hover slide operation may be stored and later reproduced from the stored state. For example, like a "bookmark", the time (timing) at which the touch hold operation was performed during playback of video data or audio data may be stored by the combination operation of the hover slide operation and the touch hold operation, and the operation of the application 65 according to the hover slide operation may be resumed from that point.
  • Alternatively, the operation state of the application 65 according to the hover slide operation may be reset.
  • The present invention is useful as an input device, an input support method, and a program that, when an image is displayed on a screen, efficiently select the control content for the content according to the user's input operation on the touch panel and provide comfortable operability.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The content of an image (47) displayed on a screen (45) changes when a finger (68) held in proximity to the screen (45) performs a hover slide movement. The image (47) displayed on the screen (45) moves twice the distance traveled by the finger (68), in the same direction as the hover slide. If, however, a touch-slide operation is added to the hover operation, the thumb (68b) is made to touch the screen (45) while the index finger (68a) held in proximity to the screen performs the hover slide movement. The image (47) displayed on the screen (45) is then changed in the same direction as the slide direction of the index finger (68a) and by the same distance, that is, in the same proportion.
PCT/JP2013/001799 2012-04-27 2013-03-15 Dispositif d'entrée, programme et procédé d'aide à l'entrée Ceased WO2013161170A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-104123 2012-04-27
JP2012104123A JP2013232119A (ja) 2012-04-27 2012-04-27 入力装置、入力支援方法及びプログラム

Publications (1)

Publication Number Publication Date
WO2013161170A1 true WO2013161170A1 (fr) 2013-10-31

Family

ID=49482528

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/001799 Ceased WO2013161170A1 (fr) 2012-04-27 2013-03-15 Dispositif d'entrée, programme et procédé d'aide à l'entrée

Country Status (2)

Country Link
JP (1) JP2013232119A (fr)
WO (1) WO2013161170A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106134186A (zh) * 2014-02-26 2016-11-16 微软技术许可有限责任公司 遥现体验

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015099526A (ja) * 2013-11-20 2015-05-28 富士通株式会社 情報処理装置および情報処理プログラム
JP6410537B2 (ja) * 2014-09-16 2018-10-24 キヤノン株式会社 情報処理装置、その制御方法、プログラム、及び記憶媒体
KR102339839B1 (ko) 2014-12-26 2021-12-15 삼성전자주식회사 제스처 입력 처리 방법 및 장치
JP2017021449A (ja) * 2015-07-07 2017-01-26 富士通株式会社 情報処理装置、表示制御方法および表示制御プログラム
JP7142196B2 (ja) * 2016-12-27 2022-09-27 パナソニックIpマネジメント株式会社 電子機器、タブレット端末、入力制御方法、及びプログラム
JP2019144955A (ja) * 2018-02-22 2019-08-29 京セラ株式会社 電子機器、制御方法およびプログラム

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009525538A (ja) * 2006-01-30 2009-07-09 アップル インコーポレイテッド マルチポイント感知装置を用いたジェスチャリング
WO2011005977A2 (fr) * 2009-07-10 2011-01-13 Apple Inc. Détection par pression et par pointage
WO2011056387A1 (fr) * 2009-11-03 2011-05-12 Qualcomm Incorporated Méthodes d'exécution de gestes multipoints sur une surface tactile monopoint
JP2012133729A (ja) * 2010-12-24 2012-07-12 Sony Corp 情報処理装置、情報処理方法、およびプログラム


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106134186A (zh) * 2014-02-26 2016-11-16 微软技术许可有限责任公司 遥现体验
CN106134186B (zh) * 2014-02-26 2021-02-26 微软技术许可有限责任公司 遥现体验

Also Published As

Publication number Publication date
JP2013232119A (ja) 2013-11-14

Similar Documents

Publication Publication Date Title
US10216407B2 (en) Display control apparatus, display control method and display control program
JP5721662B2 (ja) 入力受付方法、入力受付プログラム、及び入力装置
US9772762B2 (en) Variable scale scrolling and resizing of displayed images based upon gesture speed
KR101224588B1 (ko) 멀티포인트 스트록을 감지하기 위한 ui 제공방법 및 이를적용한 멀티미디어 기기
TWI585673B (zh) 輸入裝置與使用者介面互動
US9035883B2 (en) Systems and methods for modifying virtual keyboards on a user interface
US9013422B2 (en) Device, method, and storage medium storing program
JP5828800B2 (ja) 表示装置、表示制御方法及びプログラム
US10073493B2 (en) Device and method for controlling a display panel
US20100138782A1 (en) Item and view specific options
EP2613247B1 (fr) Procédé et appareil d'affichage d'un clavier pour un terminal à écran tactile
KR102168648B1 (ko) 사용자 단말 장치 및 그 제어 방법
CN107077295A (zh) 一种快速分屏的方法、装置、电子设备、显示界面以及存储介质
US10579248B2 (en) Method and device for displaying image by using scroll bar
WO2013161170A1 (fr) Dispositif d'entrée, programme et procédé d'aide à l'entrée
US9298364B2 (en) Mobile electronic device, screen control method, and storage medium strong screen control program
TW200928916A (en) Method for operating software input panel
WO2010060502A1 (fr) Options spécifiques à un élément et à une vue
JP5854928B2 (ja) タッチ検出機能を有する電子機器、プログラムおよびタッチ検出機能を有する電子機器の制御方法
WO2014112029A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP2012174247A (ja) 携帯電子機器、接触操作制御方法および接触操作制御プログラム
KR101165388B1 (ko) 이종의 입력 장치를 이용하여 화면을 제어하는 방법 및 그 단말장치
JP5969320B2 (ja) 携帯端末装置
JP5955421B2 (ja) 入力装置、入力支援方法及びプログラム
JP2016099948A (ja) 電子機器

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13781732

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13781732

Country of ref document: EP

Kind code of ref document: A1