WO2012043079A1 - Information processing apparatus - Google Patents
Information processing apparatus
- Publication number
- WO2012043079A1 (PCT/JP2011/068355)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- information processing
- operation input
- detection
- processing apparatus
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1643—Details related to the display arrangement, including those related to the mounting of the display in the housing the display being associated to a digitizer, e.g. laptops that can be used as penpads
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/169—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Description
- The present invention relates to an information processing apparatus that receives operation inputs performed by a user, a method of controlling the apparatus, and an information storage medium storing a control program for it.
- An information processing apparatus equipped with a touch sensor accepts the position of an object on the detection surface, and the moving direction of the object, as an operation input by the user.
- Conventionally, the position and moving direction of the object are acquired with respect to the touch sensor itself: the position of the object is evaluated as an absolute position within the detection surface of the touch sensor, and the moving direction is evaluated as an inclination with respect to a reference axis set on the detection surface. For this reason, the user needs to perform the operation input while visually confirming the position and orientation of the detection surface, for example by holding the detection surface of the touch sensor in front of him or her.
- The present invention has been made in view of the above circumstances, and its object is to provide an information processing apparatus that allows a user to perform an operation input to a touch sensor at an arbitrary position and in an arbitrary orientation, regardless of the position and orientation of the touch sensor, together with a control method for the apparatus and an information storage medium storing the control program.
- An information processing apparatus according to the present invention includes: two touch sensors, each detecting the position of an object on a detection surface, arranged so as to face each other; reference determining means for determining at least one of a reference point and a reference axis based on one or more positions detected by one of the two touch sensors; and operation input specifying means for specifying the content of the user's operation input by evaluating what the other of the two touch sensors detects, based on at least one of the reference point and the reference axis determined by the reference determining means.
- In one aspect, the reference determining means determines the reference axis and its positive/negative direction based on three or more positions detected by the one touch sensor, and the operation input specifying means specifies, from what the other touch sensor detects, the direction indicated by the user as a direction relative to the positive/negative direction of the reference axis.
- The information processing apparatus may further include reference sensor determining means for determining either one of the two touch sensors as the reference sensor used to determine at least one of the reference point and the reference axis; the reference determining means then determines at least one of the reference point and the reference axis based on one or more positions detected by the reference sensor.
- The information processing apparatus may further include state transition means for deciding, based on a detection result of at least one of the two touch sensors, to shift to an operation input state that uses the reference sensor. In this operation input state, the reference determining means determines at least one of the reference point and the reference axis, and the operation input specifying means evaluates what the other touch sensor detects based on at least one of them.
- A control method according to the present invention controls an information processing apparatus connected to two touch sensors, each detecting the position of an object on a detection surface, arranged so as to face each other. The method includes a reference determining step of determining at least one of a reference point and a reference axis based on one or more positions detected by one of the two touch sensors, and an operation input specifying step of specifying the content of the user's operation input by evaluating what the other touch sensor detects based on at least one of the reference point and the reference axis determined in the reference determining step.
- A computer-readable information storage medium according to the present invention stores a program that causes a computer connected to two such facing touch sensors to function as the reference determining means and the operation input specifying means described above.
- FIGS. 1A and 1B are perspective views showing the appearance of an information processing apparatus 1 according to an embodiment of the present invention; FIG. 1A shows the apparatus viewed from the front side, and FIG. 1B shows it viewed from the back side.
- The information processing apparatus 1 according to the present embodiment is assumed to be a portable device such as a portable game machine.
- The housing 10 of the information processing apparatus 1 has a substantially rectangular flat plate shape as a whole. In the following, the horizontal direction (width direction) of the housing 10 is defined as the X-axis direction, the vertical direction (height direction) as the Y-axis direction, and the thickness direction (depth direction) as the Z-axis direction.
- A touch panel 12 is provided on the front surface of the housing 10.
- The touch panel 12 has a substantially rectangular shape and includes a display 14 and a front surface touch sensor 16.
- The display 14 may be any of various image display devices, such as a liquid crystal display panel or an organic EL display panel.
- The front surface touch sensor 16 is disposed so as to overlap the display 14 and has a substantially rectangular detection surface whose shape and size correspond to the display surface of the display 14. When an object such as the user's finger or a stylus touches the detection surface, the sensor detects the contact position of the object.
- The front surface touch sensor 16 does not necessarily detect the position of an object only when the object actually touches the detection surface; it may also detect the position of an object that comes within a detectable range above the detection surface.
- The front surface touch sensor 16 may be of any type capable of detecting the position of an object on the detection surface, such as a capacitive, pressure-sensitive, or optical type.
- In the present embodiment, the front surface touch sensor 16 is a multi-point detection touch sensor that can detect contact of objects at a plurality of locations.
- The front surface touch sensor 16 may also be a sensor that can detect the area of the part of the object in contact with the detection surface (the contact area) and the strength with which the object presses the detection surface (the pressure).
- Meanwhile, a back surface touch sensor 18 is disposed on the back surface side of the housing 10 so as to face the front surface touch sensor 16.
- The back surface touch sensor 18 has a substantially rectangular detection surface whose shape and size correspond to the detection surface of the front surface touch sensor 16 and, like the front surface touch sensor 16, detects the position of an object on its detection surface. That is, the display surface of the display 14, the detection surface of the front surface touch sensor 16, and the detection surface of the back surface touch sensor 18 all have substantially the same shape and size, are each parallel to the XY plane of the housing 10, and are lined up in a straight line along the thickness direction (Z-axis direction) of the housing 10.
- Like the front surface touch sensor 16, the back surface touch sensor 18 is a multi-point detection touch sensor capable of detecting contact of objects at a plurality of locations, and it too may be of various types. In the present embodiment, the front surface touch sensor 16 and the back surface touch sensor 18 are of substantially the same type and size, but as long as the two touch sensors are arranged to face each other, they need not be of the same type or the same size.
- The user performs an operation input to the information processing apparatus 1 by touching the detection surface of the front surface touch sensor 16 or the back surface touch sensor 18 with a finger, or by moving the finger while keeping it in contact with the detection surface.
- Hereinafter, the operation in which the user touches a point on the detection surface, moves the finger linearly in an arbitrary direction on the detection surface starting from that point, and then releases the finger is referred to as a slide operation. A slide operation is an operation by which the user indicates a direction to the information processing apparatus 1.
- Since both the front surface touch sensor 16 and the back surface touch sensor 18 are multi-point detection touch sensors, the user can perform a variety of operation inputs by bringing several fingers into contact with these touch sensors simultaneously.
- In addition to the front surface touch sensor 16 and the back surface touch sensor 18, the information processing apparatus 1 may be provided with various other operating members for accepting the user's operation inputs, such as buttons and switches, on the front surface, back surface, side surfaces, or elsewhere on the housing 10.
- FIG. 2 is a block diagram showing the internal configuration of the information processing apparatus 1.
- As shown in the figure, the information processing apparatus 1 includes a control unit 20, a storage unit 22, and an image processing unit 24.
- The control unit 20 includes, for example, a CPU, and executes various kinds of information processing according to programs stored in the storage unit 22. Specific examples of the processing executed by the control unit 20 will be described later.
- The storage unit 22 includes, for example, memory elements such as RAM and ROM and/or a disk device, and stores the programs executed by the control unit 20 and various data. It also functions as a work memory for the control unit 20.
- The image processing unit 24 includes, for example, a GPU and a frame buffer memory, and draws the images to be displayed on the display 14 according to instructions output by the control unit 20.
- Specifically, the image processing unit 24 has a frame buffer memory corresponding to the display area of the display 14; the GPU writes images into this frame buffer at predetermined intervals according to instructions from the control unit 20, and the written images are converted into a video signal at predetermined timing and displayed on the display 14.
- In the present embodiment, the information processing apparatus 1 uses one of the two touch sensors (the front surface touch sensor 16 and the back surface touch sensor 18) as a reference sensor and the other as an operation input sensor. Based on the object positions detected by the reference sensor, it determines at least one of a reference point and a reference axis for the operation input sensor, and evaluates the detection results of the operation input sensor using the determined reference point and/or reference axis.
- The user performs an operation input by bringing other fingers into contact with the detection surface of the operation input sensor while keeping one or more fingers in contact with the detection surface of the reference sensor.
- For example, the user may move the thumb of one hand on the detection surface of the operation input sensor while keeping the index, middle, or ring finger of the same hand in contact with the detection surface of the reference sensor, or may touch the detection surface of the operation input sensor with a finger of the other hand while touching the detection surface of the reference sensor with a finger of one hand.
- In this way, the user can perform an operation input to the operation input sensor in terms of a position and/or direction relative to the position of the fingers in contact with the reference sensor.
- Hereinafter, this operation input method using the reference sensor and the operation input sensor is referred to as a reference sensor utilization operation.
- To accept reference sensor utilization operations, the information processing apparatus 1 functionally includes a reference sensor determination unit 30, a reference point/reference axis determination unit 32, and an operation input specifying unit 34, as shown in FIG. 3. These functions are realized by the control unit 20 executing a program stored in the storage unit 22. The program may be provided stored in any of various computer-readable information storage media, such as an optical disc or a memory card, or may be provided to the information processing apparatus 1 via a communication network such as the Internet.
- The reference sensor determination unit 30 decides which of the two touch sensors (the front surface touch sensor 16 and the back surface touch sensor 18) to use as the reference sensor and which as the operation input sensor. As described above, during a reference sensor utilization operation the information processing apparatus 1 accepts the user's operation input to the operation input sensor while the reference sensor is detecting the touch of the user's fingers; that is, both the reference sensor and the operation input sensor detect finger contact. The reference sensor determination unit 30 can therefore decide which sensor to use as the reference sensor from the detection results of the front surface touch sensor 16 and the back surface touch sensor 18, for example as follows (see the sketch after this list).
- When one of the two touch sensors detects finger contact first, while neither has been detecting contact, the reference sensor determination unit 30 may determine that touch sensor to be the reference sensor.
- Alternatively, the touch sensor that detects finger contact at a predetermined number of positions or more may be used as the reference sensor.
- Or one of the touch sensors may be fixed in advance as the reference sensor; in this case, the reference sensor determination unit 30 may be omitted.
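A minimal, illustrative sketch of this selection logic (the patent provides no code; names such as `pick_reference_sensor`, `contacts_front`, and `contacts_back` are hypothetical):

```python
# Hypothetical sketch, not from the patent: choose which touch sensor acts
# as the reference sensor, per the example policies described above.

def pick_reference_sensor(contacts_front, contacts_back, min_contacts=3):
    """Return 'front' or 'back' for the reference sensor, or None if unclear."""
    # Policy 1: the sensor touched at a predetermined number of positions
    # or more becomes the reference sensor (tie-break order is arbitrary).
    if len(contacts_back) >= min_contacts:
        return "back"
    if len(contacts_front) >= min_contacts:
        return "front"
    # Policy 2: the sensor that detects contact while the other is idle.
    if contacts_back and not contacts_front:
        return "back"
    if contacts_front and not contacts_back:
        return "front"
    return None  # no unambiguous reference sensor yet

# Example: three fingers rest on the back sensor, the thumb on the front.
print(pick_reference_sensor([(10, 20)], [(5, 5), (25, 8), (45, 12)]))  # 'back'
```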
- The reference point/reference axis determination unit 32 determines at least one of the reference point RP and the reference axis RA used as the reference in a reference sensor utilization operation, based on one or more positions detected by the reference sensor. Whether it determines only the reference point RP, only the reference axis RA, or both may be decided by the program being executed by the information processing apparatus 1.
- When the reference sensor detects a single position, the reference point/reference axis determination unit 32 may use the position coordinates of that detection position within the detection surface as the position coordinates of the reference point RP within the detection surface of the operation input sensor.
- When the reference sensor detects a plurality of positions, the reference point/reference axis determination unit 32 may use a representative point of those detection positions (for example, their centroid) as the reference point RP, as in the sketch below.
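For illustration only, a minimal sketch of this reference-point rule (the function name is hypothetical):

```python
def reference_point(detections):
    """Reference point RP: the single detection itself, or the centroid of
    several detections, as described above."""
    n = len(detections)
    return (sum(x for x, _ in detections) / n,
            sum(y for _, y in detections) / n)

print(reference_point([(10.0, 20.0)]))                        # single touch
print(reference_point([(0.0, 0.0), (6.0, 0.0), (3.0, 6.0)]))  # centroid (3.0, 2.0)
```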
- To determine the reference axis RA, the reference sensor needs to detect a plurality of positions.
- In that case, the reference point/reference axis determination unit 32 determines the reference axis RA based on the plurality of detection positions. Specifically, when the reference sensor detects two positions, the reference point/reference axis determination unit 32 determines the straight line connecting those two positions as the reference axis RA. When the reference sensor detects three or more positions, it may, for example, identify the minimum-area rectangle containing all of them and take the direction parallel to its long side as the reference axis RA, or identify the minimum-area ellipse containing all of them and take its major axis as the reference axis RA.
- Alternatively, the reference point/reference axis determination unit 32 may identify the two detection positions farthest from each other among the three or more detection positions and take the straight line connecting them as the reference axis RA, or it may calculate a straight line approximating the three or more detection positions and take that line as the reference axis RA. Two of these options are sketched below.
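A sketch of two of the axis-determination options above, the farthest pair and a least-squares line fit; this is illustrative code under assumed conventions (the axis is returned as an inclination in radians), not the patent's implementation:

```python
import math
from itertools import combinations

def axis_from_farthest_pair(points):
    """Reference axis RA as the line through the two most distant detections."""
    (x1, y1), (x2, y2) = max(combinations(points, 2),
                             key=lambda pair: math.dist(pair[0], pair[1]))
    return math.atan2(y2 - y1, x2 - x1)  # inclination of RA, in radians

def axis_from_least_squares(points):
    """Reference axis RA as a straight line fitted to 3+ detections."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    return math.atan2(sxy, sxx)  # direction of the fitted line

fingertips = [(10.0, 40.0), (30.0, 52.0), (50.0, 41.0)]  # e.g. three fingers
print(axis_from_farthest_pair(fingertips))
print(axis_from_least_squares(fingertips))
```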
- When the reference sensor detects three or more positions, the reference point/reference axis determination unit 32 may also determine the orientation of the reference axis RA (that is, which direction along the reference axis RA is the positive direction) based on those detection positions. This will be described later.
- The operation input specifying unit 34 specifies the content of the user's operation input by evaluating what the operation input sensor detects based on the reference point RP and/or the reference axis RA determined by the reference point/reference axis determination unit 32. Specifically, when the reference point/reference axis determination unit 32 has determined the reference point RP, the operation input specifying unit 34 acquires coordinate values indicating the detection position of the operation input sensor as a position relative to the reference point RP, as the value indicating the content of the operation input performed by the user.
- When the reference axis RA has been determined and the user moves a finger on the operation input sensor, the operation input specifying unit 34 specifies the direction indicated by the user by evaluating the direction of the movement as a direction relative to the reference axis RA.
- As a first example, the information processing apparatus 1 executes a game program and accepts reference sensor utilization operations as described below.
- In this example, the back surface touch sensor 18 functions as the reference sensor and the front surface touch sensor 16 functions as the operation input sensor.
- The user holds the housing 10 of the information processing apparatus 1 with one hand and keeps one finger (for example, the index finger) in contact with an arbitrary position on the back surface touch sensor 18.
- The reference point/reference axis determination unit 32 then determines, as the reference point RP, the position on the detection surface of the front surface touch sensor 16 that faces the position of the user's index finger detected by the back surface touch sensor 18.
- FIG. 4 is an explanatory diagram of this first example of the reference sensor utilization operation; it shows the user performing an operation input by moving the thumb within the input range IR while designating the position of the reference point RP with the index finger.
- The operation input specifying unit 34 uses the position coordinates of the reference point RP to calculate the coordinates of the detection position of the front surface touch sensor 16 (hereinafter, the input position IP) relative to the reference point RP. That is, if the absolute position coordinates of the reference point RP are (xr, yr) and those of the input position IP are (xi, yi), the relative coordinates are (xi - xr, yi - yr). The operation input specifying unit 34 outputs this coordinate value to the game program as a value indicating the content of the user's operation input.
- Alternatively, the operation input specifying unit 34 may calculate, from the position coordinates of the reference point RP and the input position IP, the distance between the two points and the angle indicating the direction of the input position IP viewed from the reference point RP, and output these distance and angle values as values indicating the content of the user's operation input, as in the sketch below.
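For illustration, the computations just described in a short sketch (variable and function names are assumed, not from the patent):

```python
import math

def evaluate_input(reference_point, input_position):
    """Evaluate a detection of the operation input sensor relative to RP."""
    xr, yr = reference_point       # absolute coordinates of RP
    xi, yi = input_position        # absolute coordinates of IP
    relative = (xi - xr, yi - yr)  # (xi - xr, yi - yr), as in the text
    distance = math.hypot(*relative)
    angle = math.degrees(math.atan2(relative[1], relative[0]))
    return relative, distance, angle

# The index finger on the back sensor fixes RP; the thumb touches IP in front.
print(evaluate_input((100.0, 200.0), (130.0, 160.0)))
# -> ((30.0, -40.0), 50.0, -53.13...)
```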
- In this way, by moving a finger on the detection surface of the front surface touch sensor 16 around the reference point RP, much as if tilting an analog stick, the user can realize an operation input that freely indicates any direction through 360 degrees.
- Moreover, since the reference point RP is the position facing wherever the user touches the detection surface of the back surface touch sensor 18, the user can freely decide where within the detection surface of the front surface touch sensor 16 the reference point RP is set. That is, the user can set the reference point RP at a position where operation input is easy, according to how the housing 10 is most comfortably held or what the display 14 is showing.
- In the example above, the direction of the input position IP viewed from the reference point RP is evaluated using the X axis and Y axis preset for the housing 10 of the information processing apparatus 1 as the reference axes.
- However, when the user touches the detection surface of the back surface touch sensor 18 with a plurality of fingers, the reference point/reference axis determination unit 32 may determine not only the reference point RP but also the reference axis RA based on the positions of those fingers, and the operation input specifying unit 34 may then evaluate the content of the user's operation input based on both the reference point RP and the reference axis RA.
- In this case, the operation input specifying unit 34 may calculate, as the value indicating the content of the user's operation input, the relative position coordinates of the point obtained by rotating the input position IP around the reference point RP in the reverse direction by an angle corresponding to the inclination of the reference axis RA with respect to the X axis (or Y axis), as sketched below.
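A sketch of this reverse rotation (names are illustrative; `axis_angle` stands for the inclination of the reference axis RA with respect to the X axis, in radians):

```python
import math

def relative_to_reference_frame(input_position, reference_point, axis_angle):
    """Rotate IP around RP by -axis_angle, expressing the input in a frame
    whose X axis coincides with the reference axis RA."""
    xr, yr = reference_point
    dx, dy = input_position[0] - xr, input_position[1] - yr
    c, s = math.cos(-axis_angle), math.sin(-axis_angle)
    return (dx * c - dy * s, dx * s + dy * c)

# With RA tilted 90 degrees, a point directly "above" RP lands on RA's +X side.
print(relative_to_reference_frame((0.0, 10.0), (0.0, 0.0), math.pi / 2))
# -> (10.0, ~0.0)
```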
- So that the user can easily grasp where within the detection surface of the operation input sensor the reference point RP and the input range IR are set while a reference sensor utilization operation is being performed, the information processing apparatus 1 may display on the display 14, at the corresponding positions, an image such as a marker indicating the reference point RP or the input range IR.
- As a second example, a method is described by which the user can perform an operation input indicating a direction without looking directly at the housing 10 of the information processing apparatus 1.
- Specifically, the acceptance of reference sensor utilization operations is described below for the case where the information processing apparatus 1 executes an audio reproduction program that plays back audio data such as music, functioning as a music player.
- In this example, the user can perform an operation input indicating a desired direction even with the information processing apparatus 1 kept in a clothes pocket, bag, or the like, where the user cannot tell which of the front and back surfaces of the housing 10 faces which way, or which way the rightward or upward direction of the detection surface (that is, the X-axis positive direction or the Y-axis positive direction) points.
- In this case, either the front surface touch sensor 16 or the back surface touch sensor 18 can function as the reference sensor, depending on how the user holds the housing 10. Accordingly, when the user's fingers touch either touch sensor at a predetermined number of positions (for example, three) or more during audio reproduction, the reference sensor determination unit 30 determines that touch sensor to be the reference sensor. As a result, the user can perform an operation input using whichever touch sensor serves as the reference sensor, even when it cannot be determined which side of the information processing apparatus 1 is the front and which is the back.
- The reference point/reference axis determination unit 32 subsequently determines the reference axis RA based on the detection positions of the reference sensor. The inclination of the reference axis RA is calculated from the position coordinates of the three or more detection positions, as already described.
- Furthermore, in this example the reference point/reference axis determination unit 32 determines, based on the detection positions of the reference sensor, not only the inclination of the reference axis RA with respect to the detection surface but also which side along the reference axis RA is the positive direction and which is the negative direction.
- FIGS. 5A and 5B illustrate an example of a method of determining the positive/negative direction of the reference axis RA; they show the relationship between the detection result of the reference sensor (here assumed to be the back surface touch sensor 18) and the reference axis RA.
- In these figures, the user touches the reference sensor with three fingers of one hand (the index, middle, and ring fingers), and the contact positions of these fingers are detected within the detection surface of the reference sensor. The reference point/reference axis determination unit 32 determines the positive direction of the reference axis RA according to which side the middle detection position (detection position DP2 in FIGS. 5A and 5B) lies on relative to the straight line connecting the detection positions at both ends (detection positions DP1 and DP3). More specifically, in the examples of FIGS. 5A and 5B, the direction to the right when the middle detection position DP2 is viewed along the direction in which it protrudes from the line through DP1 and DP3 is taken as the positive direction of the reference axis RA; in the figures, this positive direction is indicated by an arrow.
- In other words, the positive and negative directions of the reference axis RA are determined according to which side the parabola passing through the detection positions is convex toward, as viewed from the reference axis RA; a sketch follows below. Normally, when the user brings a plurality of fingers of one hand into contact with the reference sensor, the contact positions can be assumed to line up along a parabola convex toward the side opposite the palm.
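For illustration, this side test can be written with the sign of a cross product. The mapping of the sign to a physical direction is an assumed convention here (the patent fixes it through FIGS. 5A/5B, and it flips in mirrored coordinate systems such as screen coordinates with Y pointing down):

```python
def axis_positive_direction(dp1, dp2, dp3):
    """Direction vector along the reference axis RA, signed according to
    which side of the line DP1-DP3 the middle contact DP2 lies on.

    Assumed convention: swap the comparison below if your coordinate
    system is mirrored relative to the figures.
    """
    ax, ay = dp3[0] - dp1[0], dp3[1] - dp1[1]   # line through the end contacts
    bx, by = dp2[0] - dp1[0], dp2[1] - dp1[1]   # toward the middle contact
    cross = ax * by - ay * bx                   # sign tells DP2's side
    return (ax, ay) if cross > 0 else (-ax, -ay)

# Index/middle/ring fingertips, the middle one bulging away from the palm:
print(axis_positive_direction((0, 0), (5, 3), (10, 0)))  # -> (10, 0)
```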
- In this state, the operation input specifying unit 34 accepts the user's operation input to the operation input sensor and evaluates its content based on the inclination and the positive/negative direction of the reference axis RA.
- Specifically, when the user performs a slide operation in an arbitrary direction on the operation input sensor, the operation input specifying unit 34 determines whether the direction of the slide operation is closer to the positive direction or to the negative direction of the reference axis RA, and outputs the determination result to the audio reproduction program (a sketch follows below).
- According to whether the direction of the user's slide operation corresponds to the positive or the negative direction of the reference axis RA, the audio reproduction program executes, for example, a process of raising or lowering the reproduction volume, or a process of switching the track being reproduced.
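For illustration, the classification can be done with a dot product against the signed axis direction; the dispatch below uses hypothetical handlers ("volume up"/"volume down" stand in for whatever processing the audio reproduction program performs):

```python
def classify_slide(slide_start, slide_end, axis_direction):
    """+1 if the slide runs along RA's positive direction, -1 otherwise."""
    sx, sy = slide_end[0] - slide_start[0], slide_end[1] - slide_start[1]
    dot = sx * axis_direction[0] + sy * axis_direction[1]
    return 1 if dot >= 0 else -1

def on_slide(start, end, axis_direction):
    # Hypothetical dispatch: positive slides raise the volume, negative lower it.
    if classify_slide(start, end, axis_direction) > 0:
        print("volume up")
    else:
        print("volume down")

on_slide((0, 0), (8, 1), (10, 0))   # roughly along +RA -> volume up
on_slide((0, 0), (-7, 2), (10, 0))  # roughly along -RA -> volume down
```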
- In this way, regardless of the orientation of the housing 10, the user can perform an operation input indicating a direction to the operation input sensor simply by bringing fingers into contact with the reference sensor so that the line along which they rest becomes the reference axis RA. Therefore, for example, the user can perform an operation input indicating a direction without looking directly at the information processing apparatus 1, with the apparatus left in a pocket or the like; and even when operating the apparatus while looking at it, the user can indicate a direction while holding it as it is, without bothering to adjust its orientation.
- In the above description, the direction of a slide operation is evaluated as either the positive or the negative direction of the reference axis RA, but the present invention is not limited to this; slide operations along the direction orthogonal to the reference axis RA may also be accepted. In this way, the user can perform instruction operations in four directions (up, down, left, and right), for example with the positive direction of the reference axis RA treated as the left direction.
- In the above description, the positive/negative direction of the reference axis RA is determined from detection results at three or more positions on the reference sensor; however, the reference point/reference axis determination unit 32 need not always determine the positive/negative direction of the reference axis RA.
- Even in that case, the user can perform operation inputs to the information processing apparatus 1 by using two kinds of slide operations: those in a direction parallel to the reference axis RA and those in a direction orthogonal to it.
- Alternatively, the reference point/reference axis determination unit 32 may determine the orientation of the reference axis RA by a method such as taking as positive the direction closer to the X-axis positive direction of the housing 10. In this way, the positive direction of the reference axis RA can be determined even when the reference sensor detects only two positions.
- The reference point/reference axis determination unit 32 may also determine the positive/negative direction of the reference axis RA using a posture detection sensor built into the housing 10. For example, if a sensor capable of detecting the direction of gravitational acceleration, such as a three-axis acceleration sensor, is built into the housing 10, the reference point/reference axis determination unit 32 can determine the orientation of the housing 10 with respect to the vertical direction at the time of the reference sensor utilization operation, and may therefore determine the positive/negative direction of the reference axis RA according to the relationship between the orientation of the housing 10 and the vertical direction.
- Specifically, if the vertical direction is closer to the Y-axis negative direction of the housing 10, the reference point/reference axis determination unit 32 may take as positive the direction of the reference axis RA closer to the X-axis positive direction, and if the vertical direction is closer to the Y-axis positive direction, it may take as positive the direction closer to the X-axis negative direction (a sketch follows below). In this way, the user can indicate the left and right directions with the direction closer to the ground treated as down. Alternatively, the reference point/reference axis determination unit 32 may take the direction of the reference axis RA closer to the vertical direction as the negative direction; then, regardless of the orientation of the housing 10, the user can indicate the up and down directions with the direction closer to the ground treated as down.
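A sketch of this gravity-based rule, assuming the accelerometer's gravity reading has been projected onto the housing's XY plane as `(gx, gy)` (the names and the exact sign convention are assumptions reconstructed from the rule above, not the patent's code):

```python
def sign_axis_by_gravity(axis_direction, gravity_xy):
    """Orient the reference axis RA consistently with where 'down' points.

    Assumed rule from the text: if gravity points nearer the housing's -Y
    direction, the positive end of RA is the one nearer +X, and vice versa.
    """
    ax, ay = axis_direction
    gx, gy = gravity_xy
    want_positive_x = gy < 0   # gravity nearer -Y: positive end of RA toward +X
    if (ax > 0) != want_positive_x:
        ax, ay = -ax, -ay
    return (ax, ay)

# Housing held so that gravity points toward its Y-axis negative direction:
print(sign_axis_by_gravity((-10, 1), (0.0, -9.8)))  # flipped to (10, -1)
```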
- The information processing apparatus 1 may be able to accept not only reference sensor utilization operations using the reference sensor and the operation input sensor as described above, but also ordinary operation inputs using absolute coordinates.
- Hereinafter, a state in which ordinary operation inputs using absolute coordinates are accepted is referred to as the normal input state, and a state in which reference sensor utilization operations are accepted (that is, in which the detection results of the operation input sensor are evaluated not as absolute coordinates but as values relative to the reference point RP or the reference axis RA) is referred to as the reference input state.
- As in the second example, when the information processing apparatus 1 is to be used while kept in a pocket or the like, it may have a function of shifting to a state that restricts ordinary operation inputs (such as a key-locked state) so that erroneous operations do not occur while it is being carried.
- For example, the information processing apparatus 1 has a switch on the surface of the housing 10 for shifting to such an operation-input-restricted state, and the user operates this switch to put the apparatus into that state. The information processing apparatus 1 may then accept only reference sensor utilization operations even in the operation-input-restricted state; in this case, it shifts to the reference input state when it enters the operation-input-restricted state, and returns to the normal input state when the restriction is released.
- The information processing apparatus 1 may also decide whether to shift to the reference input state using a brightness sensor capable of detecting the brightness of the surrounding environment.
- For example, the information processing apparatus 1 has a brightness sensor on the surface of the housing 10 and periodically determines whether its detection result falls below a predetermined threshold. When the result is determined to be below the threshold, the apparatus shifts to the reference input state; when the detection result of the brightness sensor is at or above the threshold, it returns to the normal input state.
- The information processing apparatus 1 may also shift to the reference input state based on the detection result of at least one of the front surface touch sensor 16 and the back surface touch sensor 18. For example, since the user needs to bring fingers into contact with both touch sensors to perform a reference sensor utilization operation, the apparatus may shift to the reference input state when the front surface touch sensor 16 and the back surface touch sensor 18 both detect contact of the user's fingers at the same time. Further, as in the second example, when the reference point/reference axis determination unit 32 determines the reference axis RA and its positive/negative direction, one of the touch sensors needs to detect contact at three or more positions, so the apparatus may shift to the reference input state when either touch sensor detects three or more positions. The apparatus may also shift to the reference input state when several such conditions are satisfied in combination, as in the sketch below.
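A sketch combining the example conditions above into a single transition test (all names and this particular combination are hypothetical):

```python
def should_enter_reference_input_state(brightness, front_contacts, back_contacts,
                                       brightness_threshold=10.0,
                                       min_axis_contacts=3):
    """True when the example conditions above suggest switching from the
    normal input state to the reference input state."""
    dark = brightness < brightness_threshold          # e.g. inside a pocket/bag
    both_touched = bool(front_contacts) and bool(back_contacts)
    axis_ready = (len(front_contacts) >= min_axis_contacts or
                  len(back_contacts) >= min_axis_contacts)
    return dark and both_touched and axis_ready

# Dark environment, three fingers on the back sensor, the thumb on the front:
print(should_enter_reference_input_state(
    2.5, [(60, 80)], [(10, 10), (30, 14), (50, 9)]))  # -> True
```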
- When the information processing apparatus 1 shifts to the reference input state according to the detection result of the brightness sensor or the touch sensors, it may notify the user that the conditions have been satisfied and that the state has changed, in order to prevent erroneous operation: for example, by outputting a sound from a speaker, operating a built-in motor to vibrate the housing 10, or displaying a predetermined image (such as an icon) on the display 14.
- According to the embodiment of the present invention described above, the user designates the reference point RP and/or the reference axis RA by touching the detection surface of the reference sensor with fingers or the like, and can thereby perform an operation input at a desired position and in a desired orientation on the detection surface of the operation input sensor, regardless of the position and orientation of the housing 10.
- The embodiments of the present invention are not limited to those described above.
- For example, the methods of determining the reference point RP and the reference axis RA described above are merely examples, and the information processing apparatus 1 may determine the reference point RP and the reference axis RA from the detection results of the reference sensor by other methods.
- The information processing apparatus 1 may also realize various kinds of information processing by receiving from the user inputs of positions and directions relative to the reference point RP and the reference axis RA.
- In the above description, the front surface touch sensor 16 and the back surface touch sensor 18 are provided in the housing 10 of the information processing apparatus 1, but an operation device including these two touch sensors may instead be prepared separately from the information processing apparatus 1.
- For example, the information processing apparatus 1 may be a consumer game machine, a personal computer, or the like, to which an operation device including two touch sensors facing each other is connected by wired or wireless communication. In this case, the operation device may transmit the detection results of its touch sensors to the information processing apparatus 1, and the information processing apparatus 1 may execute the same processing as described above using those detection results.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Claims (6)
1. An information processing apparatus comprising: two touch sensors, each detecting the position of an object on a detection surface, arranged so as to face each other; reference determining means for determining at least one of a reference point and a reference axis based on one or more detection positions by one of the two touch sensors; and operation input specifying means for specifying the content of a user's operation input by evaluating the detection content of the other of the two touch sensors based on at least one of the reference point and the reference axis determined by the reference determining means.
2. The information processing apparatus according to claim 1, wherein the reference determining means determines the reference axis and its positive/negative direction based on three or more detection positions by the one touch sensor, and the operation input specifying means specifies, from the detection content of the other touch sensor, the direction indicated by the user as a direction relative to the positive/negative direction of the reference axis.
3. The information processing apparatus according to claim 1, further comprising reference sensor determining means for determining either one of the two touch sensors as a reference sensor used for determining at least one of the reference point and the reference axis, wherein the reference determining means determines at least one of the reference point and the reference axis based on one or more detection positions by the reference sensor.
4. The information processing apparatus according to claim 1, further comprising state transition means for deciding, based on a detection result of at least one of the two touch sensors, to shift to an operation input state using a reference sensor, wherein the reference determining means determines at least one of the reference point and the reference axis in the operation input state using the reference sensor, and the operation input specifying means evaluates, in the operation input state using the reference sensor, the detection content of the other touch sensor based on at least one of the reference point and the reference axis.
5. A control method for an information processing apparatus connected to two touch sensors, each detecting the position of an object on a detection surface, arranged so as to face each other, the method comprising: a reference determining step of determining at least one of a reference point and a reference axis based on one or more detection positions by one of the two touch sensors; and an operation input specifying step of specifying the content of a user's operation input by evaluating the detection content of the other of the two touch sensors based on at least one of the reference point and the reference axis determined in the reference determining step.
6. A computer-readable information storage medium storing a program for causing a computer connected to two touch sensors, each detecting the position of an object on a detection surface, arranged so as to face each other, to function as: reference determining means for determining at least one of a reference point and a reference axis based on one or more detection positions by one of the two touch sensors; and operation input specifying means for specifying the content of a user's operation input by evaluating the detection content of the other of the two touch sensors based on at least one of the reference point and the reference axis determined by the reference determining means.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201180046427.XA CN103124951B (zh) | 2010-09-27 | 2011-08-11 | 信息处理装置 |
| US13/825,242 US9128550B2 (en) | 2010-09-27 | 2011-08-11 | Information processing device |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2010215845A JP5529700B2 (ja) | 2010-09-27 | 2010-09-27 | 情報処理装置、その制御方法、及びプログラム |
| JP2010-215845 | 2010-09-27 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2012043079A1 true WO2012043079A1 (ja) | 2012-04-05 |
Family ID: 45892554
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2011/068355 Ceased WO2012043079A1 (ja) | 2010-09-27 | 2011-08-11 | 情報処理装置 |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US9128550B2 (ja) |
| JP (1) | JP5529700B2 (ja) |
| CN (1) | CN103124951B (ja) |
| WO (1) | WO2012043079A1 (ja) |
Families Citing this family (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140153790A1 (en) * | 2009-10-06 | 2014-06-05 | Cherif Atia Algreatly | Biometrics Touchscreen |
| JP2013089202A (ja) * | 2011-10-21 | 2013-05-13 | Sony Computer Entertainment Inc | 入力制御装置、入力制御方法、及び入力制御プログラム |
| US9030407B2 (en) * | 2011-12-21 | 2015-05-12 | Nokia Technologies Oy | User gesture recognition |
| CN102799310B (zh) * | 2012-07-02 | 2015-07-08 | 华为终端有限公司 | 触摸屏的解锁方法及装置 |
| JP6025528B2 (ja) | 2012-11-29 | 2016-11-16 | 三菱電機株式会社 | タッチパネル装置 |
| JP5958319B2 (ja) * | 2012-12-13 | 2016-07-27 | 富士通株式会社 | 情報処理装置、プログラム、及び方法 |
| TWI474266B (zh) * | 2012-12-20 | 2015-02-21 | Inst Information Industry | 觸控方法及手持裝置 |
| US20140210746A1 (en) * | 2013-01-25 | 2014-07-31 | Seung II KIM | Display device and method for adjusting display orientation using the same |
| US10089786B2 (en) * | 2013-08-19 | 2018-10-02 | Qualcomm Incorporated | Automatic customization of graphical user interface for optical see-through head mounted display with user interaction tracking |
| KR102165445B1 (ko) | 2013-09-30 | 2020-10-14 | 엘지전자 주식회사 | 디지털 디바이스 및 그 제어 방법 |
| JP2016024580A (ja) * | 2014-07-18 | 2016-02-08 | 富士通株式会社 | 情報処理装置、入力制御方法、および入力制御プログラム |
| JP6255321B2 (ja) * | 2014-08-20 | 2017-12-27 | アルプス電気株式会社 | 情報処理装置とその指先操作識別方法並びにプログラム |
| USD831111S1 (en) | 2016-03-02 | 2018-10-16 | ACCO Brands Corporation | Dry erase board |
| KR102363707B1 (ko) * | 2017-08-03 | 2022-02-17 | 삼성전자주식회사 | 압력 센서를 포함하는 전자 장치 및 전자 장치 제어 방법 |
| CN108646932B (zh) * | 2018-04-20 | 2021-11-26 | 歌尔股份有限公司 | 一种用于电子设备的振动检测方法、装置及电子设备 |
| US11822743B2 (en) * | 2019-06-12 | 2023-11-21 | Nippon Telegraph And Telephone Corporation | Touch panel information terminal apparatus and information input processing method implemented with dual input devices arranged on two surfaces |
| USD1013779S1 (en) | 2020-08-19 | 2024-02-06 | ACCO Brands Corporation | Office panel with dry erase surface |
| USD1028083S1 (en) * | 2020-08-21 | 2024-05-21 | Dennis McGill, III | Message board set |
| JP1705696S (ja) * | 2021-01-29 | 2022-01-21 | 文具ケース |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2007141029A (ja) * | 2005-11-21 | 2007-06-07 | Matsushita Electric Ind Co Ltd | 携帯情報装置 |
| WO2009031214A1 (ja) * | 2007-09-05 | 2009-03-12 | Panasonic Corporation | 携帯端末装置、及び表示制御方法 |
| JP2009187290A (ja) * | 2008-02-06 | 2009-08-20 | Yamaha Corp | タッチパネル付制御装置およびプログラム |
Family Cites Families (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7800592B2 (en) * | 2005-03-04 | 2010-09-21 | Apple Inc. | Hand held electronic device with multiple touch sensing devices |
| US7289083B1 (en) * | 2000-11-30 | 2007-10-30 | Palm, Inc. | Multi-sided display for portable computer |
| JP3847641B2 (ja) | 2002-02-28 | 2006-11-22 | 株式会社ソニー・コンピュータエンタテインメント | 情報処理装置、情報処理プログラム、情報処理プログラムを記録したコンピュータ読み取り可能な記録媒体、及び情報処理方法 |
| JP2003296022A (ja) * | 2002-04-01 | 2003-10-17 | Pioneer Electronic Corp | タッチパネル一体型表示装置 |
| JP3852368B2 (ja) * | 2002-05-16 | 2006-11-29 | ソニー株式会社 | 入力方法及びデータ処理装置 |
| US20070188450A1 (en) * | 2006-02-14 | 2007-08-16 | International Business Machines Corporation | Method and system for a reversible display interface mechanism |
| US8674948B2 (en) * | 2007-01-31 | 2014-03-18 | Perceptive Pixel, Inc. | Methods of interfacing with multi-point input devices and multi-point input systems employing interfacing techniques |
| US7936341B2 (en) * | 2007-05-30 | 2011-05-03 | Microsoft Corporation | Recognizing selection regions from multiple simultaneous inputs |
| KR200450989Y1 (ko) * | 2008-07-25 | 2010-11-16 | 이노디지털 주식회사 | 양면 터치스크린을 구비한 플랫 패널 형상의 모바일 장치 |
| KR101021857B1 (ko) | 2008-12-30 | 2011-03-17 | 삼성전자주식회사 | 듀얼 터치 센서를 이용하여 제어 신호를 입력하는 장치 및 방법 |
| US8456466B1 (en) * | 2009-04-01 | 2013-06-04 | Perceptive Pixel Inc. | Resolving ambiguous rotations in 3D manipulation |
| US8493364B2 (en) * | 2009-04-30 | 2013-07-23 | Motorola Mobility Llc | Dual sided transparent display module and portable electronic device incorporating the same |
| US20110090155A1 (en) * | 2009-10-15 | 2011-04-21 | Qualcomm Incorporated | Method, system, and computer program product combining gestural input from multiple touch screens into one gestural input |
| US8232990B2 (en) * | 2010-01-05 | 2012-07-31 | Apple Inc. | Working with 3D objects |
| JP5379176B2 (ja) * | 2011-01-25 | 2013-12-25 | 株式会社ソニー・コンピュータエンタテインメント | 携帯型電子機器 |
| JP5470350B2 (ja) * | 2011-10-21 | 2014-04-16 | 株式会社ソニー・コンピュータエンタテインメント | 入力制御装置、入力制御方法、及び入力制御プログラム |
| JP2013089202A (ja) * | 2011-10-21 | 2013-05-13 | Sony Computer Entertainment Inc | 入力制御装置、入力制御方法、及び入力制御プログラム |
| JP5414764B2 (ja) * | 2011-10-21 | 2014-02-12 | 株式会社ソニー・コンピュータエンタテインメント | 入力制御装置、入力制御方法、及び入力制御プログラム |
| JP5460679B2 (ja) * | 2011-11-28 | 2014-04-02 | ソニー株式会社 | 情報処理装置、情報処理方法、およびコンテンツファイルのデータ構造 |
Application timeline:
- 2010-09-27: JP application JP2010215845A filed (patent JP5529700B2, active)
- 2011-08-11: US application US13/825,242 filed (patent US9128550B2, active)
- 2011-08-11: CN application CN201180046427.XA filed (patent CN103124951B, active)
- 2011-08-11: PCT application PCT/JP2011/068355 filed (WO2012043079A1, ceased)
Also Published As
| Publication number | Publication date |
|---|---|
| JP5529700B2 (ja) | 2014-06-25 |
| CN103124951B (zh) | 2016-01-20 |
| CN103124951A (zh) | 2013-05-29 |
| US20130181930A1 (en) | 2013-07-18 |
| US9128550B2 (en) | 2015-09-08 |
| JP2012073662A (ja) | 2012-04-12 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5529700B2 (ja) | 情報処理装置、その制御方法、及びプログラム | |
| US9423876B2 (en) | Omni-spatial gesture input | |
| JP5205157B2 (ja) | 携帯型画像表示装置、その制御方法、プログラム及び情報記憶媒体 | |
| JP5983503B2 (ja) | 情報処理装置及びプログラム | |
| EP2431853A2 (en) | Character input device | |
| JP6157885B2 (ja) | 携帯端末装置の表示制御方法 | |
| WO2010007813A1 (ja) | 携帯型画像表示装置、その制御方法及び情報記憶媒体 | |
| KR20130142824A (ko) | 원격 제어 장치 및 그 제어 방법 | |
| CN109558061B (zh) | 一种操作控制方法及终端 | |
| CN102004601A (zh) | 信息处理装置、信息处理方法和计算机程序 | |
| JP2014170568A (ja) | グラフィカルユーザインターフェースとの物理的相互作用を解釈するためのシステムと方法 | |
| CN105339863A (zh) | 便携式装置及其控制方法 | |
| JP2012008666A (ja) | 情報処理装置および操作入力方法 | |
| US20130088437A1 (en) | Terminal device | |
| WO2013107382A1 (zh) | 一种电子装置 | |
| JP6183820B2 (ja) | 端末、及び端末制御方法 | |
| KR101432483B1 (ko) | 제어영역을 이용한 터치스크린 제어방법 및 이를 이용한 단말 | |
| KR20140033726A (ko) | 터치 스크린을 포함한 전자 장치에서 다섯 손가락을 구별하기 위한 방법 및 장치 | |
| JP5827695B2 (ja) | 情報処理装置、情報処理方法、プログラム及び情報記憶媒体 | |
| JP5841023B2 (ja) | 情報処理装置、情報処理方法、プログラム及び情報記憶媒体 | |
| JP5855481B2 (ja) | 情報処理装置、その制御方法およびその制御プログラム | |
| KR20100058250A (ko) | 모바일 디바이스의 사용자 인터페이스 | |
| WO2013177998A1 (zh) | 一种操控部件,使用该操控部件的信息处理系统及其信息处理方法 | |
| JP5997388B2 (ja) | エミュレーション装置、エミュレーション方法、プログラム及び情報記憶媒体 | |
| JP5773818B2 (ja) | 表示制御装置、表示制御方法及びコンピュータプログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 201180046427.X; Country of ref document: CN |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11828640; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 13825242; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 11828640; Country of ref document: EP; Kind code of ref document: A1 |