WO2010064388A1 - Display input device - Google Patents
Display input device
- Publication number
- WO2010064388A1 PCT/JP2009/006391
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- touch panel
- image
- detection target
- input device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04805—Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0421—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
Definitions
- the present invention particularly relates to a display input device suitable for use in an in-vehicle information device such as a navigation system.
- A touch panel is an electronic component that combines a display device such as a liquid crystal panel with a coordinate position input device such as a touch pad. It is a display input device with which, simply by touching an image area such as an icon displayed on the liquid crystal panel with a finger, the position information of the touched image can be sensed and the device operated, and it is often incorporated in devices that above all need to be handled intuitively, such as in-vehicle navigation systems.
- According to the technique of Patent Document 1, a nearby icon is enlarged when a finger is brought close, so erroneous operation can be prevented and the selection operation is facilitated; however, since the size of the icon changes each time the finger approaches, the operation feels unnatural and, conversely, the operability may be impaired.
- According to the technique of Patent Document 2, when the enlargement/reduction is controlled while the touch panel surface and the finger are far apart, the enlargement/reduction fluctuates because of shaking of the finger in the Z-axis direction, which may make the control difficult.
- According to the technique of Patent Document 3, an easy-to-understand image display is possible even on a touch panel with a small button icon display area, but there is the drawback that peripheral icons other than the pressed button icon are difficult to see.
- The present invention has been made to solve the above-described problems, and an object of the present invention is to provide a display input device that is easy to control and has excellent operability, with no feeling of strangeness in operation.
- To achieve this, a display input device according to the present invention includes a touch panel that displays and inputs an image, a proximity sensor that detects, in a non-contact manner, the movement of a detection target facing the touch panel, and a control unit that, when the proximity sensor detects a predetermined amount of approach of the detection target to the touch panel, processes an image around a certain range of the display area in the vicinity of the detection target on the touch panel and displays it distinguished from the image within the certain range of the display area.
- FIG. 1 is a block diagram showing a configuration of a display input device according to Embodiment 1 of the present invention.
- the display input device according to Embodiment 1 of the present invention includes a touch panel display device (hereinafter abbreviated as a touch panel) 1, an external sensor 2, and a control unit 3.
- the touch panel 1 displays and inputs information.
- a touch sensor 11 that inputs information is stacked on an LCD panel 10 that displays information.
- Further, on the outer periphery of the touch sensor 11, a plurality of proximity sensors 12 that detect, two-dimensionally and in a non-contact manner, the movement of a detection target such as a finger or a pen positioned facing the touch panel 1 are mounted in units of cells.
- When the proximity sensor 12 uses, for example, infrared rays for its detection cells, infrared light-emitting LEDs (Light Emitting Diodes) and light-receiving transistors are arranged facing each other in a grid pattern on the outer periphery of the touch sensor 11, and the approach of the detection target and its coordinate position are detected from the light that is blocked or reflected when the detection target approaches.
- The detection cells of the proximity sensor 12 are not limited to the infrared type described above; for example, a capacitive type that detects an approach from the change in the capacitance formed between the detection target and two flat plates arranged in parallel, like a capacitor, may be used instead. In this case, one side of each plate serves as a ground surface facing the detection target and the other side serves as the sensor detection surface, and the approach of the detection target and its coordinate position can be detected from the change in the capacitance formed between these two poles.
- The external sensor 2 is mounted at various locations in the vehicle and includes at least a GPS (Global Positioning System) sensor 21, a vehicle speed sensor 22, and an acceleration sensor 23.
- The GPS sensor 21 receives radio waves from GPS satellites, generates a signal with which the control unit 3 determines latitude and longitude, and outputs it to the control unit 3.
- the vehicle speed sensor 22 measures a vehicle speed pulse for determining whether or not the vehicle is traveling and outputs the measured vehicle speed pulse to the control unit 3.
- The acceleration sensor 23 is, for example, a sensor that estimates the acceleration applied to a weight attached to a spring by measuring the displacement of the weight. In the case of a three-axis acceleration sensor, it follows acceleration fluctuations from 0 (gravitational acceleration only) up to several hundred Hz, measures the orientation (attitude) with respect to the ground from the sum of the acceleration vectors in the X and Y directions, and outputs it to the control unit 3.
- The control unit 3 has a function of, when the proximity sensor 12 detects a predetermined amount of approach of a detection target such as a finger or a pen to the touch panel 1, processing an image outside a certain range of the display area displayed on the touch panel 1 and displaying it distinguished from the image within that range. To this end, the control unit 3 includes a CPU that performs navigation processing and mainly controls the touch panel 1 (hereinafter referred to as the navigation CPU 30), a drawing circuit 31, a memory 32, and a map DB (Data Base) 33.
- Here, "a certain range of the display area" means, when a software keyboard is displayed in the display area of the touch panel 1 and a detection target such as a finger is brought close to the touch panel 1, the partial arrangement of candidate keys that may be pressed by the detection target. "Outside the certain range of the display area" means all keys of the arrangement except these candidate keys. For convenience, in the following description an image displayed within the certain range of the display area is called an "internal icon", and an image displayed outside that range and processed so as to be distinguished from the internal icon is called an "external icon".
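- As a rough illustration only (not part of the patent), the split between internal and external icons could be derived from the proximity coordinates roughly as in the following Python sketch; the key layout, the `Key` structure, and the fixed-range radius are all hypothetical assumptions.

```python
from dataclasses import dataclass

# Hypothetical key geometry; the patent does not specify a data structure.
@dataclass
class Key:
    label: str
    x: float  # key centre in the same coordinate system as the proximity sensor
    y: float

def split_icons(keys, finger_x, finger_y, range_radius=30.0):
    """Partition keys into 'internal icons' (candidate keys within a certain
    range of the approaching finger) and 'external icons' (all other keys)."""
    internal, external = [], []
    for key in keys:
        dx, dy = key.x - finger_x, key.y - finger_y
        if dx * dx + dy * dy <= range_radius ** 2:
            internal.append(key)   # kept at normal size
        else:
            external.append(key)   # processed (e.g. reduced) so it stands apart
    return internal, external

# Example: a finger hovering near the "G" key of a small QWERTY fragment.
keys = [Key("F", 150, 80), Key("G", 180, 80), Key("H", 210, 80), Key("A", 30, 80)]
internal, external = split_icons(keys, finger_x=185, finger_y=82)
print([k.label for k in internal], [k.label for k in external])
```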
- The navigation CPU 30 performs navigation processing according to a menu, such as route search, that is displayed on the touch panel 1 and selected by the user.
- the navigation CPU 30 refers to the map information stored in the map DB 33 and performs navigation such as route search or destination guidance based on various sensor signals acquired from the external sensor 2.
- Further, when the proximity sensor 12 detects a predetermined amount of approach of a detection target such as a finger or a pen to the touch panel 1, the navigation CPU 30 processes the external icon displayed on the touch panel 1 and displays it distinguished from the internal icon; to do so, it generates image information according to a program stored in the memory 32 and controls the drawing circuit 31.
- the structure of the program executed by the navigation CPU 30 in that case is shown in FIG. 2, and details thereof will be described later.
- The drawing circuit 31 develops the image information generated by the navigation CPU 30 at a constant speed onto a built-in or external bitmap memory unit, and a built-in display control unit reads the image information developed on the bitmap memory unit in synchronization with the display timing of the touch panel 1 (LCD panel 10) and displays it on the touch panel 1.
- the above-described bitmap memory unit and display control unit are shown in FIG. 3 and will be described in detail later.
- In addition to a work area, the memory 32 contains an image information storage area and the like. The map DB 33 stores maps, facility information, and the like necessary for navigation such as route search and guidance.
- FIG. 2 is a functional block diagram showing the structure of a program executed by the navigation CPU 30 of FIG. 1 included in the display input device (control unit 3) according to Embodiment 1 of the present invention.
- The navigation CPU 30 includes a main control unit 300, a proximity coordinate position calculation unit 301, a touch coordinate position calculation unit 302, an image information generation unit 303, an image information transfer unit 304, a UI (User Interface) providing unit 305, and an operation information processing unit 306.
- the proximity coordinate position calculation unit 301 has a function of calculating the XY coordinate position of the finger and passing it to the main control unit 300 when the proximity sensor 12 detects the approach of the finger to the touch panel 1.
- the touch coordinate position calculation unit 302 has a function of calculating the XY coordinate position and delivering it to the main control unit 300 when a touch on the touch panel 1 by a detection target such as a finger is detected by the touch sensor 11.
- the image information generation unit 303 has a function of generating image information to be displayed on the touch panel 1 (LCD panel 10) under the control of the main control unit 300 and outputting the image information to the image information transfer unit 304.
- For example, when a finger approaches the touch panel 1, the image information generation unit 303 processes the image of the external icon displayed on the touch panel 1 so that it is displayed distinguished from the internal icon: it keeps the partial arrangement of candidate keys pressed by the finger (the internal icon) as it is and generates a reduced external icon by thinning out, at a fixed rate, the pixels constituting the key arrangement excluding the candidate keys.
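- A minimal sketch of this pixel-thinning idea, assuming the external icon is available as a two-dimensional list of pixel values (the decimation rate and helper name are illustrative, not taken from the patent):

```python
def thin_out(bitmap, rate=2):
    """Reduce a bitmap by keeping only every `rate`-th pixel in each direction,
    i.e. thinning out the pixels at a fixed rate (no filtering or averaging)."""
    return [row[::rate] for row in bitmap[::rate]]

# A 4x4 external-icon bitmap reduced to 2x2.
icon = [[1, 2, 3, 4],
        [5, 6, 7, 8],
        [9, 10, 11, 12],
        [13, 14, 15, 16]]
print(thin_out(icon))  # [[1, 3], [9, 11]]
```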
- the image information transfer unit 304 has a function of transferring the image information generated by the image information generation unit 303 to the drawing circuit 31 based on timing control by the main control unit 300.
- The method described here performs the reduction by thinning out the pixels of a bitmap image; in the case of a vector image rather than a bitmap, a clean reduced image can instead be obtained by a predetermined reduction calculation. Alternatively, a reduced-size image may be prepared in advance and presented.
- The UI providing unit 305 has a function of displaying a setting screen on the touch panel 1 during environment setting, capturing a user setting input via the touch panel 1, and variably setting the reduction ratio used when the image outside the certain range of the display area is reduced.
- Under the control of the main control unit 300, the operation information processing unit 306 handles the operation information defined for the information of the certain display area indicated by the touch coordinate position calculated by the touch coordinate position calculation unit 302: in the case of a soft keyboard, it generates image information based on the touched key and outputs it to the image information transfer unit 304; in the case of an icon button, it executes navigation processing such as destination search defined for that icon button, generates image information, outputs it to the image information transfer unit 304, and displays it on the touch panel 1 (LCD panel 10).
- A work area of predetermined capacity is allocated in the memory 32, and this work area includes an image information storage area 322 in which the image information generated by the image information generation unit 303 is temporarily stored.
- FIG. 3 is a block diagram showing an internal configuration of the drawing circuit 31 shown in FIG.
- The drawing circuit 31 includes a drawing control unit 310, an image buffer unit 311, a drawing unit 312, a bitmap memory unit 313, and a display control unit 314, which are commonly connected by address, data, and control lines via a local bus 315 composed of a plurality of lines.
- The drawing control unit 310 decodes drawing commands output from the navigation CPU 30 (image information transfer unit 304) and performs drawing preprocessing for straight-line drawing, rectangle drawing, line slopes, and the like. The drawing control unit 310 then activates the drawing unit 312, and the drawing unit 312 writes (draws) the image information decoded by the drawing control unit 310 into the bitmap memory unit 313 at high speed. The display control unit 314 then reads the image information held in the bitmap memory unit 313 via the local bus 315 in synchronization with the display timing of the LCD panel 10 of the touch panel 1 and supplies it to the touch panel 1 (LCD panel 10) to obtain the desired display.
- FIG. 4 is a flowchart showing the operation of the display input device according to Embodiment 1 of the present invention, and FIGS. 5 and 6 show examples of display transitions of the soft keyboard image displayed on the touch panel 1 at that time. Hereinafter, the operation of the display input device according to Embodiment 1 of the present invention shown in FIGS. 1 to 3 will be described in detail with reference to FIGS. 4 to 6.
- First, a soft keyboard used when searching for a facility is displayed in the display area of the touch panel 1 (step ST41).
- When the user brings a finger close to the touch panel 1, the proximity sensor 12 detects the proximity of the finger ("YES" in step ST42), and the proximity coordinate position calculation unit 301 of the navigation CPU 30 calculates the finger coordinates (X, Y) on the touch panel 1 of the finger close to the touch panel 1 and outputs them to the main control unit 300 (step ST43).
- The main control unit 300 that has acquired the finger coordinates activates the image information generation process of the image information generation unit 303; the image information generation unit 303 reduces the external icon image, excluding the part of the software keyboard located in the vicinity of the finger coordinates, combines it with the internal icon image, and updates the display (step ST44).
- That is, in order to reduce the external icon image displayed on the touch panel 1, the image information generation unit 303 reads from the image information storage area 322 the peripheral image information (external icon) of the already generated soft keyboard, excluding the partial area (internal icon) shown, for example, inside the circle in FIG. 5(a), thins it out at a fixed rate, and combines it with the image information of the partial area; as a result, the information of the partial area around the finger coordinate position is emphasized, and the software keyboard image information is generated.
- If the user can set the reduction ratio used when the external icon is reduced, flexible reduction processing becomes possible, which is convenient.
- Specifically, under the control of the main control unit 300, the UI providing unit 305 displays a setting screen on the touch panel 1, captures the operation input by the user, and variably controls the reduction ratio used by the image information generation unit 303 in the reduction processing. The reduction ratio may be set in advance during environment setting or set dynamically according to the usage scene.
- the image information generated by the image information generation unit 303 is stored in the image information storage area 322 of the memory 32 and is output to the image information transfer unit 304.
- The image information transfer unit 304 transfers the image information updated in this way to the drawing circuit 31 together with a drawing command; in the drawing circuit 31, under the control of the drawing control unit 310, the drawing unit 312 develops the transferred image information and draws it into the bitmap memory unit 313 at high speed. The display control unit 314 then reads the updated software keyboard image drawn in the bitmap memory unit 313, shown for example in FIG. 5(a), and displays it on the touch panel 1 (LCD panel 10).
- Thereafter, when a touch on the touch panel 1 by the finger is detected by the touch sensor 11, the touch coordinate position calculation unit 302 calculates the touch coordinate position and the operation information processing unit 306 is activated; the operation information processing unit 306 executes operation processing based on the key corresponding to the touch coordinates calculated by the touch coordinate position calculation unit 302 (step ST46). In this operation processing, in the case of a soft keyboard, image information based on the touched key is generated and output to the image information transfer unit 304; in the case of an icon button, navigation processing such as destination search defined for the icon button is executed to generate image information, which is output to the image information transfer unit 304 and displayed on the touch panel 1 (LCD panel 10).
- As described above, according to the display input device of Embodiment 1, when the proximity sensor 12 detects a predetermined amount of approach of a detection target such as a finger to the touch panel 1, the control unit 3 processes the image (external icon) outside the certain range of the display area displayed on the touch panel 1, for example by reduction processing, and displays it distinguished from the image (internal icon) within that range.
- In the above description, the image outside the certain range of the display area is reduced so as to be distinguished from the image within that range; alternatively, the shape of the external icon displayed on the touch panel 1 may be changed, for example from rectangular to circular, so that it is displayed distinguished from the image of the internal icon.
- Also, a process of narrowing the interval (key interval) between two or more images in the external icon displayed on the touch panel 1 may be performed so that they are distinguished from the image within the certain range of the display area; or, conversely, the interval between two or more images within the certain range of the display area may be enlarged so that they are distinguished from the images outside that range. Either variant can be realized by the image information generation unit 303 reducing or enlarging the images at the positions where the interval of the external icon is changed and updating the image, as illustrated in the sketch below.
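- One way to picture the interval-narrowing variant is the sketch below (a purely hypothetical representation of the key layout as (label, x, y) tuples), which pulls the external keys toward the finger position so that their spacing shrinks while their sizes stay unchanged:

```python
def narrow_intervals(external_keys, finger_x, finger_y, factor=0.7):
    """Pull external keys toward the finger position so that the spacing
    (key interval) between them shrinks by `factor`; key sizes are unchanged."""
    moved = []
    for label, x, y in external_keys:          # keys as (label, x, y) tuples
        moved.append((label,
                      finger_x + (x - finger_x) * factor,
                      finger_y + (y - finger_y) * factor))
    return moved

print(narrow_intervals([("A", 30, 80), ("H", 210, 80)], finger_x=180, finger_y=80))
```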
- In step ST44 the external icon is reduced instantaneously, and when returning from the reduced display to the normal search display (from step ST42 back to step ST41) it is enlarged back instantaneously; however, changing the size gradually, like an animation effect, also gives a comfortable feel of operation. Also, instead of returning the display size to normal as soon as the finger moves away, the display may be returned after a certain time (for example, about 0.5 seconds) has elapsed. However, when the finger moves in the X and Y directions while remaining close to the panel, changing the display content instantly gives a better feel of operation.
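- The delayed return to the normal display could be approximated with a simple hysteresis timer; in the sketch below only the 0.5-second figure comes from the text, while the class and its interface are assumptions for illustration.

```python
import time

RESTORE_DELAY_S = 0.5  # restore the normal display about 0.5 s after the finger leaves

class RestoreTimer:
    """Keeps the reduced display while the finger is near and restores the
    normal display only after the finger has stayed away for RESTORE_DELAY_S."""
    def __init__(self):
        self.left_at = None

    def update(self, finger_near: bool) -> str:
        now = time.monotonic()
        if finger_near:
            self.left_at = None
            return "reduced"             # external icons shown reduced
        if self.left_at is None:
            self.left_at = now           # finger just left; start the timer
        if now - self.left_at >= RESTORE_DELAY_S:
            return "normal"              # back to the normal search display
        return "reduced"                 # keep the reduced display meanwhile
```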
- In the above example, a touch panel display device that detects the proximity of the finger and the touch of the finger is used; however, using a touch panel display device that detects the contact and the pressing of the finger, the device may instead be configured so that the external icon is reduced when touched, returns to the normal size when the touch is released, and a predetermined operation corresponding to the icon is performed when pressed.
- FIG. 7 is a block diagram showing, in functional form, the structure of the program executed by the navigation CPU 30 included in the display input device (control unit 3) according to Embodiment 2 of the present invention.
- The difference from Embodiment 1 shown in FIG. 2 is that a display attribute information generation unit 307 is added to the program structure of the navigation CPU 30 of Embodiment 1, from which the UI providing unit 305 is omitted.
- Under the control of the main control unit 300, the display attribute information generation unit 307 generates, for each piece of image information generated by the image information generation unit 303, attribute information used to apply display modification control to the image based on display attributes such as gradation, color, blinking, inversion, and emphasis, in order to process the external icon displayed on the touch panel 1 and display it distinguished from the internal icon. The display attribute information generated by the display attribute information generation unit 307 is written and stored in the image information storage area 322 of the memory 32 as a set with the image information generated by the image information generation unit 303. The image information transfer unit 304 therefore transfers each set of image information and display attribute information to the drawing circuit 31 based on the timing control of the main control unit 300.
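- As a rough sketch of the gray-scale modification applied to the external icon, assuming the icon region is available as RGB tuples (the luminance weights below are the common ITU-R BT.601 values, not necessarily what the drawing circuit uses):

```python
def to_grayscale(region):
    """Apply a gray-scale (gradation) modification to an external-icon region
    given as a 2D list of (r, g, b) tuples."""
    out = []
    for row in region:
        out.append([(int(0.299 * r + 0.587 * g + 0.114 * b),) * 3
                    for r, g, b in row])
    return out

print(to_grayscale([[(255, 0, 0), (0, 255, 0)]]))  # colours collapse to gray levels
```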
- FIG. 8 is a flowchart showing the operation of the display input device according to Embodiment 2 of the present invention, and FIG. 9 is a diagram showing an example of the software keyboard image displayed on the touch panel 1 at that time. Here, it is assumed that the normal search display screen shown in FIG. 9(a) is displayed on the touch panel 1; the processing from when the user brings a finger close to the touch panel 1 until the finger coordinates (X, Y) are output to the main control unit 300 (steps ST81 to ST83) is the same as steps ST41 to ST43 described in Embodiment 1, so its description is omitted to avoid duplication.
- the control unit 3 performs display modification control based on the display attribute information on the external icon displayed on the touch panel 1, and displays it separately from the internal icon (step ST84).
- Specifically, the main control unit 300 that has acquired the finger coordinates from the proximity coordinate position calculation unit 301 controls the image information generation unit 303 and the display attribute information generation unit 307; based on the acquired finger coordinates, the image information generation unit 303 generates image information in which the external icon of the software keyboard located near the finger coordinates is combined with the internal icon, and the display attribute information generation unit 307 generates display attribute information for applying gray-scale processing to the external icon displayed on the touch panel 1 within the image information generated by the image information generation unit 303.
- The image information generated by the image information generation unit 303 and the display attribute information generated by the display attribute information generation unit 307 are stored as a set in the image information storage area 322 of the memory 32 and output to the image information transfer unit 304. Subsequently, the image information and the display attribute information are transferred from the image information transfer unit 304 to the drawing circuit 31 together with a drawing command; the drawing circuit 31 (drawing control unit 310) that has received the drawing command decodes commands such as straight-line drawing and rectangle drawing and activates the drawing unit 312, and the drawing unit 312 draws the image information decoded by the drawing control unit 310 into the bitmap memory unit 313 at high speed. The display control unit 314 then reads the image information held in the bitmap memory unit 313 in synchronization with the display timing of the LCD panel 10 of the touch panel 1, applies display modification processing by gray scale (gradation control) to the external icon based on the display attribute information generated by the display attribute information generation unit 307 and output by the image information transfer unit 304, and displays the result on the touch panel 1 (LCD panel 10). An example of the software keyboard displayed at this time is shown in FIG. 9.
- When a touch on the touch panel 1 by the finger is detected, the touch coordinate position calculation unit 302 calculates the touch coordinate position and the operation information processing unit 306 is activated. The operation information processing unit 306 executes operation processing based on the key corresponding to the touch coordinates calculated by the touch coordinate position calculation unit 302, and the series of processes described above ends (step ST86).
- As described above, according to the display input device of Embodiment 2, when the proximity sensor 12 detects a predetermined amount of approach of a detection target such as a finger to the touch panel 1, the control unit 3 processes the image (external icon) outside the certain range of the display area displayed on the touch panel 1, for example by gray-scale processing, and displays it distinguished from the image (internal icon) within that range; the internal icon is thus emphasized, the input operation becomes easier, and operability is improved.
- FIG. 10 is a flowchart showing the operation of the display input device according to Embodiment 3 of the present invention. As in Embodiment 1, Embodiment 3 uses the same configuration as shown in FIG. 1 and the same program structure as shown in FIG. 2.
- the display input device according to the third embodiment described here is applied to a three-dimensional touch panel that can also measure the distance in the Z direction between the panel surface and the finger. Therefore, the touch panel 1 capable of detecting the position in the XY directions shown in FIG. 1 is replaced with a three-dimensional touch panel capable of measuring the distance in the Z direction.
- a technique for measuring a three-dimensional position is disclosed in Patent Document 2 described above, and will be described here as an application of this technique.
- First, as in Embodiments 1 and 2, a soft keyboard used for facility search is displayed on the touch panel 1 (step ST101). When the user brings a finger close to the touch panel 1, the proximity sensor 12 detects the proximity of the finger ("YES" in step ST102), the proximity coordinate position calculation unit 301 of the navigation CPU 30 operates, calculates the finger coordinates (X, Y, Z) including the Z axis, and outputs them to the main control unit 300 (step ST103).
- The main control unit 300 that has acquired the three-dimensional finger coordinates determines the reduction ratio according to the distance in the Z-axis (vertical) direction between the touch panel and the facing finger, detected by the proximity sensor 12, and reduces and displays the image outside the certain range of the display area displayed on the touch panel (step ST104). That is, based on the acquired finger coordinates in the X and Y directions, the image information generation unit 303 reduces the external icons, excluding the part of the software keyboard located in the vicinity of the finger coordinates, according to the reduction ratio determined by the coordinate in the Z direction, combines them with the internal icon, and updates the display.
- The relationship between the distance in the Z-axis direction between the panel surface of the touch panel 1 and the finger (horizontal axis) and the reduction ratio (vertical axis) used at this time is shown in the graph of FIG. 11. The reduction ratio is at its maximum of 1 (display at normal size) when the distance in the Z-axis direction is 4 cm; it gradually decreases as the distance shrinks from 4 cm toward 1 cm, and between 1 cm and 0 cm it hardly changes, remaining at a reduction ratio of 0.5 or less. A reduction ratio of 1.0 means the original size, and a reduction ratio of 0.5 means that the length of each side becomes 0.5 times.
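- Interpreting the curve of FIG. 11, the mapping from Z-axis distance to reduction ratio could be sketched as below; the linear interpolation between 4 cm and 1 cm is an assumption, since the text only states the endpoints and the plateau below 1 cm.

```python
def reduction_ratio(z_cm: float) -> float:
    """Map the finger-to-panel distance (cm) to an external-icon reduction ratio:
    1.0 (normal size) at 4 cm or more, falling to 0.5 at 1 cm, and staying at
    0.5 between 1 cm and 0 cm (the plateau is assumed flat here)."""
    if z_cm >= 4.0:
        return 1.0
    if z_cm <= 1.0:
        return 0.5
    # Assumed linear interpolation between (4 cm, 1.0) and (1 cm, 0.5).
    return 0.5 + (z_cm - 1.0) * (1.0 - 0.5) / (4.0 - 1.0)

for z in (5.0, 4.0, 2.5, 1.0, 0.3):
    print(z, round(reduction_ratio(z), 2))
```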
- When the finger touches the touch panel 1, the touch coordinate position calculation unit 302 calculates the touch coordinate position, and the processing in which the operation information processing unit 306 executes operation processing based on the key corresponding to the touch coordinates calculated by the touch coordinate position calculation unit 302 (step ST106) is the same as in Embodiment 1 shown in FIG. 4.
- As described above, according to the display input device of Embodiment 3, when the proximity sensor 12 detects a predetermined amount of approach of a detection target such as a finger to the touch panel 1, the control unit 3 reduces and displays the image (external icon) outside the certain range of the display area displayed on the touch panel according to a reduction ratio corresponding to the vertical distance of the detection target facing the touch panel; the internal icon is thus emphasized, the input operation becomes easier, and operability is improved.
- The external icon is not limited to being reduced according to the distance in the Z-axis direction; for example, the level of a display attribute such as gray scale may instead be changed according to the distance in the Z-axis direction.
- As described above, according to the display input devices of Embodiments 1 to 3, when the proximity sensor 12 detects a predetermined amount of approach of the detection target to the touch panel 1, the control unit 3 processes the image (external icon) outside the certain range of the display area displayed on the touch panel 1 and displays it distinguished from the image (internal icon) within that range, which keeps the processing load on the control unit 3 small; it is therefore possible to provide a display input device that has excellent operability and no feeling of strangeness in operation.
- In the above description, only the software keyboard was described as the information displayed in one or more certain ranges of the display area; however, the information is not limited to the software keyboard and may be any specific information displayed in an arbitrary display area of the touch panel 1. Further, although only a finger was illustrated as the detection target, the same effect can be obtained with a detection object such as a pen instead of a finger.
- In the above description, the display input device is applied to an in-vehicle information device such as a navigation system; however, the present invention may also be applied to input/output means of a personal computer or an FA (Factory Automation) computer, or to a guidance system for a public institution, an event venue, or the like.
- The control unit 3 (navigation CPU 30) shown in FIGS. 2 and 7 may be realized entirely by hardware, or at least a part of it may be realized by software.
- For example, the data processing in which the control unit 3, when the proximity sensor 12 detects a predetermined amount of approach of the detection target to the touch panel 1, processes an image (external icon) outside the certain range of the display area displayed on the touch panel 1 and displays it distinguished from the image (internal icon) within that range may be realized on a computer by one or more programs, and at least a part of it may be realized by hardware.
- As described above, the display input device according to the present invention is easy to control and has excellent operability without a feeling of strangeness in operation, and is therefore suitable for use in an in-vehicle information device such as a navigation system.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
- Input From Keyboards Or The Like (AREA)
- Navigation (AREA)
Abstract
Description
For example, a display input device that enlarges key switches located near a finger when the finger is brought close, thereby facilitating selection operations (see, for example, Patent Document 1), a CRT device that detects the vertical distance and displays information at a magnification corresponding to that distance (see, for example, Patent Document 2), and a display device and display method in which, by an animation function, surrounding button icons rotate, converge on the pressed button icon, and are displayed there (see, for example, Patent Document 3) are known.
Furthermore, according to the technique disclosed in Patent Document 3, an easy-to-understand image display is possible on a touch panel with a small button icon display area, but there is the drawback that peripheral icons other than the pressed button icon are difficult to see.
Embodiment 1.
FIG. 1 is a block diagram showing the configuration of a display input device according to Embodiment 1 of the present invention. As shown in FIG. 1, the display input device according to Embodiment 1 of the present invention is composed of a touch panel display device (hereinafter abbreviated as a touch panel) 1, an external sensor 2, and a control unit 3.
The detection cells of the proximity sensor 12 are not limited to the infrared type described above; for example, a capacitive type that detects an approach from the change in the capacitance formed between the detection target and two flat plates arranged in parallel, like a capacitor, may be used instead. In this case, one side of each plate serves as a ground surface facing the detection target and the other side serves as the sensor detection surface, and the approach of the detection target and its coordinate position can be detected from the change in the capacitance formed between these two poles.
The GPS sensor 21 receives radio waves from GPS satellites, generates a signal with which the control unit 3 determines latitude and longitude, and outputs it to the control unit 3. The vehicle speed sensor 22 measures vehicle speed pulses used to determine whether or not the vehicle is traveling and outputs them to the control unit 3. The acceleration sensor 23 is, for example, a sensor that estimates the acceleration applied to a weight attached to a spring by measuring the displacement of the weight; in the case of a three-axis acceleration sensor, it follows acceleration fluctuations from 0 (gravitational acceleration only) up to several hundred Hz, measures the orientation (attitude) with respect to the ground from the sum of the acceleration vectors in the X and Y directions, and outputs it to the control unit 3.
For this purpose, the control unit 3 is composed of a CPU that performs navigation processing and mainly controls the touch panel 1 (hereinafter referred to as the navigation CPU 30), a drawing circuit 31, a memory 32, and a map DB (Data Base) 33.
Further, in order to realize its function as the control unit 3 that, when the proximity sensor 12 detects a predetermined amount of approach of a detection target such as a finger or a pen to the touch panel 1, processes the external icon displayed on the touch panel 1 and displays it distinguished from the internal icon, the navigation CPU 30 generates image information according to a program stored in the memory 32 and controls the drawing circuit 31. The structure of the program executed by the navigation CPU 30 in that case is shown in FIG. 2 and will be described in detail later.
The above-described bitmap memory unit and display control unit are shown in FIG. 3 and will be described in detail later.
The map DB 33 stores maps, facility information, and the like necessary for navigation such as route search and guidance.
As shown in FIG. 2, the navigation CPU 30 includes a main control unit 300, a proximity coordinate position calculation unit 301, a touch coordinate position calculation unit 302, an image information generation unit 303, an image information transfer unit 304, a UI (User Interface) providing unit 305, and an operation information processing unit 306.
The touch coordinate position calculation unit 302 has a function of calculating the XY coordinate position of a touch on the touch panel 1 by a detection target such as a finger when the touch is detected by the touch sensor 11, and passing it to the main control unit 300.
In order to process the image of the external icon displayed on the touch panel 1 and display it distinguished from the internal icon, the image information generation unit 303, for example, when a finger approaches the touch panel 1, keeps the partial arrangement of candidate keys pressed by the finger (the internal icon) as it is and generates a reduced external icon by thinning out, at a fixed rate, the pixels constituting the key arrangement excluding the candidate keys. The image information generated by combining the external icon updated by thinning out the pixels of the original image at a fixed rate with the internal icon is output to the drawing circuit 31 together with a drawing command. The image information transfer unit 304 has a function of transferring the image information generated by the image information generation unit 303 to the drawing circuit 31 based on timing control by the main control unit 300. The reduction method described here thins out a bitmap image; in the case of a vector image instead of a bitmap image, a clean reduced image can be obtained by a predetermined reduction calculation. A reduced-size image may also be prepared in advance and presented.
Under the control of the main control unit 300, the operation information processing unit 306 has a function of handling the operation information defined for the information of the certain display area indicated by the touch coordinate position calculated by the touch coordinate position calculation unit 302: in the case of a soft keyboard, it generates image information based on the touched key and outputs it to the image information transfer unit 304, and in the case of an icon button, it executes navigation processing such as destination search defined for that icon button, generates image information, outputs it to the image information transfer unit 304, and displays it on the touch panel 1 (LCD monitor 10).
The display control unit 314 reads the image information held in the bitmap memory unit 313 via the local bus 315 in synchronization with the display timing of the LCD panel 10 of the touch panel 1 and supplies it to the touch panel 1 (LCD panel 10) to obtain the desired display.
Hereinafter, the operation of the display input device according to Embodiment 1 of the present invention shown in FIGS. 1 to 3 will be described in detail with reference to FIGS. 4 to 6.
Here, the proximity coordinate position calculation unit 301 calculates the finger coordinates (X, Y) on the touch panel 1 of the finger close to the touch panel 1 and outputs them to the main control unit 300 (step ST43).
That is, in order to reduce the external icon image displayed on the touch panel 1, the image information generation unit 303 reads from the image information storage area 322 of the memory 32 the adjacent peripheral image information (external icon) of the already generated soft keyboard, excluding the partial area (internal icon) shown, for example, inside the circle in FIG. 5(a), thins it out at a fixed rate, and combines it with the image information within the partial area; as a result, the information of the partial area around the finger coordinate position is emphasized, and the software keyboard image information is generated.
Specifically, under the control of the main control unit 300, the UI providing unit 305 displays a setting screen on the touch panel 1, captures the operation input by the user, and variably controls the reduction ratio used by the image information generation unit 303 in the reduction processing. The reduction ratio may be set in advance during environment setting or set dynamically according to the usage scene.
The image information transfer unit 304 transfers the image information updated in this way to the drawing circuit 31 together with a drawing command; in the drawing circuit 31, under the control of the drawing control unit 310, the drawing unit 312 develops the transferred image information and draws it into the bitmap memory unit 313 at high speed. The display control unit 314 then reads the updated software keyboard image drawn in the bitmap memory unit 313, shown for example in FIG. 5(a), and displays it on the touch panel 1 (LCD panel 10).
Also, as shown in FIG. 6(a), a process of narrowing the interval (key interval) between two or more images in the external icon displayed on the touch panel 1 may be performed so that they are displayed distinguished from the image of the certain range of the display area; or, as shown in FIG. 6(b), the interval between two or more images within the certain range of the display area may be enlarged so that they are displayed distinguished from the images outside that range. Either can be realized by the image information generation unit 303 described above applying reduction or enlargement processing to the images at the positions where the interval of the external icon is changed and updating the image.
In step ST44 the external icon is reduced instantaneously, and when returning from the reduced display to the normal search display (from step ST42 back to step ST41) it is enlarged or reduced instantaneously; however, changing the size gradually, like an animation effect, also gives a comfortable feel of operation. Also, instead of returning the display size to normal as soon as the finger moves away, the display may be returned after a certain time (for example, about 0.5 seconds) has elapsed. However, when the finger moves in the X and Y axis directions while remaining close, changing the display content instantly gives a better feel of operation.
In the above example, a touch panel display device that detects the proximity of the finger and the touch of the finger was used; however, using a touch panel display device that detects the contact and the pressing of the finger, the device may be configured so that the external icon is reduced and displayed when touched, returns to the normal size when the touch is released, and a predetermined operation corresponding to the icon is performed when pressed.
FIG. 7 is a block diagram showing, in functional form, the structure of the program executed by the navigation CPU 30 included in the display input device (control unit 3) according to Embodiment 2 of the present invention.
In the display input device according to Embodiment 2 of the present invention, the difference from Embodiment 1 shown in FIG. 2 is that a display attribute information generation unit 307 is added to the program structure of the navigation CPU 30 of Embodiment 1, from which the UI providing unit 305 is omitted.
The display attribute information generation unit 307 writes and stores the display attribute information it generates in the image information storage area 322 of the memory 32 as a set with the image information generated by the image information generation unit 303. The image information transfer unit 304 therefore transfers each set of image information generated by the image information generation unit 303 and display attribute information generated by the display attribute information generation unit 307 to the drawing circuit 31 based on timing control by the main control unit 300.
Hereinafter, the operation of the display input device according to Embodiment 2 of the present invention will be described with reference to FIGS. 8 and 9, focusing in particular on the differences from the operation of Embodiment 1.
Specifically, the main control unit 300 that has acquired the finger coordinates from the proximity coordinate position calculation unit 301 controls the image information generation unit 303 and the display attribute information generation unit 307; based on the acquired finger coordinates, the image information generation unit 303 generates image information in which the external icon of the software keyboard located near the finger coordinates is combined with the internal icon, and the display attribute information generation unit 307 generates display attribute information for applying gray-scale processing to the external icon displayed on the touch panel 1 within the image information generated by the image information generation unit 303.
Subsequently, the image information and the display attribute information transferred from the image information transfer unit 304 are transferred to the drawing circuit 31 together with a drawing command; the drawing circuit 31 (drawing control unit 310) that has received the drawing command decodes commands such as straight-line drawing and rectangle drawing and activates the drawing unit 312, and the drawing unit 312 draws the image information decoded by the drawing control unit 310 into the bitmap memory unit 313 at high speed.
An example of the software keyboard displayed at this time is shown in FIG. 9.
According to the display input device of Embodiment 2 described above, the external icon and the internal icon are distinguished by applying gray-scale processing; however, the distinction is not limited to gradation control and may instead be made by other display attribute control such as color, blinking, inversion, or emphasis.
FIG. 10 is a flowchart showing the operation of the display input device according to Embodiment 3 of the present invention. Embodiment 3 described below uses the same configuration as the display input device shown in FIG. 1 and the same program structure as shown in FIG. 2, as in Embodiment 1.
However, the display input device according to Embodiment 3 described here is applied to a three-dimensional touch panel that can also measure the distance in the Z direction between the panel surface and the finger. Therefore, the touch panel 1 shown in FIG. 1, which can detect positions in the XY directions, is replaced with a three-dimensional touch panel that can also measure the distance in the Z direction. A technique for measuring a three-dimensional position is disclosed in Patent Document 2 described above, and the following description assumes that this technique is applied.
That is, based on the acquired finger coordinates in the XY directions, the image information generation unit 303 reduces the external icons, excluding the partial area of the software keyboard located in the vicinity of the finger coordinates, according to the reduction ratio determined by the coordinate in the Z direction, combines them with the internal icon, and updates the display. The relationship between the distance in the Z-axis direction between the panel surface of the touch panel 1 and the finger (horizontal axis) and the reduction ratio (vertical axis) used at this time is shown in the graph of FIG. 11. As shown in FIG. 11, the reduction ratio is at its maximum of 1 (display at normal size) when the distance in the Z-axis direction is 4 cm; it gradually decreases as the distance approaches from 4 cm toward 1 cm, and between 1 cm and 0 cm it hardly changes, remaining at a reduction ratio of 0.5 or less. In FIG. 11, a reduction ratio of 1.0 means the original size, and a reduction ratio of 0.5 means that the length of each side becomes 0.5 times.
The external icon is not limited to being reduced according to the distance in the Z-axis direction; for example, the level of a display attribute such as gray scale may be changed according to the distance in the Z-axis direction.
According to the display input devices of Embodiments 1 to 3 described above, only the software keyboard was described as the information displayed in one or more certain ranges of the display area; however, the information is not limited to the software keyboard and may be any specific information displayed in an arbitrary display area of the touch panel 1. Further, although only a finger was illustrated as the detection target, the same effect can be obtained with a detection object such as a pen instead of a finger.
For example, the data processing in which the control unit 3, when the proximity sensor 12 detects a predetermined amount of approach of the detection target to the touch panel 1, processes an image (external icon) outside the certain range of the display area displayed on the touch panel 1 and displays it distinguished from the image (internal icon) within the certain range of the display area may be realized on a computer by one or more programs, and at least a part of it may be realized by hardware.
Claims (8)
- 1. A display input device comprising: a touch panel that displays and inputs an image; a proximity sensor that detects, in a non-contact manner, the movement of a detection target positioned facing the touch panel; and a control unit that, when the proximity sensor detects a predetermined amount of approach of the detection target to the touch panel, processes an image around a certain range of the display area in the vicinity of the detection target on the touch panel and displays it distinguished from the image within the certain range of the display area.
- 2. The display input device according to claim 1, wherein the control unit reduces the image around the certain range of the display area in the vicinity of the detection target on the touch panel and displays it distinguished from the image within the certain range of the display area.
- 3. The display input device according to claim 1, wherein the control unit changes the reduction ratio used when reducing the image around the certain range of the display area in the vicinity of the detection target on the touch panel according to a user setting input via the touch panel.
- 4. The display input device according to claim 1, wherein the touch panel displays a plurality of operation keys, and the control unit performs processing that narrows the interval between the operation keys around the certain range of the display area in the vicinity of the detection target on the touch panel and displays them distinguished from the image of the certain range of the display area in the vicinity of the detection target.
- 5. The display input device according to claim 4, wherein the control unit enlarges the interval between the operation keys within the certain range of the display area in the vicinity of the detection target displayed on the touch panel and displays them distinguished from the image around the certain range of the display area in the vicinity of the detection target.
- 6. The display input device according to claim 1, wherein the control unit changes the shape of the image around the certain range of the display area in the vicinity of the detection target on the touch panel and displays it distinguished from the image of the certain range of the display area in the vicinity of the detection target.
- 7. The display input device according to claim 1, wherein the control unit applies modification processing based on a display attribute to the image around the certain range of the display area in the vicinity of the detection target on the touch panel and displays it distinguished from the image of the certain range of the display area in the vicinity of the detection target.
- 8. The display input device according to claim 1, wherein the control unit detects, with the proximity sensor, the vertical distance of the detection target facing the touch panel and reduces and displays the image around the certain range of the display area in the vicinity of the detection target on the touch panel according to a reduction ratio that changes with the vertical distance.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/129,533 US20110221776A1 (en) | 2008-12-04 | 2009-11-26 | Display input device and navigation device |
| CN200980149045.2A CN102239470B (zh) | 2008-12-04 | 2009-11-26 | 显示输入装置及导航装置 |
| JP2010541213A JP5231571B2 (ja) | 2008-12-04 | 2009-11-26 | 表示入力装置およびナビゲーション装置 |
| DE112009003521T DE112009003521T5 (de) | 2008-12-04 | 2009-11-26 | Anzeigeeingabevorrichtung |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2008309789 | 2008-12-04 | ||
| JP2008-309789 | 2008-12-04 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2010064388A1 true WO2010064388A1 (ja) | 2010-06-10 |
Family
ID=42233047
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2009/006391 Ceased WO2010064388A1 (ja) | 2008-12-04 | 2009-11-26 | 表示入力装置 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US20110221776A1 (ja) |
| JP (2) | JP5231571B2 (ja) |
| CN (1) | CN102239470B (ja) |
| DE (1) | DE112009003521T5 (ja) |
| WO (1) | WO2010064388A1 (ja) |
Cited By (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2012008954A (ja) * | 2010-06-28 | 2012-01-12 | Brother Ind Ltd | 入力装置、複合機および入力制御プログラム |
| JP2012032853A (ja) * | 2010-07-28 | 2012-02-16 | Sony Corp | 情報処理装置、情報処理方法およびコンピュータプログラム |
| JP2012190261A (ja) * | 2011-03-10 | 2012-10-04 | Panasonic Corp | 近接操作支援装置 |
| JP2012203676A (ja) * | 2011-03-25 | 2012-10-22 | Ntt Docomo Inc | 携帯端末および画面表示変更方法 |
| JP2012208633A (ja) * | 2011-03-29 | 2012-10-25 | Ntt Docomo Inc | 情報端末、表示制御方法及び表示制御プログラム |
| JP5189709B2 (ja) * | 2010-07-07 | 2013-04-24 | パナソニック株式会社 | 端末装置およびgui画面生成方法 |
| WO2013067776A1 (zh) * | 2011-11-08 | 2013-05-16 | 中兴通讯股份有限公司 | 一种终端显示界面的控制方法及终端 |
| JP2013143144A (ja) * | 2012-01-09 | 2013-07-22 | Samsung Electronics Co Ltd | ディスプレイ装置およびそのアイテム選択方法 |
| JP2013196203A (ja) * | 2012-03-16 | 2013-09-30 | Fujitsu Ltd | 入力制御装置、入力制御プログラム、及び入力制御方法 |
| JP2013539113A (ja) * | 2010-08-24 | 2013-10-17 | クアルコム,インコーポレイテッド | 電子デバイスディスプレイの上方の空気中で物体を移動させることによって電子デバイスアプリケーションと相互作用するための方法および装置 |
| JP2014170337A (ja) * | 2013-03-04 | 2014-09-18 | Mitsubishi Electric Corp | 情報表示制御装置、情報表示装置および情報表示制御方法 |
| JP2015099436A (ja) * | 2013-11-18 | 2015-05-28 | 三菱電機株式会社 | インターフェース装置 |
| JP2018010660A (ja) * | 2017-08-24 | 2018-01-18 | 三菱電機株式会社 | 端末用プログラム |
| JP2020107031A (ja) * | 2018-12-27 | 2020-07-09 | 株式会社デンソー | 指示ジェスチャ検出装置、およびその検出方法 |
Families Citing this family (74)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8381135B2 (en) | 2004-07-30 | 2013-02-19 | Apple Inc. | Proximity detector in handheld device |
| US20120249463A1 (en) * | 2010-06-04 | 2012-10-04 | Smart Technologies Ulc | Interactive input system and method |
| FR2971066B1 (fr) | 2011-01-31 | 2013-08-23 | Nanotec Solution | Interface homme-machine tridimensionnelle. |
| US9417754B2 (en) | 2011-08-05 | 2016-08-16 | P4tents1, LLC | User interface system, method, and computer program product |
| TWI461990B (zh) * | 2011-08-30 | 2014-11-21 | Wistron Corp | 光學影像式觸控裝置與觸控影像處理方法 |
| JP5978592B2 (ja) * | 2011-10-26 | 2016-08-24 | ソニー株式会社 | ヘッド・マウント・ディスプレイ及び表示制御方法 |
| JP5880024B2 (ja) * | 2011-12-22 | 2016-03-08 | 株式会社バッファロー | 情報処理装置及びプログラム |
| US9594499B2 (en) * | 2012-02-21 | 2017-03-14 | Nokia Technologies Oy | Method and apparatus for hover-based spatial searches on mobile maps |
| US9378581B2 (en) * | 2012-03-13 | 2016-06-28 | Amazon Technologies, Inc. | Approaches for highlighting active interface elements |
| KR20130115737A (ko) * | 2012-04-13 | 2013-10-22 | 삼성전자주식회사 | 디스플레이장치 및 그 제어방법 |
| EP3410287B1 (en) | 2012-05-09 | 2022-08-17 | Apple Inc. | Device, method, and graphical user interface for selecting user interface objects |
| WO2013169845A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for scrolling nested regions |
| WO2013169875A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for displaying content associated with a corresponding affordance |
| WO2013169849A2 (en) | 2012-05-09 | 2013-11-14 | Industries Llc Yknots | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
| WO2013169854A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object |
| WO2013169865A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
| WO2013169842A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for selecting object within a group of objects |
| WO2013169851A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for facilitating user interaction with controls in a user interface |
| CN107977084B (zh) | 2012-05-09 | 2021-11-05 | 苹果公司 | 用于针对在用户界面中执行的操作提供触觉反馈的方法和装置 |
| DE202013012233U1 (de) | 2012-05-09 | 2016-01-18 | Apple Inc. | Vorrichtung und grafische Benutzerschnittstelle zum Anzeigen zusätzlicher Informationen in Antwort auf einen Benutzerkontakt |
| WO2013169843A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for manipulating framed graphical objects |
| HK1208275A1 (en) | 2012-05-09 | 2016-02-26 | 苹果公司 | Device, method, and graphical user interface for moving and dropping a user interface object |
| KR101683868B1 (ko) | 2012-05-09 | 2016-12-07 | 애플 인크. | 제스처에 응답하여 디스플레이 상태들 사이를 전이하기 위한 디바이스, 방법, 및 그래픽 사용자 인터페이스 |
| US11073959B2 (en) * | 2012-06-08 | 2021-07-27 | Apple Inc. | Simulating physical materials and light interaction in a user interface of a resource-constrained device |
| CN102915206B (zh) * | 2012-09-19 | 2015-08-12 | 东莞宇龙通信科技有限公司 | 屏幕键盘的按键大小调整方法和系统 |
| US9411510B2 (en) * | 2012-12-07 | 2016-08-09 | Apple Inc. | Techniques for preventing typographical errors on soft keyboards |
| HK1215094A1 (zh) | 2012-12-29 | 2016-08-12 | Apple Inc. | 用於根據具有模擬三維特徵的控制圖標的外觀變化來移動光標的設備、方法和圖形用戶界面 |
| CN104903834B (zh) | 2012-12-29 | 2019-07-05 | 苹果公司 | 用于在触摸输入到显示输出关系之间过渡的设备、方法和图形用户界面 |
| WO2014105279A1 (en) | 2012-12-29 | 2014-07-03 | Yknots Industries Llc | Device, method, and graphical user interface for switching between user interfaces |
| CN104885050B (zh) | 2012-12-29 | 2017-12-08 | 苹果公司 | 用于确定是滚动还是选择内容的设备、方法和图形用户界面 |
| EP3467634B1 (en) | 2012-12-29 | 2020-09-23 | Apple Inc. | Device, method, and graphical user interface for navigating user interface hierarchies |
| CN104903835B (zh) | 2012-12-29 | 2018-05-04 | 苹果公司 | 用于针对多接触手势而放弃生成触觉输出的设备、方法和图形用户界面 |
| KR20140087731A (ko) * | 2012-12-31 | 2014-07-09 | 엘지전자 주식회사 | 포터블 디바이스 및 사용자 인터페이스 제어 방법 |
| EP2759921B1 (en) * | 2013-01-25 | 2020-09-23 | Morpho, Inc. | Image display apparatus, image displaying method and program |
| FR3002052B1 (fr) * | 2013-02-14 | 2016-12-09 | Fogale Nanotech | Procede et dispositif pour naviguer dans un ecran d'affichage et appareil comprenant une telle navigation |
| US20140240242A1 (en) * | 2013-02-26 | 2014-08-28 | Honeywell International Inc. | System and method for interacting with a touch screen interface utilizing a hover gesture controller |
| US10120540B2 (en) * | 2013-03-14 | 2018-11-06 | Samsung Electronics Co., Ltd. | Visual feedback for user interface navigation on television system |
| US10275084B2 (en) | 2013-03-27 | 2019-04-30 | Hyon Jo Ji | Touch control method in mobile terminal having large screen |
| WO2014157961A1 (ko) * | 2013-03-27 | 2014-10-02 | Ji Man Suk | 대형화면을 갖는 휴대단말기에서의 터치제어방법 |
| US20140327645A1 (en) * | 2013-05-06 | 2014-11-06 | Nokia Corporation | Touchscreen accessory attachment |
| US9921739B2 (en) * | 2014-03-03 | 2018-03-20 | Microchip Technology Incorporated | System and method for gesture control |
| KR101655810B1 (ko) | 2014-04-22 | 2016-09-22 | 엘지전자 주식회사 | 차량용 디스플레이 장치 |
| KR102324083B1 (ko) * | 2014-09-01 | 2021-11-09 | 삼성전자주식회사 | 화면 확대 제공 방법 및 그 전자 장치 |
| US10042445B1 (en) * | 2014-09-24 | 2018-08-07 | Amazon Technologies, Inc. | Adaptive display of user interface elements based on proximity sensing |
| JP6452409B2 (ja) * | 2014-11-28 | 2019-01-16 | キヤノン株式会社 | 画像表示装置、画像表示方法 |
| KR102337216B1 (ko) | 2015-01-05 | 2021-12-08 | 삼성전자주식회사 | 영상 표시 장치 및 영상 표시 방법 |
| US9632664B2 (en) | 2015-03-08 | 2017-04-25 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
| US9645732B2 (en) | 2015-03-08 | 2017-05-09 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
| US10095396B2 (en) | 2015-03-08 | 2018-10-09 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
| US9990107B2 (en) | 2015-03-08 | 2018-06-05 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
| US10048757B2 (en) | 2015-03-08 | 2018-08-14 | Apple Inc. | Devices and methods for controlling media presentation |
| US9785305B2 (en) | 2015-03-19 | 2017-10-10 | Apple Inc. | Touch input cursor manipulation |
| US9639184B2 (en) | 2015-03-19 | 2017-05-02 | Apple Inc. | Touch input cursor manipulation |
| US10067653B2 (en) | 2015-04-01 | 2018-09-04 | Apple Inc. | Devices and methods for processing touch inputs based on their intensities |
| US20170045981A1 (en) | 2015-08-10 | 2017-02-16 | Apple Inc. | Devices and Methods for Processing Touch Inputs Based on Their Intensities |
| US9860451B2 (en) | 2015-06-07 | 2018-01-02 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
| US10200598B2 (en) | 2015-06-07 | 2019-02-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
| US9891811B2 (en) | 2015-06-07 | 2018-02-13 | Apple Inc. | Devices and methods for navigating between user interfaces |
| US9674426B2 (en) | 2015-06-07 | 2017-06-06 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
| US10346030B2 (en) | 2015-06-07 | 2019-07-09 | Apple Inc. | Devices and methods for navigating between user interfaces |
| US9830048B2 (en) | 2015-06-07 | 2017-11-28 | Apple Inc. | Devices and methods for processing touch inputs with instructions in a web page |
| US9880735B2 (en) | 2015-08-10 | 2018-01-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
| US10235035B2 (en) | 2015-08-10 | 2019-03-19 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
| US10416800B2 (en) | 2015-08-10 | 2019-09-17 | Apple Inc. | Devices, methods, and graphical user interfaces for adjusting user interface objects |
| US10248308B2 (en) | 2015-08-10 | 2019-04-02 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interfaces with physical gestures |
| JP6520817B2 (ja) * | 2016-05-10 | 2019-05-29 | 株式会社デンソー | 車両用操作装置 |
| KR20170138279A (ko) * | 2016-06-07 | 2017-12-15 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
| CN106201306B (zh) * | 2016-06-27 | 2019-11-26 | 联想(北京)有限公司 | 一种控制方法及电子设备 |
| CN109416612A (zh) * | 2016-08-05 | 2019-03-01 | 京瓷办公信息系统株式会社 | 显示输入装置、图像形成装置、显示输入装置的控制方法 |
| US10146495B2 (en) * | 2016-12-21 | 2018-12-04 | Curt A Nizzoli | Inventory management system |
| JP7349441B2 (ja) * | 2018-09-19 | 2023-09-22 | 富士フイルム株式会社 | タッチパネルディスプレイ付きデバイス及びその制御方法、並びにプログラム |
| JP6568331B1 (ja) * | 2019-04-17 | 2019-08-28 | 京セラ株式会社 | 電子機器、制御方法、及びプログラム |
| JP6816798B2 (ja) * | 2019-08-22 | 2021-01-20 | 富士ゼロックス株式会社 | 表示装置及びプログラム |
| FR3124872B1 (fr) * | 2021-07-02 | 2024-11-29 | Faurecia Interieur Ind | Dispositif électronique et procédé d'affichage de données sur un écran d’affichage, système d’affichage, véhicule et programme d’ordinateur associés |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2006103357A (ja) * | 2004-09-30 | 2006-04-20 | Mazda Motor Corp | 車両用情報表示装置 |
Family Cites Families (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5963671A (en) * | 1991-11-27 | 1999-10-05 | International Business Machines Corporation | Enhancement of soft keyboard operations using trigram prediction |
| TWI238348B (en) * | 2002-05-13 | 2005-08-21 | Kyocera Corp | Portable information terminal, display control device, display control method, and recording media |
| KR20030097310A (ko) * | 2002-06-20 | 2003-12-31 | 삼성전자주식회사 | 디스플레이장치의 화상크기조절방법 및 그화상크기조절시스템과 화상크기조절방법을 수행하는프로그램이 저장된 기록매체 |
| US8042044B2 (en) * | 2002-11-29 | 2011-10-18 | Koninklijke Philips Electronics N.V. | User interface with displaced representation of touch area |
| JP3846432B2 (ja) | 2003-02-26 | 2006-11-15 | ソニー株式会社 | 表示装置、表示方法及びそのプログラム |
| US6990637B2 (en) * | 2003-10-23 | 2006-01-24 | Microsoft Corporation | Graphical user interface for 3-dimensional view of a data collection based on an attribute of the data |
| US7432911B2 (en) * | 2004-02-26 | 2008-10-07 | Research In Motion Limited | Keyboard for mobile devices |
| JP4037378B2 (ja) * | 2004-03-26 | 2008-01-23 | シャープ株式会社 | 情報処理装置、画像出力装置、情報処理プログラムおよび記録媒体 |
| EP1596271A1 (en) * | 2004-05-11 | 2005-11-16 | Hitachi Europe S.r.l. | Method for displaying information and information display system |
| JP2006031499A (ja) * | 2004-07-20 | 2006-02-02 | Denso Corp | 情報入力表示装置 |
| US7443316B2 (en) * | 2005-09-01 | 2008-10-28 | Motorola, Inc. | Entering a character into an electronic device |
| US20070209025A1 (en) * | 2006-01-25 | 2007-09-06 | Microsoft Corporation | User interface for viewing images |
| JP4876982B2 (ja) * | 2007-03-07 | 2012-02-15 | 日本電気株式会社 | 表示装置および携帯情報機器 |
-
2009
- 2009-11-26 WO PCT/JP2009/006391 patent/WO2010064388A1/ja not_active Ceased
- 2009-11-26 DE DE112009003521T patent/DE112009003521T5/de not_active Ceased
- 2009-11-26 JP JP2010541213A patent/JP5231571B2/ja not_active Expired - Fee Related
- 2009-11-26 CN CN200980149045.2A patent/CN102239470B/zh active Active
- 2009-11-26 US US13/129,533 patent/US20110221776A1/en not_active Abandoned
-
2013
- 2013-03-21 JP JP2013058246A patent/JP5430782B2/ja active Active
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2006103357A (ja) * | 2004-09-30 | 2006-04-20 | Mazda Motor Corp | 車両用情報表示装置 |
Cited By (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2012008954A (ja) * | 2010-06-28 | 2012-01-12 | Brother Ind Ltd | 入力装置、複合機および入力制御プログラム |
| US9423935B2 (en) | 2010-07-07 | 2016-08-23 | Panasonic Intellectual Property Management Co., Ltd. | Terminal apparatus and GUI screen generation method |
| JP5189709B2 (ja) * | 2010-07-07 | 2013-04-24 | パナソニック株式会社 | 端末装置およびgui画面生成方法 |
| JP2012032853A (ja) * | 2010-07-28 | 2012-02-16 | Sony Corp | 情報処理装置、情報処理方法およびコンピュータプログラム |
| JP2013539113A (ja) * | 2010-08-24 | 2013-10-17 | クアルコム,インコーポレイテッド | 電子デバイスディスプレイの上方の空気中で物体を移動させることによって電子デバイスアプリケーションと相互作用するための方法および装置 |
| JP2012190261A (ja) * | 2011-03-10 | 2012-10-04 | Panasonic Corp | 近接操作支援装置 |
| JP2012203676A (ja) * | 2011-03-25 | 2012-10-22 | Ntt Docomo Inc | 携帯端末および画面表示変更方法 |
| JP2012208633A (ja) * | 2011-03-29 | 2012-10-25 | Ntt Docomo Inc | 情報端末、表示制御方法及び表示制御プログラム |
| WO2013067776A1 (zh) * | 2011-11-08 | 2013-05-16 | 中兴通讯股份有限公司 | 一种终端显示界面的控制方法及终端 |
| JP2013143144A (ja) * | 2012-01-09 | 2013-07-22 | Samsung Electronics Co Ltd | ディスプレイ装置およびそのアイテム選択方法 |
| JP2013196203A (ja) * | 2012-03-16 | 2013-09-30 | Fujitsu Ltd | 入力制御装置、入力制御プログラム、及び入力制御方法 |
| JP2014170337A (ja) * | 2013-03-04 | 2014-09-18 | Mitsubishi Electric Corp | 情報表示制御装置、情報表示装置および情報表示制御方法 |
| JP2015099436A (ja) * | 2013-11-18 | 2015-05-28 | 三菱電機株式会社 | インターフェース装置 |
| JP2018010660A (ja) * | 2017-08-24 | 2018-01-18 | 三菱電機株式会社 | 端末用プログラム |
| JP2020107031A (ja) * | 2018-12-27 | 2020-07-09 | 株式会社デンソー | 指示ジェスチャ検出装置、およびその検出方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2010064388A1 (ja) | 2012-05-10 |
| JP5430782B2 (ja) | 2014-03-05 |
| CN102239470A (zh) | 2011-11-09 |
| JP5231571B2 (ja) | 2013-07-10 |
| JP2013146095A (ja) | 2013-07-25 |
| DE112009003521T5 (de) | 2013-10-10 |
| US20110221776A1 (en) | 2011-09-15 |
| CN102239470B (zh) | 2018-03-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5430782B2 (ja) | 表示入力装置および車載情報機器 | |
| JP5511682B2 (ja) | 表示入力装置及びナビゲーションシステム | |
| JP5355683B2 (ja) | 表示入力装置および車載情報機器 | |
| JP5312655B2 (ja) | 表示入力装置および車載情報装置 | |
| JP5052677B2 (ja) | 表示入力装置 | |
| JP5620440B2 (ja) | 表示制御装置、表示制御方法及びプログラム | |
| KR20100104804A (ko) | Ddi, ddi 제공방법 및 상기 ddi를 포함하는 데이터 처리 장치 | |
| KR20140137996A (ko) | 휴대 단말기에서 화면을 표시하는 방법 및 장치 | |
| JPWO2017022031A1 (ja) | 情報端末装置 | |
| US20130293505A1 (en) | Multi-dimensional interaction interface for mobile devices | |
| JP5933468B2 (ja) | 情報表示制御装置、情報表示装置および情報表示制御方法 | |
| KR101165388B1 (ko) | 이종의 입력 장치를 이용하여 화면을 제어하는 방법 및 그 단말장치 | |
| JP2014170339A (ja) | 情報表示制御装置、情報表示装置および情報表示制御方法 | |
| JP2014052817A (ja) | 画像表示方法および画像表示装置 | |
| JP5889230B2 (ja) | 情報表示制御装置、情報表示装置および情報表示制御方法 | |
| JP5984718B2 (ja) | 車載情報表示制御装置、車載情報表示装置および車載表示装置の情報表示制御方法 | |
| JP2017182260A (ja) | 表示処理装置、及び表示処理プログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 200980149045.2; Country of ref document: CN |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09830159; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2010541213; Country of ref document: JP |
| | WWE | Wipo information: entry into national phase | Ref document number: 13129533; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | WWE | Wipo information: entry into national phase | Ref document number: 1120090035213; Country of ref document: DE; Ref document number: 112009003521; Country of ref document: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 09830159; Country of ref document: EP; Kind code of ref document: A1 |