WO2014192103A1 - Information Display Device - Google Patents
- Publication number
- WO2014192103A1 (application PCT/JP2013/064927)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- unit
- display
- user
- region
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
- B60K35/235—Head-up displays [HUD] with means for detecting the driver's gaze direction or eye points
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/29—Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/60—Instruments characterised by their location or relative disposition in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/80—Arrangements for controlling instruments
- B60K35/81—Arrangements for controlling instruments for controlling displays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/142—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
- H04N7/144—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display camera and display on the same optical axis, e.g. optically multiplexing the camera and display for eye to eye contact
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/18—Information management
- B60K2360/182—Distributing information between displays
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/18—Information management
- B60K2360/186—Displaying information according to relevancy
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
Definitions
- the present invention relates to an information display device that superimposes and displays information provided to a user on a real-world image or video.
- AR (Augmented Reality) technology is a known background to this invention, with Google Glass (promoted by Google, a registered trademark) as a representative device.
- A device using AR technology generates information about objects in the field of view, including objects the eye cannot detect, and overlays it on the scene that is viewed directly or captured by a camera, thereby supporting human perception.
- Patent Document 1 discloses a technique that helps a user follow the characters in a theater by superimposing information about the people appearing in the field of view onto the video.
- In that technique, the image of the field of view is divided into quadrants, whether information can be displayed is determined by whether a person is detected in each quadrant, and information is displayed in a quadrant where no person is detected.
- Non-Patent Document 1 discloses a “peripheral visual field region information presentation method by gaze measurement for an augmented reality environment”.
- In that method, a gaze recognition device detects the user's gazing point; the area around the detected point is treated as the central visual field and the outer region as the peripheral visual field, and information is presented in the peripheral visual field so as not to obstruct the user's view.
- Dividing the visual field into a central and a peripheral visual field and presenting information in the peripheral field prevents the view from being obstructed. However, depending on the shape of the object the user is looking at, the object may extend into the peripheral visual field, and what the user is looking at may then be blocked by the presented information.
- This invention was made to solve this problem, and aims to provide an information display device that can display information provided to a user without blocking what the user is looking at.
- The information display device comprises: an image input unit that inputs an image corresponding to the user's field of view; a line-of-sight detection unit that detects a gazing point representing the position of the user's line of sight within the field of view; an object recognition unit that extracts, from the image input by the image input unit, the region (first region) in the image of the object containing the gazing point detected by the line-of-sight detection unit; a display position determination unit that, based on information about the first region, determines as the display position a position in the field of view where the user's line of sight does not fall; and an information display unit that displays the information to be presented at the display position determined by the display position determination unit.
- Because information is displayed in a region of the field of view where the user's line of sight does not fall, the information provided to the user can be displayed without blocking what the user is looking at.
- FIG. 1 is a diagram showing an information display device according to Embodiment 1 of the present invention.
- FIG. 1(a) is a block diagram showing the electrical configuration of the information display device,
- and FIG. 1(b) shows an image of the configuration when the device is applied to spectacles.
- the information display device includes an image input unit 1, a line-of-sight detection unit 2, an object recognition unit 3, a timer 4, a display position determination unit 5, and an information display unit 6.
- the image input unit 1 is configured by a camera, for example, and inputs an image corresponding to the field of view obtained by photographing the user's field of view.
- the image input by the image input unit 1 is sent to the object recognition unit 3.
- the line-of-sight detection unit 2 detects a line of sight indicating which part of the field of view the user is viewing.
- the position (gaze point) of the line of sight detected by the line-of-sight detection unit 2 is sent to the object recognition unit 3.
- the object recognition unit 3 recognizes and extracts an object including a gazing point sent from the line-of-sight detection unit 2 from the image sent from the image input unit 1. That is, the object recognition unit 3 extracts a region (first region) in the image of the object including the gazing point.
- the object recognition unit 3 recognizes the shape and area of the object by performing, for example, contour extraction.
- Information representing the object recognized and extracted by the object recognition unit 3 is sent to the display position determination unit 5.
- For the object recognition, an existing technique such as that shown in Non-Patent Document 2 can be used.
- Timer 4 measures a certain time, for example, several seconds.
- the timer 4 is started by an instruction from the display position determining unit 5, and when the set time has elapsed, the timer 4 notifies the display position determining unit 5 and then stops.
- The display position determination unit 5 determines the display position, that is, in which region of the field of view the information is displayed, based on the information from the object recognition unit 3. Information indicating the display position determined by the display position determination unit 5 is sent to the information display unit 6. At this time, the display position determination unit 5 starts the timer 4 and fixes the information indicating the display position until it receives notification from the timer 4 that the set time has elapsed. In other words, the display position determination unit 5 notifies the information display unit 6 not to change the display position until that notification is received.
- the reason why the information indicating the display position is fixed and the information display is maintained for a certain time is as follows. That is, when the information display position is determined based on the gazing point from the line-of-sight detection unit 2, the information display position also changes when the line of sight moves. In this situation, the user may not be able to see the displayed information. Therefore, once the information is displayed, the display position determination unit 5 starts the timer 4 so that the display position of the displayed information is not changed for a certain period of time. Thereby, even when the user's line of sight moves, the display position of the information is fixed for a certain period of time, so that the user can confirm the presented information more reliably.
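The timer-driven hold described above can be sketched as follows. The hold time, class and method names, and the injectable clock are illustrative assumptions; the patent only says "a certain time, for example, several seconds."

```python
import time

class DisplayPositionHolder:
    """Sketch of the timer (4) / display position determination unit (5)
    interaction: once a position is adopted, it stays fixed until the
    hold time elapses, even if the candidate position keeps changing."""

    def __init__(self, hold_seconds=3.0, clock=time.monotonic):
        self.hold_seconds = hold_seconds  # assumed value for "several seconds"
        self.clock = clock                # injectable for testing
        self._position = None
        self._fixed_until = 0.0

    def update(self, candidate_position):
        """Return the position to actually display at; a new candidate is
        adopted only after the previous hold has expired."""
        now = self.clock()
        if self._position is None or now >= self._fixed_until:
            self._position = candidate_position
            self._fixed_until = now + self.hold_seconds  # (re)start the timer
        return self._position
```

Injecting the clock makes the gaze-independent hold easy to verify without real delays.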
- the information display unit 6 displays information at the display position indicated by the information sent from the display position determination unit 5.
- information is projected into the field of view using a lens unit of spectacles as a screen.
- FIG. 2 is a flowchart showing the operation of the information display device when information provided to the user is displayed in the field of view.
- When the process is started, first the user's line of sight is detected (step ST11). That is, the line-of-sight detection unit 2 detects the point (gazing point) that the user is looking at within the field of view.
- FIG. 3 shows an example in which a gazing point is placed on the lower left automobile in the field of view.
- the gazing point detected by the line-of-sight detection unit 2 is sent to the object recognition unit 3 as a coordinate value (x, y) in the image, for example.
- The object recognition unit 3 recognizes, in the image sent from the image input unit 1, the object containing the coordinate value (x, y) of the gazing point sent from the line-of-sight detection unit 2 as the object the user is looking at,
- and extracts the region where that object exists, as indicated by the broken line in the figure. That is, the object recognition unit 3 extracts the region in the image of the object containing the gazing point.
- As information representing the region of the object, for example, a point sequence (x1, y1), (x2, y2), ..., (xn, yn) indicating the contour of the region is obtained.
- Information representing the area of the object is sent to the display position determination unit 5.
- Next, the display position of the information is determined (step ST13). That is, the display position determination unit 5 determines in which region (position) of the field of view the information is displayed. At this time, the display position is determined so that the information is displayed at a position in the field of view corresponding to a region (second region) different from the region (first region) the user is viewing.
- Various algorithms can be used to determine which part of the area the user is not looking at should be used for displaying information. For example, the following simple method can be used.
- Note that the region different from the region the user is viewing may include part of the viewed region, as long as the displayed information does not obstruct the object the user is looking at.
- the area of each region can be expressed by the following equation.
- the display position determination unit 5 obtains the area using these equations, selects the largest region as the display region, and sends information on the selected region to the information display unit 6 as the display position.
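The area equations referred to above are not reproduced in this text. Under the assumption that the object is approximated by a bounding box (x1, y1)-(x2, y2) inside a width x height field of view, one plausible sketch of the comparison of the regions above, below, left of, and right of the object is:

```python
def candidate_regions(x1, y1, x2, y2, width, height):
    """Areas of four rectangular strips around an object's bounding box
    within a width x height field of view. The strips overlap at the
    corners; this simplification, and the formulas themselves, are
    assumptions, since the patent's equations are not in this excerpt."""
    return {
        "above": width * y1,              # full-width strip above the box
        "below": width * (height - y2),   # full-width strip below the box
        "left":  x1 * height,             # full-height strip left of the box
        "right": (width - x2) * height,   # full-height strip right of the box
    }

def largest_region(x1, y1, x2, y2, width, height):
    """Select the largest candidate region as the display region."""
    areas = candidate_regions(x1, y1, x2, y2, width, height)
    return max(areas, key=areas.get)
```

For an object in the lower left of a 640x480 view, the strip to its right wins, matching the figure's example of displaying away from the gazed-at car.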
- Next, information is displayed (step ST14). That is, the information display unit 6 displays information at the display position sent from the display position determination unit 5, as shown in the figure.
- If display is difficult in that region because of the size of the information, the information display unit 6 can be configured to shift the display position before displaying.
- As described above, according to the information display device of Embodiment 1, information is displayed at a position in the field of view where the user's line of sight does not fall, based on the information about the object region (first region) in the image extracted by the object recognition unit 3, so information can be displayed well without blocking the object the user is looking at.
- The information display device may also be applied to the driver's seat of a car, for example.
- In this case, a line-of-sight detection unit 2 that captures the driver's line of sight is installed in the instrument panel,
- an image input unit (camera) 1 is installed facing outward so as to capture the driver's field of view,
- and the information display unit 6 is composed of a head-up display that displays information on the windshield. With this configuration, the driver can receive information according to the surrounding situation.
- Embodiment 2. In the information display device according to the second embodiment of the present invention, the determination of the display position of information presented to the user is simplified compared with the first embodiment.
- the configuration of the information display device is the same as the configuration of the information display device according to the first embodiment shown in FIG. 1 except for the operation of the display position determination unit 5.
- The display position determination unit 5 of this information display device differs from that of Embodiment 1 in the algorithm for deciding which part of the field of view the user is not looking at is used for displaying information. In Embodiment 1, the display position determination unit 5 selects the largest of the regions above, below, left of, and right of the object recognized by the object recognition unit 3 as the information display position. In Embodiment 2, it instead determines the display position so that the information presented to the user is displayed at a far position on the opposite side of the center of the field of view from the object recognized by the object recognition unit 3. That is, the display position determination unit 5 determines, with respect to the center of the image corresponding to the field of view, the position corresponding to the region opposite the region the user is viewing as the display position.
- First, the user's line of sight is detected (step ST11).
- Next, the region of the object that the user is looking at is extracted (step ST12).
- Next, the information display position is determined (step ST13). That is, as shown in FIG. 7, the display position determination unit 5 determines the display position so that the information presented to the user is displayed on the opposite side of the center of the field of view from the object extracted by the object recognition unit 3 in step ST12, that is, at a position far from the region the user is viewing.
- In other words, a region based at the farthest position P1, reached by passing from the object through the center of the field of view, is determined as the display position. Thereafter, information is displayed (step ST14).
- As described above, information is displayed at the place farthest from the object the user is looking at,
- so information can be displayed well without obstructing the object the user is viewing.
- it is not necessary to obtain the areas of the upper, lower, left, and right regions of the object in order to determine the information display position, so that the processing time until the display position is determined can be shortened.
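The opposite-side placement above can be sketched as follows: extend the line from the object's position through the field-of-view center until it meets the boundary, giving the farthest position P1. The function names and the choice of the boundary intersection are assumptions; the patent only states that the region is based at the farthest position through the center.

```python
def farthest_point_opposite(cx, cy, width, height):
    """Point on the field-of-view boundary reached by extending the line
    from (cx, cy) through the field center. Used here for an object
    centroid (P1); the same computation applies to a gazing point (P2)."""
    fx, fy = width / 2.0, height / 2.0
    dx, dy = fx - cx, fy - cy
    if dx == 0 and dy == 0:
        return fx, fy  # gaze at the exact center: no unique opposite side
    # scale the direction vector so the point lands exactly on the boundary
    scales = []
    if dx > 0:
        scales.append((width - cx) / dx)
    if dx < 0:
        scales.append(-cx / dx)
    if dy > 0:
        scales.append((height - cy) / dy)
    if dy < 0:
        scales.append(-cy / dy)
    s = min(scales)
    return cx + s * dx, cy + s * dy
```

An object in the upper left of a 640x480 view maps to the lower-right corner, so no per-region area computation is needed, matching the shortened processing time noted above.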
- In Embodiment 2, information is displayed at the position farthest from the "object" the user is looking at.
- As a modification, this can be further simplified so that information is displayed at the position farthest from the user's "gazing point".
- FIG. 8 is a block diagram showing a configuration of an information display device according to a modification of the second embodiment.
- This information display device is configured by removing the object recognition unit 3 from the information display device according to Embodiment 1 and sending the image obtained by the image input unit 1 directly to the display position determination unit 5.
- The display position determination unit 5 determines the display position, that is, in which region information is displayed, based on the image from the image input unit 1 and the information from the line-of-sight detection unit 2. Specifically, the display position determination unit 5 determines the display position so that the information presented to the user is displayed at a distant position on the opposite side of the center of the field of view from the gazing point detected by the line-of-sight detection unit 2. In other words, the display position determination unit 5 determines, with respect to the center of the image corresponding to the field of view, the position corresponding to the region opposite the user's gazing point as the display position.
- First, the user's line of sight is detected (step ST11).
- Next, the information display position is determined (step ST13). That is, as shown in FIG. 9, the display position determination unit 5 determines the display position so that the information presented to the user is displayed on the opposite side of the center of the field of view from the gazing point detected by the line-of-sight detection unit 2 in step ST11, that is, at a position far from the region the user is looking at.
- In other words, a region based at the farthest position P2, reached by passing from the gazing point through the center of the field of view, is determined as the display position. Thereafter, information is displayed (step ST14).
- As described above, according to the information display device of the modification of Embodiment 2, as shown in FIG. 9, information is displayed at the position farthest from the user's gazing point, so information can be displayed well without obstructing the object the user is looking at.
- In addition, there is no need to recognize the object, further reducing the processing time until the information display position is determined.
- When the display position is determined, if display there is difficult because of the size of the information to be displayed, the device can be configured to shift the display position and then display the information.
- Embodiment 3. In the information display device according to Embodiment 3 of the present invention, the display position (region) of information presented to the user in the device of Embodiment 1 is further narrowed down so that the field of view is obstructed as little as possible.
- the configuration of the information display device is the same as the configuration of the information display device according to the first embodiment shown in FIG. 1 except for the operations of the object recognition unit 3 and the display position determination unit 5.
- The object recognition unit 3 recognizes and extracts, from the image sent from the image input unit 1, the object containing the gazing point sent from the line-of-sight detection unit 2, and further recognizes and extracts any objects existing in the display region determined by the display position determination unit 5. Information indicating the object regions recognized by the object recognition unit 3 is sent to the display position determination unit 5.
- Within the region with the largest area (display region), determined in the same manner as in Embodiment 1, the display position determination unit 5 further determines as the display position an empty region other than the object regions (third region) extracted by the object recognition unit 3, as shown by the region surrounded by a broken line in FIG. 10(a). Information indicating the display position determined by the display position determination unit 5 is sent to the information display unit 6.
- When the process is started, first the user's line of sight is detected (step ST11). Next, the region of the object that the user is looking at is extracted (step ST12). Next, the information display position is determined (step ST13). That is, the display position determination unit 5 determines in which region (position) of the field of view information is displayed. At this time, the display position is determined as follows, for example, so that information is displayed in a region different from the region the user is viewing.
- the display position determination unit 5 determines the largest region among the upper, lower, left, and right regions of the object recognized by the object recognition unit 3 as an information display region.
- the object recognition unit 3 recognizes and extracts an object region existing in the display region determined by the display position determination unit 5.
- For this object recognition, an existing technique such as that shown in Non-Patent Document 2 can be used.
- the display position determination unit 5 determines an empty area other than the object recognized by the object recognition unit 3 as the display position.
- That is, the display position determination unit 5 determines as the display position a position in the field of view corresponding to a region different from the region where objects exist (third region) within the display region (second region) extracted by the object recognition unit 3.
- Note that the determined region may partially include a region where an object exists (third region).
- Information indicating the display position determined by the display position determination unit 5 is sent to the information display unit 6. Thereafter, information is displayed (step ST14).
- As described above, a region with the largest area around the object is specified; within that region, a region containing no objects is further specified; and then, as shown in FIG. 10(b), the specified region is determined as the information display region and information is displayed there. Therefore, information can be displayed well without obstructing the object the user is looking at.
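One possible sketch of this narrowing step: cut the chosen display region by each recognised object's bounding box and keep the largest remaining empty rectangle. The rectangle-splitting approach is an assumption; the patent does not specify the search method.

```python
def empty_subregions(region, obstacles):
    """Rectangles (x1, y1, x2, y2) inside `region` not covered by any
    obstacle rectangle: each obstacle cuts every affected free rectangle
    into up to four surrounding pieces."""
    free = [region]
    for ox1, oy1, ox2, oy2 in obstacles:
        next_free = []
        for x1, y1, x2, y2 in free:
            # no overlap: keep the free rectangle whole
            if ox1 >= x2 or ox2 <= x1 or oy1 >= y2 or oy2 <= y1:
                next_free.append((x1, y1, x2, y2))
                continue
            if oy1 > y1:
                next_free.append((x1, y1, x2, oy1))   # piece above the obstacle
            if oy2 < y2:
                next_free.append((x1, oy2, x2, y2))   # piece below the obstacle
            if ox1 > x1:
                next_free.append((x1, max(y1, oy1), ox1, min(y2, oy2)))  # left
            if ox2 < x2:
                next_free.append((ox2, max(y1, oy1), x2, min(y2, oy2)))  # right
        free = next_free
    return free

def best_empty_area(region, obstacles):
    """Largest empty rectangle in the display region (the 'empty area
    other than the object region' of Embodiment 3)."""
    def area(r):
        return (r[2] - r[0]) * (r[3] - r[1])
    return max(empty_subregions(region, obstacles), key=area, default=None)
```

This keeps the information clear of all recognised objects in the display region, at the cost of one pass per obstacle.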
- In addition, since information is displayed so as not to obstruct objects other than the one the user is gazing at, the user can view the information while grasping the state of the other objects in the field of view. Note that if display in the region is difficult because of the size of the information, the display position can be shifted before displaying.
- Embodiment 4.
- the information display apparatus according to Embodiment 4 of the present invention controls the display of information in accordance with the movement of the user's point of gaze.
- the configuration of the information display device is the same as the configuration of the information display device according to the first embodiment shown in FIG. 1 except for the operation of the display position determination unit 5.
- In Embodiments 1 to 3, the display position determination unit 5 controls the position of displayed information so that it does not change for a certain time, based on the information from the timer 4.
- In Embodiment 4, the display position determination unit 5 instead makes the display position of information variable according to the movement of the user's gazing point.
- the operation of the information display device is the same as that of the information display device according to the first embodiment shown in the flowchart of FIG. 2 except for the operation of the display position determination unit 5.
- the description will be focused on the parts different from the first embodiment with reference to the flowchart shown in FIG. 2, and the description of the same parts will be simplified.
- First, the user's line of sight is detected (step ST11).
- Next, the region of the object that the user is looking at is extracted (step ST12).
- Next, the information display position is determined (step ST13). That is, the display position determination unit 5 determines in which region (position) of the field of view information is displayed. At this time, the display position is determined so that information is displayed in a region different from the region the user is viewing. Information indicating the display position determined by the display position determination unit 5 is sent to the information display unit 6. Thereafter, information is displayed (step ST14).
- After the display, if the user's gazing point moves into the information display area within the time measured by the timer 4, the display position determination unit 5 judges that the user is looking at the information, and does not change the display position until the gazing point sent from the line-of-sight detection unit 2 leaves the information display area. At this time, the timer 4 is reset.
- On the other hand, when the user's gazing point moves away from the information display area, the display position determination unit 5 starts the timer 4 and allows the display position to be changed after a predetermined time, for example several seconds, has elapsed.
- If the gazing point returns to the information display area before the predetermined time has elapsed,
- the timer 4 is reset and the information display position is maintained.
- although the information display position can be changed after the predetermined time has elapsed, the device can also be configured so that the information display position is not changed when the amount of movement of the gazing point is small, for example smaller than a predetermined threshold.
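The hold-and-release behaviour described above can be sketched as a small state holder. This is a hypothetical illustration: the class name, method names, and the default hold time are assumptions, not from the patent.

```python
# Sketch of Embodiment 4's logic: while the gazing point is inside the
# information display area the position is frozen; once the gaze leaves,
# a timer runs and the position may move only after `hold_s` seconds.

class DisplayPositionHold:
    def __init__(self, hold_s=3.0):
        self.hold_s = hold_s         # predetermined time ("several seconds")
        self.left_at = None          # time the gaze last left the display area

    def may_move(self, gaze_in_display_area, now):
        """Return True once the display position is allowed to change."""
        if gaze_in_display_area:
            self.left_at = None      # reset the timer (timer 4)
            return False             # user is reading: keep the position
        if self.left_at is None:
            self.left_at = now       # start timing when the gaze leaves
        return (now - self.left_at) >= self.hold_s
```

For example, with a 2-second hold, a gaze that leaves the display area at t = 0.5 s does not release the position at t = 1 s but does at t = 3 s.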
- in this information display device, since the display position of the information is changed in accordance with the movement of the user's gazing point, comfortable information display becomes possible.
- FIG. 11 is a block diagram showing the configuration of the information display apparatus according to the fifth embodiment.
- This information display device is configured by adding an acceleration sensor 11, a position information sensor 12, an operation state detection unit 13, and a display control unit 14 to the information display device according to the first embodiment shown in FIG.
- the acceleration sensor 11 detects the acceleration of the information display device.
- as the acceleration sensor 11, for example, a sensor equivalent to those used in mobile phones and similar devices can be used.
- Information indicating the acceleration detected by the acceleration sensor 11 is sent to the operation state detection unit 13.
- as the position information sensor 12, for example, a GPS (Global Positioning System) receiver equivalent to that of a mobile phone or the like can be used.
- the position information sensor 12 detects the position of the information display device on the earth by receiving a signal from a satellite. Information indicating the position detected by the position information sensor 12 is sent to the operation state detection unit 13.
- the operation state detection unit 13 determines the physical operation state of the information display device based on information from the acceleration sensor 11 and the position information sensor 12, and sends information related to the operation state to the display control unit 14.
- the display control unit 14 generates information instructing whether or not to display information, according to the operation state information from the operation state detection unit 13, and sends it to the information display unit 6. The information display unit 6 accordingly displays the information provided to the user or stops the display.
- the operation of the information display device configured as described above will be described.
- the description will focus on the control of information display timing. For example, when the user is walking, displaying information in a way that obstructs the user's field of view may lead to a dangerous situation.
- the operation state detection unit 13 determines the operation states of the information display device and the user, and controls information display.
- the operation state detection unit 13 determines whether the user is walking or running based on information from the acceleration sensor 11. If a state such as walking or running is detected, information indicating that is sent to the display control unit 14.
- when the display control unit 14 receives information on the operation state from the operation state detection unit 13, it determines whether the state is appropriate for information display and, if inappropriate, instructs the information display unit 6 to stop the information display. In response to this instruction, the information display unit 6 stops displaying information.
- the operation state detection unit 13 detects a stopped state based on information from the acceleration sensor 11 and sends the information to the display control unit 14.
- the display control unit 14 determines that the state is suitable for information display, and instructs the information display unit 6 to display information.
- the operation of the display control unit 14 has been described above using condition settings such as "walking or running is unsuitable for information display" and "the stopped state is appropriate for information display".
- the condition setting can also be made more detailed. For example, a condition such as "the state of being stopped and seated is appropriate for information display" can be set.
- these conditions can be set by the user and stored in the information display device.
- it can be configured to stop displaying information when it is detected that there are many vehicles in the vicinity.
- whether vehicles are present in the vicinity can be judged by using the position information from the position information sensor 12 together with map data. For example, when the user's position is near a road on the map, it can be determined that vehicles are likely to be nearby.
- the map data may be stored in the information display device or acquired from an external device such as a server.
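The gating behaviour of Embodiment 5 can be sketched as a simple classifier over accelerometer samples. This is a hypothetical illustration: the variance-based motion test and its threshold are assumptions for the sketch; the patent only states that walking/running and stopped states are distinguished from the acceleration sensor.

```python
# Sketch of Embodiment 5: classify the user's motion from acceleration
# magnitudes and allow information display only in the stopped state.

def motion_state(accel_samples, threshold=0.5):
    """Classify as 'stopped' or 'moving' from the spread of the samples."""
    mean = sum(accel_samples) / len(accel_samples)
    var = sum((a - mean) ** 2 for a in accel_samples) / len(accel_samples)
    # Walking or running produces large swings in measured acceleration;
    # a near-constant signal (close to gravity alone) indicates stopping.
    return "stopped" if var < threshold else "moving"

def display_allowed(accel_samples):
    # "walking or running is unsuitable for information display";
    # "the stopped state is appropriate for information display".
    return motion_state(accel_samples) == "stopped"
```

A deployed device would also fold in the position information sensor 12 (e.g. proximity to roads on map data) before permitting display.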
- in this information display device, since whether or not information may be displayed is controlled according to the user's operation state, dangerous situations caused by the user's attention being drawn to the displayed information can be avoided.
- FIG. 12 is a block diagram showing a configuration of the information display apparatus according to the sixth embodiment.
- This information display device is configured by removing the acceleration sensor 11, the position information sensor 12, the operation state detection unit 13, and the display control unit 14 from the information display device according to the fifth embodiment shown in FIG. 11, and by adding a voice input unit 21, a voice recognition unit 22, and a display control unit 23.
- the voice input unit 21, which inputs the user's voice and surrounding sounds, is composed of, for example, a microphone. The voice input through the voice input unit 21 is sent to the voice recognition unit 22 as voice information.
- the voice recognition unit 22 recognizes voice from the voice information sent from the voice input unit 21.
- the voice recognition result obtained by the voice recognition unit 22 is sent to the display control unit 23.
- the display control unit 23 instructs the information display unit 6 whether or not to display information, according to the voice recognition result sent from the voice recognition unit 22. The information display unit 6 accordingly displays the information provided to the user or stops the display.
- the voice input unit 21 inputs the voice around the user or the voice of the user and sends it to the voice recognition unit 22.
- the voice recognition unit 22 determines what the voice input from the voice input unit 21 represents and sends the determination result to the display control unit 23 as a voice recognition result. For example, when an emergency vehicle passes near the user, the voice recognition unit 22 recognizes the emergency vehicle from its siren sound input through the voice input unit 21 and sends the voice recognition result to the display control unit 23.
- upon receiving this result, the display control unit 23 determines that the situation is inappropriate for information display and instructs the information display unit 6 to stop the information display.
- the information display can be directly controlled by recognizing the voice uttered by the user in the voice recognition unit 22. For example, when the user utters a voice such as “information display stop”, the voice recognition unit 22 detects the instruction and notifies the display control unit 23 that “information display stop” has been recognized. Thereby, the display control unit 23 determines that the information display is inappropriate, and instructs the information display unit 6 to stop the information display. It can also be instructed by voice that information can be displayed.
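The mapping from recognition results to display on/off decisions can be sketched as a small lookup. This is a hypothetical illustration: the phrase table and function name are assumptions; the patent itself names only "information display stop" utterances and emergency-vehicle sounds as stop triggers.

```python
# Sketch of Embodiment 6: turn the display on or off from a speech
# recognition result, keeping the current state for unrelated speech.

def display_decision(recognition_result, currently_on=True):
    stop_triggers = {"information display stop", "emergency vehicle"}
    start_triggers = {"information display start"}
    if recognition_result in stop_triggers:
        return False                 # unsuitable situation: stop the display
    if recognition_result in start_triggers:
        return True                  # user explicitly re-enabled the display
    return currently_on              # otherwise keep the current state
```

The same function covers both paths in the text: environmental sounds recognized as hazards and direct spoken commands from the user.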
- in this information display device, since whether or not information may be displayed is controlled according to the surrounding sounds, comfortable information display suited to the surrounding situation becomes possible.
- FIG. 13 is a block diagram illustrating a configuration of the information display device according to the seventh embodiment.
- This information display device is configured by providing a command determination unit 31 in place of the display control unit 23 of the information display device according to Embodiment 6 shown in FIG. 12, and by further adding a communication device 32 and an image search device 33.
- when the voice recognition result sent from the voice recognition unit 22 indicates a search command, the command determination unit 31 acquires the image to be searched, based on information from the object recognition unit 3 and the line-of-sight detection unit 2, and sends it to the communication device 32. The command determination unit 31 also sends the search result information received from the communication device 32 to the information display unit 6.
- the communication device 32 sends the search target image sent from the command determination unit 31 to the image search device 33 and requests image search. Further, the communication device 32 sends the search result information sent from the image search device 33 to the command determination unit 31.
- the image search device 33 performs an image search based on the search target image sent from the communication device 32, and sends the search result information to the communication device 32.
- the user utters a phrase such as "search for what I am looking at" while directing his or her gaze at the object to be searched.
- the voice input unit 21 inputs a voice uttered by the user and sends it to the voice recognition unit 22.
- the voice recognition unit 22 informs the command determination unit 31 that the voice input from the voice input unit 21 is a line-of-sight search command.
- the command determination unit 31 holds the series of processing steps corresponding to each command from the voice recognition unit 22, and upon being informed of a line-of-sight search command, starts the series of processes related to the line-of-sight search. That is, the command determination unit 31 obtains gazing point information from the line-of-sight detection unit 2 and causes the object recognition unit 3 to extract the region of the object at that position.
- the operation of the object recognition unit 3 is the same as that described in the first embodiment. As illustrated in FIG. 14, when the object recognition unit 3 has extracted the region of the object that the user is viewing, the command determination unit 31 acquires an image of that region, sends it to the image search device 33 via the communication device 32, and requests an image search.
- a Google image search site can be used as the image search device 33.
- the image search device 33 sends information related to the searched image to the communication device 32 as search result information.
- the communication device 32 sends the search result information from the image search device 33 to the command determination unit 31.
- the command determination unit 31 acquires search result information from the communication device 32 and sends the information to the information display unit 6.
- the information display unit 6 receives the information indicating the display position from the display position determination unit 5 and the search result information from the command determination unit 31 and displays the information in an appropriate area.
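The end-to-end flow of the line-of-sight search can be sketched with stubbed components. This is a hypothetical illustration: the function names and the stub callables stand in for the object recognition unit 3, communication device 32, and image search device 33; none of these names come from the patent.

```python
# Sketch of Embodiment 7: on a "search what I see" command, crop the gazed
# object's region from the camera frame and hand it to a search backend.

def gaze_search(frame, gaze, extract_region, search_backend):
    """Crop the region the user is viewing and run an image search on it."""
    box = extract_region(frame, gaze)            # object recognition unit 3
    x0, y0, x1, y1 = box
    crop = [row[x0:x1] for row in frame[y0:y1]]  # search-target image
    return search_backend(crop)                  # communication device 32 -> 33

# Stubbed usage: a 4x4 "frame" of pixel values, a fixed 2x2 region around
# the gazing point, and a backend that just reports the crop size.
frame = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11], [12, 13, 14, 15]]
result = gaze_search(
    frame, (1, 1),
    extract_region=lambda f, g: (g[0], g[1], g[0] + 2, g[1] + 2),
    search_backend=lambda img: {"rows": len(img), "cols": len(img[0])},
)
assert result == {"rows": 2, "cols": 2}
```

In the device described here, `search_backend` would be a network request to an external image search service rather than a local function.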
- in this information display device, information related to the image the user is viewing is acquired from the image search device 33 in response to the voice uttered by the user, and the acquired information is displayed in an appropriate area, so that the information the user desires can be provided.
- the present invention can be used, for example, in a car navigation system that displays various kinds of information within the actual field of view seen by the user through a window.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Optics & Photonics (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
Embodiment 1.
FIG. 1 illustrates an information display device according to Embodiment 1 of the present invention: FIG. 1(a) is a block diagram showing the electrical configuration of the information display device, and FIG. 1(b) shows an image of the structure when the information display device is applied to eyeglasses. The information display device includes an image input unit 1, a line-of-sight detection unit 2, an object recognition unit 3, a timer 4, a display position determination unit 5, and an information display unit 6.
Upper region: Xw · ymin
Lower region: Xw · (Yh − ymax)
Left region: xmin · Yh
Right region: (Xw − xmax) · Yh
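The four region areas listed above follow directly from the gazed object's bounding box (xmin, ymin, xmax, ymax) and the frame size (Xw, Yh). A minimal sketch; the function names and the "pick the largest band" policy are illustrative assumptions, not from the patent:

```python
# Compute the areas of the four bands surrounding the gazed object's
# bounding box, matching the formulas above, and pick the largest one
# as the candidate information display region.

def candidate_areas(xmin, ymin, xmax, ymax, Xw, Yh):
    return {
        "upper": Xw * ymin,          # Xw * ymin
        "lower": Xw * (Yh - ymax),   # Xw * (Yh - ymax)
        "left":  xmin * Yh,          # xmin * Yh
        "right": (Xw - xmax) * Yh,   # (Xw - xmax) * Yh
    }

def largest_region(xmin, ymin, xmax, ymax, Xw, Yh):
    areas = candidate_areas(xmin, ymin, xmax, ymax, Xw, Yh)
    return max(areas, key=areas.get)
```

For a 600 × 480 frame with an object box of (100, 50, 200, 150), the lower band (600 × 330 pixels) is the largest free region.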
The information display device according to Embodiment 2 of the present invention simplifies, compared with Embodiment 1, the determination of the display position of the information presented to the user. The configuration of this information display device is the same as that of the information display device according to Embodiment 1 shown in FIG. 1, except for the operation of the display position determination unit 5.
The information display device according to Embodiment 3 of the present invention further narrows down, relative to Embodiment 1, the display position (region) of the information presented to the user so that the field of view is obstructed as little as possible. The configuration of this information display device is the same as that of Embodiment 1 shown in FIG. 1, except for the operations of the object recognition unit 3 and the display position determination unit 5.
The information display device according to Embodiment 4 of the present invention controls the display of information in accordance with the movement of the user's gazing point. The configuration of this information display device is the same as that of Embodiment 1 shown in FIG. 1, except for the operation of the display position determination unit 5.
The information display device according to Embodiment 5 of the present invention controls the display of information according to the user's operation state. FIG. 11 is a block diagram showing the configuration of the information display device according to Embodiment 5. This information display device is configured by adding an acceleration sensor 11, a position information sensor 12, an operation state detection unit 13, and a display control unit 14 to the information display device according to Embodiment 1 shown in FIG. 1.
The information display device according to Embodiment 6 of the present invention controls the display of information according to the state of the surrounding sounds. FIG. 12 is a block diagram showing the configuration of the information display device according to Embodiment 6. This information display device is configured by removing the acceleration sensor 11, the position information sensor 12, the operation state detection unit 13, and the display control unit 14 from the information display device according to Embodiment 5 shown in FIG. 11, and by adding a voice input unit 21, a voice recognition unit 22, and a display control unit 23.
The information display device according to Embodiment 7 of the present invention adds an image search function to the information display device according to Embodiment 6. FIG. 13 is a block diagram showing the configuration of the information display device according to Embodiment 7. This information display device is configured by providing a command determination unit 31 in place of the display control unit 23 of the information display device according to Embodiment 6 shown in FIG. 12, and by further adding a communication device 32 and an image search device 33.
Claims (9)
- An information display device comprising: an image input unit that inputs an image corresponding to a user's field of view; a line-of-sight detection unit that detects a gazing point representing the position of the user's line of sight within the field of view; an object recognition unit that extracts, from the image input by the image input unit, a region within the image of an object containing the gazing point detected by the line-of-sight detection unit, as a first region; a display position determination unit that determines, based on information about the first region extracted by the object recognition unit, a position within the field of view where the user's line of sight is not directed, as a display position; and an information display unit that displays information to be presented to the user at the display position determined by the display position determination unit.
- The information display device according to claim 1, wherein the display position determination unit determines, as the display position, a position within the field of view corresponding to a second region in the image different from the first region.
- The information display device according to claim 2, wherein the object recognition unit extracts, as a third region, a region of an object existing within the second region, and the display position determination unit determines, as the display position, a position within the field of view corresponding to a region that is within the second region and different from the third region.
- The information display device according to any one of claims 1 to 3, further comprising a timer that measures time, wherein the display position determination unit maintains the display position until the time set in the timer has been measured.
- The information display device according to any one of claims 1 to 4, wherein the display position determination unit controls the display position of the information in accordance with the movement of the user's gazing point detected by the line-of-sight detection unit.
- The information display device according to any one of claims 1 to 5, further comprising: an operation state detection unit that detects the user's operation state based on information from an acceleration sensor that detects acceleration and a position information sensor that detects position; and a display control unit that instructs whether or not information display is permitted, based on the user's operation state detected by the operation state detection unit, wherein the information display unit performs information display based on the instruction from the display control unit.
- The information display device according to any one of claims 1 to 5, further comprising: a voice input unit that inputs surrounding sounds; a voice recognition unit that recognizes the sounds input by the voice input unit; and a display control unit that instructs whether or not information display is permitted, according to the voice recognition result from the voice recognition unit, wherein the information display unit performs information display based on the instruction from the display control unit.
- The information display device according to any one of claims 1 to 5, further comprising: a voice input unit that inputs surrounding sounds; a voice recognition unit that recognizes the sounds input by the voice input unit; a command determination unit that, when the voice recognition result from the voice recognition unit indicates a search command, acquires an image to be searched based on information from the object recognition unit and the line-of-sight detection unit and requests a search for the image; and an image search device that searches for information related to the image received from the command determination unit and sends the search result to the command determination unit, wherein the command determination unit sends the search result information received from the image search device to the information display unit, and the information display unit performs information display based on the search result information received from the command determination unit.
- The information display device according to claim 1, wherein the display position determination unit determines, as the display position, a position within the field of view corresponding to a region on the side opposite to the first region with respect to the center of the image.
Priority Applications (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/783,828 US20160054795A1 (en) | 2013-05-29 | 2013-05-29 | Information display device |
| CN201380076824.0A CN105229584A (zh) | 2013-05-29 | 2013-05-29 | 信息显示装置 |
| KR1020157036710A KR20160016907A (ko) | 2013-05-29 | 2013-05-29 | 정보 표시 장치 |
| JP2015519547A JPWO2014192103A1 (ja) | 2013-05-29 | 2013-05-29 | 情報表示装置 |
| PCT/JP2013/064927 WO2014192103A1 (ja) | 2013-05-29 | 2013-05-29 | 情報表示装置 |
| EP13885922.8A EP3007048A4 (en) | 2013-05-29 | 2013-05-29 | Information display device |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2013/064927 WO2014192103A1 (ja) | 2013-05-29 | 2013-05-29 | 情報表示装置 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014192103A1 true WO2014192103A1 (ja) | 2014-12-04 |
Family
ID=51988176
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2013/064927 Ceased WO2014192103A1 (ja) | 2013-05-29 | 2013-05-29 | 情報表示装置 |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US20160054795A1 (ja) |
| EP (1) | EP3007048A4 (ja) |
| JP (1) | JPWO2014192103A1 (ja) |
| KR (1) | KR20160016907A (ja) |
| CN (1) | CN105229584A (ja) |
| WO (1) | WO2014192103A1 (ja) |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2016161734A (ja) * | 2015-03-02 | 2016-09-05 | セイコーエプソン株式会社 | 表示装置、表示装置の制御方法、および、プログラム |
| JP2018501562A (ja) * | 2015-01-05 | 2018-01-18 | インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation | 電子デバイス、情報要求を増補する方法、および情報要求を増補するためのコンピュータ・プログラム製品 |
| JP2018049528A (ja) * | 2016-09-23 | 2018-03-29 | 富士ゼロックス株式会社 | 情報処理装置、画像形成装置およびプログラム |
| JP2019515361A (ja) * | 2016-04-27 | 2019-06-06 | ロヴィ ガイズ, インコーポレイテッド | 仮想現実環境を表示するヘッドアップディスプレイ上に付加的コンテンツを表示するための方法およびシステム |
| WO2019187487A1 (ja) * | 2018-03-28 | 2019-10-03 | ソニー株式会社 | 情報処理装置、情報処理方法、およびプログラム |
| JPWO2021149594A1 (ja) * | 2020-01-21 | 2021-07-29 | ||
| US11353949B2 (en) | 2016-04-27 | 2022-06-07 | Rovi Guides, Inc. | Methods and systems for displaying additional content on a heads up display displaying a virtual reality environment |
| WO2023100637A1 (ja) * | 2021-11-30 | 2023-06-08 | 株式会社ドワンゴ | アラート表示システム、アラート表示方法、およびアラート表示プログラム |
Families Citing this family (22)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20150008733A (ko) * | 2013-07-15 | 2015-01-23 | 엘지전자 주식회사 | 안경형 휴대기기 및 그의 정보 투사면 탐색방법 |
| KR102154912B1 (ko) * | 2013-10-11 | 2020-09-11 | 인터디지탈 패튼 홀딩스, 인크 | 시선 구동 증강 현실 |
| JP6264870B2 (ja) * | 2013-12-13 | 2018-01-24 | 富士通株式会社 | 情報提供装置、プログラム、及びシステム |
| US10444930B2 (en) * | 2014-08-05 | 2019-10-15 | Lg Electronics Inc. | Head-mounted display device and control method therefor |
| US9955059B2 (en) * | 2014-10-29 | 2018-04-24 | Kabushiki Kaisha Toshiba | Electronic device, method, and computer program product |
| WO2016121883A1 (ja) | 2015-01-29 | 2016-08-04 | 京セラ株式会社 | 電子機器 |
| US10618528B2 (en) * | 2015-10-30 | 2020-04-14 | Mitsubishi Electric Corporation | Driving assistance apparatus |
| US9851792B2 (en) | 2016-04-27 | 2017-12-26 | Rovi Guides, Inc. | Methods and systems for displaying additional content on a heads up display displaying a virtual reality environment |
| CN106274689B (zh) * | 2016-08-18 | 2019-01-11 | 青岛海信移动通信技术股份有限公司 | 倒车影像中的信息显示方法、装置及终端 |
| CN107765842A (zh) * | 2016-08-23 | 2018-03-06 | 深圳市掌网科技股份有限公司 | 一种增强现实方法及系统 |
| US20190271843A1 (en) * | 2016-11-02 | 2019-09-05 | Sharp Kabushiki Kaisha | Terminal apparatus, operating method, and program |
| EP4235382A3 (en) * | 2016-12-21 | 2023-10-11 | InterDigital VC Holdings, Inc. | System and method for placement of augmented reality information for users based on their activity |
| US10354153B2 (en) * | 2017-03-02 | 2019-07-16 | Ricoh Company, Ltd. | Display controller, display control method, and recording medium storing program |
| EP3537712B1 (en) * | 2018-03-09 | 2025-11-26 | Bayerische Motoren Werke Aktiengesellschaft | Method, system and computer program product for controlling a video call while driving a vehicle |
| CN109670456A (zh) * | 2018-12-21 | 2019-04-23 | 北京七鑫易维信息技术有限公司 | 一种内容推送方法、装置、终端和存储介质 |
| CN110347261A (zh) * | 2019-07-11 | 2019-10-18 | Oppo广东移动通信有限公司 | 信息显示方法、装置、存储介质及增强现实设备 |
| CN115152203A (zh) * | 2020-02-21 | 2022-10-04 | 麦克赛尔株式会社 | 信息显示装置 |
| JP7380365B2 (ja) * | 2020-03-19 | 2023-11-15 | マツダ株式会社 | 状態推定装置 |
| CN112506345B (zh) * | 2020-12-10 | 2024-04-16 | 北京达佳互联信息技术有限公司 | 一种页面显示方法、装置、电子设备及存储介质 |
| US11711332B2 (en) | 2021-05-25 | 2023-07-25 | Samsung Electronics Co., Ltd. | System and method for conversation-based notification management |
| CN116092058B (zh) * | 2022-11-01 | 2025-08-22 | 合肥工业大学 | 基于视线方向对比学习的驾驶员分心行为检测方法 |
| CN116380061B (zh) * | 2022-12-15 | 2025-09-09 | 中国科学院上海微系统与信息技术研究所 | 一种移动机器人视听觉融合感知与导航方法 |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH08190640A (ja) * | 1995-01-12 | 1996-07-23 | Hitachi Ltd | 情報表示方法および情報提供システム |
| JPH11202256A (ja) * | 1998-01-20 | 1999-07-30 | Ricoh Co Ltd | 頭部搭載型画像表示装置 |
| JP2010081012A (ja) * | 2008-09-24 | 2010-04-08 | Casio Computer Co Ltd | 撮像装置、撮像制御方法及びプログラム |
| JP2012108793A (ja) | 2010-11-18 | 2012-06-07 | Nec Corp | 情報表示システム、装置、方法及びプログラム |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB0422500D0 (en) * | 2004-10-09 | 2004-11-10 | Ibm | Method and system for re-arranging a display |
| JP4691071B2 (ja) * | 2007-07-05 | 2011-06-01 | ヤフー株式会社 | ページアクション起動装置、ページアクション起動制御方法、および、ページアクション起動制御プログラム |
| CN101566990A (zh) * | 2008-04-25 | 2009-10-28 | 李奕 | 一种嵌入于视频的搜索方法及其系统 |
| JP2010061265A (ja) * | 2008-09-02 | 2010-03-18 | Fujifilm Corp | 人物検索登録システム |
| JP5656457B2 (ja) * | 2010-06-01 | 2015-01-21 | シャープ株式会社 | 商品情報提供端末装置および商品情報提供システム |
| US20120327116A1 (en) * | 2011-06-23 | 2012-12-27 | Microsoft Corporation | Total field of view classification for head-mounted display |
| US9323325B2 (en) * | 2011-08-30 | 2016-04-26 | Microsoft Technology Licensing, Llc | Enhancing an object of interest in a see-through, mixed reality display device |
| US8963805B2 (en) * | 2012-01-27 | 2015-02-24 | Microsoft Corporation | Executable virtual objects associated with real objects |
| CN102749991B (zh) * | 2012-04-12 | 2016-04-27 | 广东百泰科技有限公司 | 一种适用于人机交互的非接触式自由空间视线跟踪方法 |
| US10685487B2 (en) * | 2013-03-06 | 2020-06-16 | Qualcomm Incorporated | Disabling augmented reality (AR) devices at speed |
-
2013
- 2013-05-29 WO PCT/JP2013/064927 patent/WO2014192103A1/ja not_active Ceased
- 2013-05-29 EP EP13885922.8A patent/EP3007048A4/en not_active Withdrawn
- 2013-05-29 JP JP2015519547A patent/JPWO2014192103A1/ja active Pending
- 2013-05-29 KR KR1020157036710A patent/KR20160016907A/ko not_active Ceased
- 2013-05-29 US US14/783,828 patent/US20160054795A1/en not_active Abandoned
- 2013-05-29 CN CN201380076824.0A patent/CN105229584A/zh active Pending
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH08190640A (ja) * | 1995-01-12 | 1996-07-23 | Hitachi Ltd | 情報表示方法および情報提供システム |
| JPH11202256A (ja) * | 1998-01-20 | 1999-07-30 | Ricoh Co Ltd | 頭部搭載型画像表示装置 |
| JP2010081012A (ja) * | 2008-09-24 | 2010-04-08 | Casio Computer Co Ltd | 撮像装置、撮像制御方法及びプログラム |
| JP2012108793A (ja) | 2010-11-18 | 2012-06-07 | Nec Corp | 情報表示システム、装置、方法及びプログラム |
Non-Patent Citations (3)
| Title |
|---|
| See also references of EP3007048A4 |
| YOSHIO ISHIGURO; JUNICHI REKIMOTO: "Peripheral Vision Annotation: Noninterference Information Presentation Method by Using Gaze Information", IPSJ (INFORMATION PROCESSING SOCIETY OF JAPAN) JOURNAL, vol. 53, no. 4, April 2012 (2012-04-01), pages 1328 - 1337 |
| YUHI GOTO; YUJI YAMAUCHI; HIRONOBU FUJIYOSHI: "Proposal of Shape Feature Amount CS-HOG Based on Color Similarity", SYMPOSIUM ON SENSING VIA IMAGE INFORMATION (SSII2012, 2012, pages IS3 - 04 |
Cited By (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2018501562A (ja) * | 2015-01-05 | 2018-01-18 | インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation | 電子デバイス、情報要求を増補する方法、および情報要求を増補するためのコンピュータ・プログラム製品 |
| JP2016161734A (ja) * | 2015-03-02 | 2016-09-05 | セイコーエプソン株式会社 | 表示装置、表示装置の制御方法、および、プログラム |
| US11353949B2 (en) | 2016-04-27 | 2022-06-07 | Rovi Guides, Inc. | Methods and systems for displaying additional content on a heads up display displaying a virtual reality environment |
| JP2019515361A (ja) * | 2016-04-27 | 2019-06-06 | ロヴィ ガイズ, インコーポレイテッド | 仮想現実環境を表示するヘッドアップディスプレイ上に付加的コンテンツを表示するための方法およびシステム |
| US12050724B2 (en) | 2016-04-27 | 2024-07-30 | Rovi Guides, Inc. | Methods and systems for displaying additional content on a heads up display displaying a virtual reality environment |
| JP2018049528A (ja) * | 2016-09-23 | 2018-03-29 | 富士ゼロックス株式会社 | 情報処理装置、画像形成装置およびプログラム |
| WO2019187487A1 (ja) * | 2018-03-28 | 2019-10-03 | ソニー株式会社 | 情報処理装置、情報処理方法、およびプログラム |
| WO2021149594A1 (ja) * | 2020-01-21 | 2021-07-29 | パイオニア株式会社 | 情報提供装置、情報提供方法、情報提供プログラム及び記憶媒体 |
| JP2023111989A (ja) * | 2020-01-21 | 2023-08-10 | パイオニア株式会社 | 情報提供装置 |
| JPWO2021149594A1 (ja) * | 2020-01-21 | 2021-07-29 | ||
| JP2025105844A (ja) * | 2020-01-21 | 2025-07-10 | パイオニア株式会社 | 情報提供装置、情報提供方法、情報提供プログラム及び記憶媒体 |
| WO2023100637A1 (ja) * | 2021-11-30 | 2023-06-08 | 株式会社ドワンゴ | アラート表示システム、アラート表示方法、およびアラート表示プログラム |
| JP2023081010A (ja) * | 2021-11-30 | 2023-06-09 | 株式会社ドワンゴ | アラート表示システム、アラート表示方法、およびアラート表示プログラム |
| JP7316344B2 (ja) | 2021-11-30 | 2023-07-27 | 株式会社ドワンゴ | アラート表示システム、アラート表示方法、およびアラート表示プログラム |
Also Published As
| Publication number | Publication date |
|---|---|
| US20160054795A1 (en) | 2016-02-25 |
| EP3007048A1 (en) | 2016-04-13 |
| JPWO2014192103A1 (ja) | 2017-02-23 |
| EP3007048A4 (en) | 2017-01-25 |
| CN105229584A (zh) | 2016-01-06 |
| KR20160016907A (ko) | 2016-02-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2014192103A1 (ja) | 情報表示装置 | |
| US11868517B2 (en) | Information processing apparatus and information processing method | |
| US20200341284A1 (en) | Information processing apparatus, information processing method, and recording medium | |
| JP6848881B2 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
| KR101430614B1 (ko) | 웨어러블 안경을 이용한 디스플레이 장치 및 그 동작 방법 | |
| EP2891953A1 (en) | Eye vergence detection on a display | |
| US10475439B2 (en) | Information processing system and information processing method | |
| CN109643208B (zh) | 显示装置、存储介质、显示方法及控制装置 | |
| US10061995B2 (en) | Imaging system to detect a trigger and select an imaging area | |
| CN113590070A (zh) | 导航界面的显示方法、装置、终端及存储介质 | |
| US10771707B2 (en) | Information processing device and information processing method | |
| JP2020166542A (ja) | 運転支援装置、運転支援方法、運転支援プログラム及び車両 | |
| KR102330218B1 (ko) | 발달장애인의 언어 훈련을 위한 가상현실 교육 시스템 및 방법 | |
| JP2005313772A (ja) | 車両用ヘッドアップ・ディスプレイ装置 | |
| JP2014174880A (ja) | 情報提供装置、及び情報提供プログラム | |
| EP2831702B1 (en) | Information processing device, information processing method and program | |
| JP2014174879A (ja) | 情報提供装置、及び情報提供プログラム | |
| JP2014174091A (ja) | 情報提供装置、及び情報提供プログラム | |
| JP6673567B2 (ja) | 情報表示装置、情報表示システム及び情報表示方法 | |
| JP6371589B2 (ja) | 車載システム、視線入力受付方法及びコンピュータプログラム | |
| US11487355B2 (en) | Information processing apparatus and information processing method | |
| US12339730B2 (en) | Head mounted display apparatus | |
| US20240402503A1 (en) | Automatic Display Adjustment | |
| HK40055355A (en) | Method and apparatus for displaying navigation interface, terminal and storage medium | |
| JP4926301B1 (ja) | 画像描画装置、画像描画方法及び画像描画プログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| WWE | Wipo information: entry into national phase |
Ref document number: 201380076824.0 Country of ref document: CN |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13885922 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2015519547 Country of ref document: JP Kind code of ref document: A |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2013885922 Country of ref document: EP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 14783828 Country of ref document: US |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 20157036710 Country of ref document: KR Kind code of ref document: A |