
WO2018180857A1 - Head-up display apparatus - Google Patents

Head-up display apparatus

Info

Publication number
WO2018180857A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
distance
projection
display
optical system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/011335
Other languages
English (en)
Japanese (ja)
Inventor
山田範秀
菅原和弘
丹内修
橋村淳司
小嶋俊之
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Konica Minolta Inc
Original Assignee
Konica Minolta Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Inc filed Critical Konica Minolta Inc
Priority to JP2019509649A priority Critical patent/JPWO2018180857A1/ja
Publication of WO2018180857A1 publication Critical patent/WO2018180857A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/23 Head-up displays [HUD]
    • B60K35/232 Head-up displays [HUD] controlling the projection distance of virtual images depending on the condition of the vehicle or the driver
    • B60K35/233 Head-up displays [HUD] controlling the size or position in display areas of virtual images depending on the condition of the vehicle or the driver
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/346 Image reproducers using prisms or semi-transparent mirrors
    • H04N13/363 Image reproducers using image projection screens

Definitions

  • the present invention relates to a head-up display device in which the projection position of a virtual image is variable.
  • A HUD (Head-Up Display) presents display contents such as vehicle speed and car navigation information.
  • The purpose of mounting a HUD in a car is to support safer driving by minimizing the movement of the driver's line of sight. Merely presenting instrument information is not enough; a system that detects vehicles ahead, pedestrians, obstacles, and the like with cameras and other sensors and makes the driver aware of danger in advance through the HUD, so that accidents can be prevented, is preferable.
  • For this purpose, it is conceivable to display a frame-shaped danger signal superimposed on a car, a person, an obstacle, or the like (see, for example, Patent Document 1).
  • A known HUD device that changes the display distance of the virtual image includes a scanning image forming unit, a diffusion screen, a projecting unit, and a movable unit that changes the position of the diffusion screen.
  • Devices that change the projection position in the depth direction are also known (see, for example, Patent Documents 2 and 3). These HUD devices reduce the movement of the driver's line of sight by shortening or lengthening the display distance of the virtual image, in view of the fact that the distance at which a human gazes changes with the speed of the vehicle (Patent Document 2); they are not intended to adjust the display position of a virtual image with respect to an object such as a car, a person, or an obstacle.
  • In a HUD device intended, for example, to superimpose a virtual image on an object such as a car, a person, or an obstacle, or to display it in the vicinity thereof in order to convey danger to the driver, the events regarded as dangers during driving exist regardless of the distance of the line of sight, so it is preferable that danger signals can be displayed simultaneously at long and short distances.
  • To achieve this, the diffusion screen is driven at high speed and an image synchronized with it is generated by the image forming means, so that the virtual images appear to the human eye as if displayed simultaneously.
  • However, it is difficult for scanning image forming means to cope with display switching at a high frame rate, so it is not suitable for a configuration in which virtual images are displayed at a plurality of distances simultaneously.
  • A head-up display device according to the present invention includes: a projection optical system that projects a virtual image; an object detection unit that detects an object existing in a detection region and detects the distance from the projection optical system to the object as a target distance; a projection distance changing unit that aperiodically changes the projection distance of the virtual image from the projection optical system; and an image adding unit that adds a related information image to the detected object as a virtual image via the projection optical system so that the target distance and the projection distance substantially coincide.
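  • For orientation only, the four elements above can be read as a small software architecture. The following Python sketch models them under that reading; every class and method name here is a hypothetical label chosen for illustration, not terminology from this publication.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    target_distance_m: float   # distance from the projection optical system to the object
    region: tuple              # spread of the object within the detection region

class ObjectDetectionUnit:
    def detect(self) -> list[DetectedObject]:
        """Detect objects in the detection region and measure their target distances."""
        raise NotImplementedError

class ProjectionDistanceChangingUnit:
    def set_projection_distance(self, distance_m: float) -> None:
        """Switch the virtual image's projection distance aperiodically, in any order."""
        raise NotImplementedError

class ProjectionOpticalSystem:
    def project(self, image) -> None:
        """Project the image as a virtual image at the currently set projection distance."""
        raise NotImplementedError

class ImageAddingUnit:
    """Adds a related information image so that projection distance ~ target distance."""
    def __init__(self, detector, changer, optics):
        self.detector, self.changer, self.optics = detector, changer, optics

    def add_related_images(self, make_image) -> None:
        for obj in self.detector.detect():
            self.changer.set_projection_distance(obj.target_distance_m)
            self.optics.project(make_image(obj))
```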
  • FIG. 1A is a side cross-sectional view illustrating a state in which the head-up display device according to the first embodiment is mounted on a vehicle body
  • FIG. 1B is a front view from the inside of the vehicle illustrating the head-up display device.
  • FIG. 2 is an enlarged side cross-sectional view explaining a specific configuration example of the projection optical system and other parts constituting the head-up display device.
  • FIG. 3 is a block diagram explaining the overall structure of the head-up display device.
  • FIG. 4 is a perspective view explaining a specific display state.
  • FIGS. 5A and 5B show the long-distance projection image or frame of the virtual image shown in FIG. 4 and the corresponding arrangement state of the diffusion screen;
  • FIGS. 5C and 5D show the medium-distance projection image or frame and the corresponding arrangement state;
  • FIGS. 5E and 5F show the short-distance projection image or frame and the corresponding arrangement state of the diffusion screen. FIG. 6 is a diagram explaining the display operation of the head-up display device. FIG. 7 is a diagram explaining the head-up display device of a second embodiment. FIGS. 8A and 8B are diagrams explaining a frame and an index of a modified example. FIG. 9 is a diagram explaining a modification of the projection optical system shown in FIG. 2.
  • the image display device 100 is mounted, for example, in a car body 2 of an automobile, and includes a drawing unit 10 and a display screen 20.
  • The image display device 100 presents image information displayed on a display element 11 (described later) in the drawing unit 10 to the driver UN as a virtual image via the display screen 20.
  • The drawing unit 10 of the image display device 100 is installed in the dashboard 4 of the vehicle body 2 so as to be embedded behind the display 50, and emits display light HK, corresponding to an image including driving-related information and the like, toward the display screen 20.
  • The display screen 20 is also called a combiner and is a semi-transparent concave mirror or plane mirror.
  • the display screen 20 is erected on the dashboard 4 with the lower end supported, and reflects the display light HK from the drawing unit 10 toward the rear of the vehicle body 2. That is, in the illustrated case, the display screen 20 is an independent type that is installed separately from the front window 8.
  • the display light HK reflected by the display screen 20 is guided to an eye box (not shown) corresponding to the pupil HT of the driver UN sitting on the driver's seat 6 and its peripheral position.
  • the driver UN can observe the display light HK reflected by the display screen 20, that is, the projection image IM as a virtual image in front of the vehicle body 2.
  • the driver UN can observe external light transmitted through the display screen 20, that is, a real image of a front scene, a car, and the like.
  • In other words, the driver UN can observe the projection image (virtual image) IM, which contains related information such as driving-related information and is formed by reflection of the display light HK on the display screen 20, superimposed on the external or see-through image behind the display screen 20.
  • In the illustrated case the display screen 20 is configured separately from the front window 8, but a configuration may also be adopted in which the front window 8 serves as the display screen: projection is performed onto a display range set in the front window 8, and the driver UN observes the projection image IM there.
  • In that case, the reflection area can be secured by changing the reflectance of a partial area of the glass of the front window 8 with a coating or the like. Further, if the reflection angle at the front window 8 is about 60 degrees, for example, a reflectance of about 15% is obtained, so the window can be used as a transparent reflective surface even without a coating.
  • Alternatively, a display screen can be provided sandwiched within the glass of the front window 8.
  • The drawing unit 10 includes a main body optical system 13, which is a virtual-image-type magnifying imaging system including a display element 11, a display control unit 18 that operates the main body optical system 13, and a housing 14 that houses the main body optical system 13 and other components.
  • the combination of the main body optical system 13 and the display screen 20 constitutes a projection optical system 30.
  • The main body optical system 13 includes an imaging optical system 15 that can form an intermediate image TI by enlarging an image formed on the display element 11, and a virtual image forming optical system 17 that converts a forced intermediate image TI′, formed at or near the intermediate image TI, into a virtual image.
  • When the intermediate image TI is not actually formed, the position where it would have been formed is referred to below as the planned image formation position.
  • The display element 11 has a two-dimensional display surface 11a. An image formed on the display surface 11a of the display element 11 is enlarged by the imaging optical system 15 and projected onto any one of the diffusion screens 16a to 16c constituting the screen group 16. Because the display element 11 is capable of two-dimensional display, the display content of the projection image or intermediate image on each of the diffusion screens 16a to 16c, that is, the projection image IM displayed as a virtual image through the display screen 20, can be switched relatively fast.
  • The display element 11 may be a reflective element such as a DMD (Digital Micromirror Device) or LCOS (Liquid Crystal on Silicon), or a transmissive element such as a liquid crystal panel.
  • When a DMD or LCOS is used as the display element 11, it is easy to switch images at high speed (including high-speed intermittent display) while maintaining brightness, which is advantageous for a display that changes the virtual image distance or projection distance.
  • The display element 11 operates at a frame rate of 30 fps or higher, more preferably 60 fps or higher. As a result, a plurality of projection images (virtual images) IM at different projection distances can appear to the driver UN as if displayed simultaneously. In particular, when the display is switched at 90 fps or more, a DMD or LCOS is a good candidate for the display element 11.
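  • As a rough illustration of this timing budget (a sketch, not a figure from this publication), time-multiplexing one display element across several projection distances divides its frame rate among the depth planes. The example below simply assumes three planes, matching the three diffusion screens described later.

```python
def per_plane_refresh_rate(element_fps: float, num_depth_planes: int) -> float:
    """Effective refresh rate of each depth plane when one display element
    is time-multiplexed across several projection distances."""
    return element_fps / num_depth_planes

# With three diffusion screens (three projection distances):
for fps in (30, 60, 90):
    print(f"{fps} fps element -> {per_plane_refresh_rate(fps, 3):.1f} fps per depth plane")
```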
  • the imaging optical system 15 is a fixed-focus lens system, and has a plurality of lenses (not shown).
  • The imaging optical system 15 can enlarge and project an image formed on the display surface 11a of the display element 11 to form an intermediate image TI at the above-described planned image formation position (formation of the intermediate image TI itself presupposes that the display element 11 is operating).
  • the imaging optical system 15 has a relatively large depth of focus on the image side.
  • Any one of the plurality of diffusion screens 16a to 16c constituting the screen group 16 can be selectively inserted into the optical path within a neighborhood region RV that includes, along the optical axis AX, the image formation position of the intermediate image TI enlarged and projected by the imaging optical system 15 and its vicinity.
  • Each of the diffusion screens 16a to 16c is a diffusing plate that controls the light distribution angle to a desired angle, and forms a forced intermediate image TI′ at its insertion position (that is, at or near the image formation position of the intermediate image TI).
  • The forced intermediate image TI′ is thus formed at the position of the inserted diffusion screen, at one of a plurality of discrete specified positions along the optical axis AX (specifically, the three positions P1 to P3), and its position can be changed arbitrarily to any of the positions P1 to P3.
  • For the diffusion screens 16a to 16c, frosted glass, a lens diffusion plate, a microlens array, or the like can be used, for example.
  • The screen group 16, which forms the forced intermediate image TI′ at a plurality of specified locations on the optical path along the optical axis AX, and the drive mechanism 65, which moves the screens into and out of the optical path, constitute a projection distance changing unit 62.
  • The projection distance changing unit 62 changes the projection distance of the virtual image from the projection optical system 30 aperiodically and can change the projection distance to the projection image (virtual image) IM in an arbitrary order. This increases the freedom in the timing at which the image adding unit (the main control device 90 and the display control unit 18, described later) adds a related information image.
  • The drive mechanism 65 of the projection distance changing unit 62 includes three actuators corresponding to the three diffusion screens 16a to 16c and can move the diffusion screens 16a to 16c into and out of the optical path individually.
  • The drive mechanism 65 operates under the control of the display control unit 18 and alternatively inserts any one of the diffusion screens 16a to 16c constituting the screen group 16 into the optical path of the projection optical system 30 at the desired timing and operation position. The projection distance can thereby be set freely according to the insertion positions of the diffusion screens 16a to 16c.
  • The forced intermediate image TI′ can thus be formed alternatively at any one of the positions P1 to P3 corresponding to the selected diffusion screen 16a to 16c, and the projection distance to the projection image (virtual image) IM is switched among long, medium, and short distances.
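  • To make the selection concrete, the sketch below maps a detected target distance to one of the three screens, following the correspondence described later for FIG. 5 (16a at P1 for long distance, 16b at P2 for medium distance, 16c at P3 for short distance). The numeric range boundaries are assumed for illustration; the publication does not specify them.

```python
# Hypothetical range boundaries in metres (not specified in the publication).
SHORT_MAX_M = 15.0
MEDIUM_MAX_M = 40.0

def select_screen(target_distance_m: float) -> str:
    """Pick the diffusion screen whose discrete projection distance best matches
    the detected target distance (short / medium / long range)."""
    if target_distance_m <= SHORT_MAX_M:
        return "16c"   # position P3, short-distance projection image IM1
    if target_distance_m <= MEDIUM_MAX_M:
        return "16b"   # position P2, medium-distance projection image IM2
    return "16a"       # position P1, long-distance projection image IM3

print(select_screen(8.0), select_screen(25.0), select_screen(120.0))   # 16c 16b 16a
```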
  • The insertion positions of the diffusion screens 16a to 16c are at or near the position where the intermediate image TI is formed in the projection optical system 30, where the intermediate image TI is relatively sharp.
  • The virtual image forming optical system 17, in cooperation with the display screen 20, enlarges the forced intermediate image TI′ formed on the diffusion screens 16a to 16c and forms the projection image IM as a virtual image in front of the driver UN.
  • the virtual image forming optical system 17 includes at least one mirror, but in the illustrated example, includes two mirrors 17a and 17b.
  • By arranging the diffusion screens 16a to 16c on the optical path, not only can a plurality of forced intermediate images TI′ with different positions along the optical axis AX be formed, but the viewing angle and the eye box size can also be controlled, and as a result the light utilization efficiency of the optical system can be increased.
  • Preferably, the imaging optical system 15 satisfies the following conditional expression (1). Here, the value F is the F-number of the imaging optical system 15 on the display element 11 side, the value P is the pixel pitch [mm] of the display element 11, the value m is the magnification of the relay (imaging) optical system, and the value δ is a quantity determined so as to obtain the desired virtual image distance range. The value 2·F·P·m²/δ in conditional expression (1) is the utilization factor of the depth of focus.
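  • Since only the constituent quantities of conditional expression (1) are reproduced here, the sketch below merely evaluates the stated depth-of-focus utilization factor 2·F·P·m²/δ for assumed values of F, P, m, and δ; the numbers, and the bound against which the factor is compared in expression (1), are not taken from this publication.

```python
def depth_of_focus_utilization(f_number: float, pixel_pitch_mm: float,
                               magnification: float, delta_mm: float) -> float:
    """Evaluate 2*F*P*m**2 / delta, the utilization factor of the depth of focus
    named in conditional expression (1)."""
    return 2.0 * f_number * pixel_pitch_mm * magnification**2 / delta_mm

# Purely illustrative values:
print(depth_of_focus_utilization(f_number=2.4, pixel_pitch_mm=0.008,
                                 magnification=5.0, delta_mm=3.0))  # 0.32
```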
  • Even if the magnification of the relay optical system is increased in order to enlarge the observable range (that is, the eye box) within which the eyes of the driver UN, the observer, can move, and the diffusion screens therefore become larger, there is the advantage that the diffusion screens do not need to be moved and the moving mechanism does not become large.
  • FIG. 3 is a block diagram for explaining the overall structure of the head-up display device 200.
  • the head-up display device 200 includes the image display device 100 as a part thereof.
  • the image display device 100 has the structure shown in FIG. 2, and a description thereof is omitted here.
  • the head-up display device 200 includes an environment monitoring unit 72 and a main control device 90 in addition to the image display device 100.
  • The environment monitoring unit 72 is an object detection unit that detects objects existing in the detection area: it identifies a moving body or person existing ahead of the vehicle, specifically a car, a bicycle, a pedestrian, or the like, as an object, and includes a three-dimensional measuring device that extracts three-dimensional position information of the object. Three-dimensional recognition of the object thus makes three-dimensional display of the related information image possible. Furthermore, by having a related information image added to the moving body or person as a virtual image, the environment monitoring unit 72 enables the head-up display device 200 to inform the driver UN of the presence of that moving body or person.
  • the environment monitoring unit (object detection unit) 72 includes an external camera 72a, an external image processing unit 72b, and a determination unit 72c as a three-dimensional measuring instrument.
  • the external camera 72a can capture an external image in the visible or infrared region.
  • the external camera 72a is installed at appropriate positions inside and outside the vehicle body 2, and images a detection area VF (see FIG. 4 described later) in front of the driver UN or the front window 8 as an external image.
  • the external image processing unit 72b performs various types of image processing such as brightness correction on the external image captured by the external camera 72a to facilitate processing by the determination unit 72c.
  • The determination unit 72c extracts or cuts out object images from the external image that has passed through the external image processing unit 72b, thereby detecting objects such as automobiles, bicycles, and pedestrians (specifically, the objects OB1, OB2, and OB3 in FIG. 4), calculates the spatial position of each object in front of the vehicle body 2 from the depth information attached to the external image, and stores it in the storage unit 72m as three-dimensional position information.
  • Software that enables extraction of an object image from an external image is stored in the storage unit 72m of the determination unit 72c, and during operation the software and data required for extracting an object image from the external image are read out from the storage unit 72m.
  • the determination unit 72c can detect what the object corresponding to the object element is, for example, from the shape, size, color, etc. of each object element in the obtained image.
  • The criteria for determination include, for example, a method of identifying what an object is based on the degree of matching obtained by pattern matching against information registered in advance. From the viewpoint of increasing processing speed, it is also possible to detect a lane from the image and then detect objects from the shape, size, color, and so on of the object elements within that lane.
  • The external camera 72a is, for example, a compound-eye three-dimensional camera, although this is not shown. That is, the external camera 72a has camera elements, each consisting of an imaging lens, a CMOS (Complementary Metal Oxide Semiconductor) or other image sensor, and a drive circuit for the image sensor, arranged in a matrix.
  • The plurality of camera elements constituting the external camera 72a are adapted, for example, to focus at different positions in the depth direction or to detect relative parallax; by analyzing the state of the images obtained from the camera elements (focus state, object position, and so on), the target distance to each region or object in the image corresponding to the detection region can be determined.
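  • A compound-eye camera of this kind can recover the target distance from the relative parallax between its elements. The sketch below uses the standard stereo relation distance = focal length × baseline / disparity; it is a generic illustration rather than the algorithm of this publication, and the parameter values are assumed.

```python
def distance_from_disparity(focal_length_px: float, baseline_m: float,
                            disparity_px: float) -> float:
    """Target distance estimated from the parallax (disparity) between two camera elements."""
    if disparity_px <= 0:
        return float("inf")  # no measurable parallax: treat the object as very far away
    return focal_length_px * baseline_m / disparity_px

# Assumed example: 1400 px focal length, 0.12 m baseline, 7 px disparity -> 24 m
print(f"{distance_from_disparity(1400.0, 0.12, 7.0):.1f} m")
```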
  • Alternatively, a combination of a two-dimensional camera and an infrared distance sensor may be used, in which case a target distance, that is, distance information in the depth direction, is obtained for each part (region or object) in the captured image.
  • A target distance in the depth direction can also be obtained for each part (region or object) in the captured image by performing imaging while changing the focal length at high speed.
  • Distance information in the depth direction can also be obtained for each part (region or object) in the detection region by using LIDAR (Light Detection and Ranging) technology.
  • With LIDAR, the scattered light produced by pulsed laser irradiation is measured, so the distance to an object even at long range and its spread can be measured, and distance information to objects in the field of view and information about their spread can be acquired.
  • The detection accuracy of objects can be improved by a composite method that combines ranging technology such as LIDAR with technology that detects the distance of an object from image information, that is, by fusing multiple sensors.
  • From the viewpoint of speeding up the input, the operating speed of the external camera 72a used for object detection needs to be equal to or higher than the operating speed of the display element 11; if the display element 11 operates at 30 fps or higher, the camera needs to operate at an equal or higher rate.
  • The external camera 72a is preferably capable of detecting objects at high speed, for example at more than 120 fps, such as 480 fps or 1000 fps. When a plurality of sensors are fused, not all of them need to be high speed: at least one of them must be, but the others need not. In that case, sensing accuracy can be increased by using the data detected by the high-speed sensor as a base and supplementing it with data from the slower sensors.
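  • The fusion approach described here, a high-speed sensor as the base supplemented by a slower one, could look roughly like the sketch below; the simple blending rule and the class interface are assumptions made for illustration only.

```python
from typing import Optional

class FusedRanging:
    """Use the high-speed sensor's estimate as the base and nudge it toward the
    slower sensor's reading whenever one arrives (a deliberately simple blend)."""

    def __init__(self, correction_weight: float = 0.5):
        self.correction_weight = correction_weight
        self.fused_distance_m: Optional[float] = None

    def update_fast(self, camera_distance_m: float) -> float:
        # High-rate camera data arrives every frame and forms the base estimate.
        self.fused_distance_m = camera_distance_m
        return self.fused_distance_m

    def update_slow(self, lidar_distance_m: float) -> float:
        # Slower but independent data (e.g. LIDAR) corrects the base when available.
        if self.fused_distance_m is None:
            self.fused_distance_m = lidar_distance_m
        else:
            w = self.correction_weight
            self.fused_distance_m = (1 - w) * self.fused_distance_m + w * lidar_distance_m
        return self.fused_distance_m

fusion = FusedRanging()
fusion.update_fast(24.5)          # camera frame
print(fusion.update_slow(23.8))   # occasional LIDAR correction -> 24.15
```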
  • the display control unit 18 operates the projection optical system 30 under the control of the main control device 90 to display a three-dimensional projection image IM whose virtual image distance or projection distance changes behind the display screen 20.
  • the main control device 90 has a role of harmonizing the operations of the image display device 100, the environment monitoring unit 72, and the like.
  • The main controller 90 aperiodically changes the projection distance of the virtual image, that is, of the projection image IM projected by the projection optical system 30, by appropriately operating the projection distance changing unit 62 via the display control unit 18, for example. In other words, the main controller 90 and related components change the projection position of the virtual image in the depth direction aperiodically; the related information image can then be given to the object at an appropriate position while the projection position is moved back and forth aperiodically. Further, the main controller 90 adjusts the spatial arrangement of the frame HW (see FIG. 4) with respect to the detected object.
  • The main controller 90 generates the projection image IM to be displayed by the projection optical system 30 from display information, including the display shape and display distance, received from the environment monitoring unit 72.
  • The projection image IM is, for example, a sign such as a frame HW (see FIG. 4) positioned around an automobile, bicycle, pedestrian, or other object existing behind the display screen 20, matched to the depth position of that object.
  • The main controller 90 functions as an image adding unit in cooperation with the display control unit 18: at a timing such that the detected target distance to the object substantially coincides with the projection distance, it adds a related information image to the detected object as a virtual image by means of the projection optical system 30.
  • The related information is, for example, a frame HW surrounding the object or an index adjacent to the object; the presence of a moving body or a person can thus be signalled by the frame HW or the index.
  • The related information image is displayed at high speed so that the observer can view it as a three-dimensional display at the same time as, or almost simultaneously with, the object. For this purpose, it is necessary to increase the speed of detection, of recognition and judgment processing, and of display. If there is no display delay (latency) when the related information image is superimposed on an object or target existing in the actual scene, the viewer, that is, the driver UN, does not feel uncomfortable when viewing the display or virtual image, and driving operations such as accident avoidance can be performed quickly.
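  • To give a sense of why latency matters, the following is illustrative arithmetic only, with assumed numbers: during an end-to-end delay t, an object approaching at relative speed v moves v·t closer, so a frame drawn from stale data trails the real object by that amount.

```python
def overlay_lag_m(relative_speed_kmh: float, latency_ms: float) -> float:
    """Distance by which a superimposed marker trails the real object after an
    end-to-end detection-to-display delay."""
    return (relative_speed_kmh / 3.6) * (latency_ms / 1000.0)

# Assumed example: 60 km/h closing speed and 50 ms total latency -> about 0.83 m of lag
print(f"{overlay_lag_m(60.0, 50.0):.2f} m")
```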
  • FIG. 4 is a perspective view for explaining a specific display state.
  • A detection region VF corresponding to the field of observation is set in front of the driver UN, the observer. It is assumed that an object OB1, a person such as a pedestrian, and a moving object OB2, a car or the like, exist in the detection region VF, that is, on and around the road.
  • The main control device 90 causes the image display device 100 to project three-dimensional projection images (virtual images) IM, adding the frames HW1, HW2, and HW3 as related information images to the objects OB1, OB2, and OB3 existing at a short distance, a medium distance, and a long distance, respectively.
  • The projection distances to the projection images IM1, IM2, and IM3 that display the frames HW1, HW2, and HW3 differ from one another and correspond to the distances from the driver UN to the objects OB1, OB2, and OB3, respectively.
  • the projection distances of the projection images IM1, IM2, and IM3 are discrete, and cannot be accurately matched to the actual distances to the objects OB1, OB2, and OB3.
  • FIG. 5A shows the long-distance projection image IM3 or frame HW3 shown in FIG. 4, and FIG. 5B shows the corresponding forced intermediate image TI′ projected onto the diffusion screen 16a inserted at position P1 of the projection distance changing unit 62.
  • FIG. 5C shows the medium-distance projection image IM2 or frame HW2 shown in FIG. 4, and FIG. 5D shows the corresponding forced intermediate image TI′ projected onto the diffusion screen 16b inserted at position P2 of the projection distance changing unit 62.
  • FIG. 5E shows the short-distance projection image IM1 or frame HW1 shown in FIG. 4, and FIG. 5F shows the corresponding forced intermediate image TI′ projected onto the diffusion screen 16c inserted at position P3 of the projection distance changing unit 62.
  • The main controller 90 detects objects using the environment monitoring unit 72 (step S11); when an object is detected (Y in step S11), the target distance corresponding to the position in the depth direction and region information related to the spread of the object are extracted from the three-dimensional information of the object (step S12).
  • The main controller 90 then determines whether a plurality of objects have been detected and the target distances cover a plurality of distance ranges (step S13).
  • Here, the distance ranges are the short, medium, and long distances described with reference to FIG. 4, and a plurality of distance ranges means, for example, that objects exist at both the short distance and the medium distance.
  • If only a single distance range is involved, the main controller 90, based on the detected target distance or that single distance range, operates the display control unit 18 and the projection distance changing unit 62 to select the diffusion screen 16a to 16c corresponding to the target distance from the screen group 16 and place it on the optical path (step S14). That is, the projection distance changing unit 62 selects one of the diffusion screens 16a to 16c at the timing when the target distance is detected by the environment monitoring unit 72 serving as the object detection unit, and thereby sets the projection distance of the projection image (virtual image) IM so as to approximate the target distance.
  • Main controller 90 prepares display data corresponding to a display image to be formed on display surface 11a of display element 11 based on the target distance and area information extracted in step S12 (step S15).
  • This display data is related information corresponding to the detected object and, for example, warns of the presence of the object.
  • The display data may be, for example, a frame HW1 that surrounds the object OB1.
  • The main controller 90 then operates the display element 11 via the display control unit 18 to project a related information image (such as the frame HW1) as a virtual image at the projection distance corresponding to the target distance (step S16).
  • When the target distances cover a plurality of distance ranges, the main controller 90 likewise selects a target distance, starting from the closest, and prepares display data corresponding to a display image to be formed on the display surface 11a of the display element 11 based on that target distance and the corresponding region information (step S22). Thereafter, the main controller 90 operates the display element 11 via the display control unit 18 to project a related information image (such as the frame HW1) as a virtual image at the projection distance corresponding to the selected target distance (step S23).
  • The main controller 90 then determines whether there is a next distance range or target distance, that is, whether any of the plurality of target distances extracted in step S12 remain undisplayed (step S24).
  • If so, n is incremented to select the next closest target distance (step S25), and, to match it, the display control unit 18 and the projection distance changing unit 62 are operated to select the diffusion screen 16a to 16c corresponding to that target distance and place it on the optical path (step S21); display data is then prepared for the next distance range (step S22) and the display element 11 is operated (step S23).
  • Alternatively, an operation is also possible in which the main controller 90 selects the diffusion screens 16a to 16c at random and, if an object exists at the projection distance corresponding to the diffusion screen selected in this way, causes the display element 11 to display the related information image corresponding to that object. In this case as well, the related information images need not be displayed at the same time, and the display time can be weighted according to the distance or other factors.
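  • The flow of steps S11 to S25 can be summarized in the following Python sketch. Only the ordering of the steps follows the description above; the helper and parameter names (detect_objects, select_screen, make_related_image, and so on) are illustrative stand-ins, not interfaces defined in this publication.

```python
def display_cycle(environment_monitor, distance_changer, display_element,
                  select_screen, make_related_image):
    """One pass through steps S11 to S25, expressed with hypothetical helpers."""
    objects = environment_monitor.detect_objects()                    # S11: detect objects
    if not objects:
        return

    # S12: target distance (depth position) and region (spread) for each detected object
    targets = sorted(((obj.target_distance_m, obj.region) for obj in objects),
                     key=lambda t: t[0])

    # S13: whether one distance range or several is involved, display the closest target
    # first and then move outward (S24 checks what remains undisplayed, S25 advances).
    for distance_m, region in targets:
        distance_changer.insert(select_screen(distance_m))    # S14 / S21: matching diffusion screen
        image = make_related_image(region)                    # S15 / S22: e.g. a frame around the object
        display_element.show(image, distance_m)               # S16 / S23: project as a virtual image
```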
  • In this way, since the image adding unit adds the related information image as a virtual image by means of the projection optical system 30 so that the target distance to the detected objects OB1, OB2, and OB3 and the projection distance substantially coincide, the related information image corresponding to each of the detected objects OB1, OB2, and OB3 can be added at a position corresponding to that object's position in the depth direction.
  • The frames HW1, HW2, and HW3 are displayed three-dimensionally in the depth direction, so even if the observer's viewpoint shifts within the eye box, the positions of the frames HW1, HW2, and HW3 relative to the objects OB1, OB2, and OB3 are unlikely to shift.
  • the head-up display device according to the second embodiment will be described below.
  • the head-up display device according to the second embodiment is a modification of the head-up display device according to the first embodiment, and matters not specifically described are the same as those in the first embodiment.
  • In the second embodiment, a display screen 520 serving as the screen is affixed inside a rectangular reflection region 8d provided in the portion of the front window 8 in front of the driver's seat.
  • Although the head-up display device 200 has been described above as a specific embodiment, the head-up display device according to the present invention is not limited to the above.
  • For example, the arrangement of the image display device 100 can be inverted so that the display screen 20 is arranged at the upper part of the front window 8 or at the sun visor position.
  • the display screen 20 may be disposed at a position corresponding to a conventional mirror of an automobile.
  • the outline of the display screen 20 is not limited to a rectangle, but may be various shapes.
  • the imaging optical system 15 and the virtual image forming optical system 17 shown in FIG. 2 are merely examples, and the optical configurations of the imaging optical system 15 and the virtual image forming optical system 17 can be changed as appropriate.
  • As shown in FIG. 8A, the frame HW serving as the related information image RI is not limited to one surrounding the entire object OB and may be composed of a plurality of portions. Further, as shown in FIG. 8B, the related information image RI may be an index SH displayed adjacent to the object OB.
  • In the above description, objects OB existing in front of the vehicle body 2 are detected by the environment monitoring unit 72, and related information images such as the frames HW1, HW2, and HW3 corresponding to the arrangement of the objects OB are displayed by the image display device 100.
  • In addition, incidental driving-related information can be acquired via a communication network, and such driving-related information can likewise be displayed by the image display device 100.
  • For example, a display that warns of a car, an obstacle, or the like existing in a blind spot is also possible.
  • FIG. 9 is a diagram for explaining a modification of the projection optical system 30 shown in FIG.
  • the imaging optical system 115 enlarges and projects the image formed on the display element 11 to form an intermediate image TI.
  • the imaging optical system 115 includes a movable lens 15f as an optical element.
  • The movable lens 15f can be moved in the optical axis AX direction by the projection distance changing unit 62, and the focal length of the imaging optical system 115 changes as the movable lens (optical element) 15f moves in the optical axis AX direction.
  • Accordingly, the position of the intermediate image TI as an image forming or focusing position (if the display element 11 is not operating, an intermediate image is not actually formed, but this is the position where one would be formed) can also be moved in the direction of the optical axis AX.
  • In this modification, the imaging position of the imaging optical system 115, that is, the position of the intermediate image TI, is adjusted according to the arrangement of the diffusion screens 16a to 16c serving as optical elements. That is, the diffusion screens 16a to 16c operate in synchronization with the movable lens 15f and are placed at the position of the moving intermediate image TI. Since the projection optical system 30 shown in FIG. 9 forms the intermediate image once, the optical system can be kept compact by constructing the movable lens from a small number of lenses and thereby suppressing growth of the moving mechanism.
  • The projection optical system 30 shown in FIG. 9 can move the imaging position, that is, the position of the intermediate image, so as to match the focal position with the position of the diffusion screen; the display is therefore always in focus and a clearer image can be shown. It can also cope with the case where the depth of focus becomes shallow because the F-number of the imaging optical system 115 is reduced to secure brightness. Further, when the imaging optical system 115 has a variable focus, the means for achieving this is not limited to the movable lens 15f; a variable-focus lens in which a liquid or the like is sealed can also be used.
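  • In the FIG. 9 modification, whichever diffusion screen is selected, the movable lens 15f must place the intermediate image onto that screen. A minimal sketch of this synchronization is given below; the lens-position lookup table and the drive interfaces are stand-ins for whatever calibration and hardware control the real system would use.

```python
# Hypothetical calibration: lens position (mm along AX) that places the intermediate
# image exactly on each diffusion screen. The values are illustrative only.
LENS_POSITION_FOR_SCREEN_MM = {"16a": 0.0, "16b": 1.8, "16c": 3.5}

def set_projection_distance(screen_id: str, lens_stage, screen_drive) -> None:
    """Move the movable lens 15f and insert the selected diffusion screen in sync,
    so that the imaging position always coincides with the screen and the display
    stays in focus."""
    lens_stage.move_to(LENS_POSITION_FOR_SCREEN_MM[screen_id])  # shift the imaging position
    screen_drive.insert(screen_id)                              # place the matching screen there
```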

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention concerns a head-up display apparatus in which a related information image can be added as a virtual image in an arrangement relationship that also takes the depth direction of a real object into account. The head-up display apparatus (200) comprises: a projection optical system (30) used to project a virtual image; an environment monitoring unit (72) serving as an object detection unit that detects an object present in a detection region (VF) and detects the distance from the projection optical system (30) to the object as a target distance; a projection distance changing device (62) that aperiodically changes a projection distance of the virtual image from the projection optical system (30); and a display control unit (18) and a main control unit (90), which form an image adding unit for adding a related information image as a virtual image to the detected object by means of the projection optical system (30), such that the target distance is approximately equal to the projection distance.
PCT/JP2018/011335 2017-03-31 2018-03-22 Head-up display apparatus Ceased WO2018180857A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019509649A JPWO2018180857A1 (ja) 2017-03-31 2018-03-22 ヘッドアップディスプレイ装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-073258 2017-03-31
JP2017073258 2017-03-31

Publications (1)

Publication Number Publication Date
WO2018180857A1 true WO2018180857A1 (fr) 2018-10-04

Family

ID=63677087

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/011335 Ceased WO2018180857A1 (fr) 2017-03-31 2018-03-22 Appareil d'affichage tête haute

Country Status (2)

Country Link
JP (1) JPWO2018180857A1 (fr)
WO (1) WO2018180857A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08197981A (ja) * 1995-01-23 1996-08-06 Aqueous Res:Kk 車輌用表示装置
JP2002036909A (ja) * 2000-07-24 2002-02-06 Yazaki Corp 車両用表示装置
JP2013073229A (ja) * 2011-09-29 2013-04-22 Seiko Epson Corp 表示装置およびその駆動方法
JP2015074391A (ja) * 2013-10-10 2015-04-20 アイシン・エィ・ダブリュ株式会社 ヘッドアップディスプレイ装置
JP2015191222A (ja) * 2014-03-31 2015-11-02 株式会社Suwaオプトロニクス 画像表示装置
WO2016103418A1 (fr) * 2014-12-25 2016-06-30 日立マクセル株式会社 Appareil d'affichage d'informations pour véhicule

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020136007A1 (fr) * 2018-12-28 2020-07-02 Lightspace Technologies, SIA Agencement d'affichage volumétrique pour présenter une image virtuelle et procédé associé
US10955685B2 (en) 2018-12-28 2021-03-23 Lightspace Technologies, SIA Volumetric display arrangement and a method for representing content of an image
CN113168012A (zh) * 2018-12-28 2021-07-23 莱特斯贝斯科技有限公司 表示虚拟图像的体积显示装置及其方法

Also Published As

Publication number Publication date
JPWO2018180857A1 (ja) 2020-04-23

Similar Documents

Publication Publication Date Title
JP7189513B2 (ja) Head-up display device
JP7003925B2 (ja) Reflector, information display device, and moving body
JPWO2017094427A1 (ja) Head-up display
JPWO2017138297A1 (ja) Image display device and image display method
JP2018058521A (ja) Virtual display mirror device
JP2017081428A (ja) Display device for vehicle
JP2019177726A (ja) Virtual rear-view mirror device
WO2018180857A1 (fr) Head-up display apparatus
JPWO2018199245A1 (ja) Virtual image display device and display system for moving body
EP3693783B1 (fr) Display device
WO2019151314A1 (fr) Display device
JP2019191368A (ja) Virtual image display device and head-up display device
JP7121349B2 (ja) Display method and display device
JP2019120891A (ja) Virtual image display device and head-up display device
JP2020071441A (ja) Display device
JP2020042155A (ja) Head-up display device
JPWO2019124323A1 (ja) Virtual image display device and head-up display device
JP7280557B2 (ja) Display device and display method using the same
KR102879263B1 (ko) HUD control method and apparatus considering eye-tracking state
JP2020065125A (ja) Display device
JPWO2019093500A1 (ja) Display device
JP2020042154A (ja) Head-up display device
WO2020189258A1 (fr) Display device, head-up display device, and head-mounted display
JP2021091357A (ja) Display system and display processing method
JPWO2019138914A1 (ja) Virtual image display device and head-up display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18776383

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019509649

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18776383

Country of ref document: EP

Kind code of ref document: A1