WO2010070920A1 - Vehicle surroundings image generation device - Google Patents
Vehicle surroundings image generation device
- Publication number
- WO2010070920A1 (PCT/JP2009/007009)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- parking frame
- vehicle
- target parking
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/27—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/002—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle
- B60Q9/004—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors
- B60Q9/005—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for parking purposes, e.g. for warning the driver that his vehicle has contacted or is about to contact an obstacle using wave sensors using a video camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/102—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using 360 degree surveillance camera system
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/303—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/304—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
- B60R2300/305—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images merging camera image with lines or icons
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/806—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking
Definitions
- the present invention relates to a vehicle surrounding image generation device that is configured to be connectable to a plurality of in-vehicle cameras and that generates an image that allows a vehicle driver to check a situation around the vehicle.
- This parking assistance device uses vehicle-mounted cameras to create an overhead image showing the situation around the vehicle, in particular the situation of the parking frames as seen when looking down from above the vehicle. More specifically, the parking assist device enables the driver to intuitively specify the target parking frame, that is, the position and direction of the parking frame in which the vehicle is to be parked. A frame graphic is superimposed at the position of the parking frame in the overhead image, and the target parking frame is thereby indicated to the driver (for example, see Patent Document 1).
- an object of the present invention is to provide a vehicle surrounding image generation device that allows the driver to more clearly recognize the situation in the target parking frame.
- the present invention provides a vehicle surrounding image generation device connectable to a plurality of cameras attached to a vehicle, comprising: a parking frame specifying unit that specifies a target parking frame of the vehicle; a selection unit that, based on the specified target parking frame, selects from the plurality of cameras at least one camera suitable for photographing the target parking frame and selects a photographed image from the selected camera; and a drawing unit that, using the photographed image from the selected camera, generates a display image representing the situation of the specified target parking frame.
- the present invention generates a display image representing the situation of the parking frame using the image from the camera selected on the basis of the specified parking frame. Because the display image uses an image from the selected camera, the distortion of three-dimensional objects that may appear in and/or around the specified parking frame can be kept relatively small. In other words, according to the present invention, a display image in which the distortion of three-dimensional objects in the parking frame is reduced is generated; when this image is displayed, the driver can recognize the situation in and/or around the target parking frame more clearly.
- FIG. 1 is a block diagram showing an overall configuration of a vehicle surrounding image display system 1 according to a first embodiment of the present invention.
- A block diagram showing the detailed configuration of the vehicle surrounding image generation device 3 shown in FIG. 1.
- A flowchart showing the flow of processing of the vehicle surrounding image display system 1 shown in FIG. 1.
- FIG. 7 is a schematic diagram showing an example of the optimal captured image ATP obtained in step S309 of FIG. 5.
- FIG. 8A is a schematic diagram illustrating an example of an overhead image LDP used as the display image DP generated in step S310 of FIG. 5, and FIG. 8B is a schematic diagram illustrating an example of an optimal captured image ATP used as the display image DP generated in step S310 of FIG. 5.
- FIG. 9A is a schematic diagram illustrating an example of the positional relationship between the target parking frame TPL and the vehicle ⁇
- FIG. 9B is a schematic diagram illustrating another example of the positional relationship between the target parking frame TPL and the vehicle ⁇ .
- FIG. 11A is a schematic diagram showing a first example of the positional relationship between the target parking frame TPL and each of the cameras 2a to 2d, which serves as the optimal camera selection criterion in step S309.
- FIG. 11B is a schematic diagram showing a second example of the positional relationship between the target parking frame TPL and each of the cameras 2a to 2d, which serves as the optimal camera selection criterion in step S309.
- FIG. 11C is a schematic diagram showing a third example of the positional relationship between the target parking frame TPL and each of the cameras 2a to 2d, which serves as the optimal camera selection criterion in step S309.
- FIG. 11D is a schematic diagram showing a fourth example of the positional relationship between the target parking frame TPL and each of the cameras 2a to 2d, which serves as the optimal camera selection criterion in step S309.
- FIG. 12A is a schematic diagram illustrating a first example of an optimal camera selection method when the target parking frame TPL is present in an overlap region, and FIG. 12B is a schematic diagram illustrating a second example of an optimal camera selection method when the target parking frame TPL is present in an overlap region.
- FIG. 13 is a schematic diagram showing an alternative example of the display image DP generated in step S310 of FIG. 5.
- FIG. 14 is a block diagram showing the overall configuration of a vehicle surrounding image display system 1a according to the second embodiment of the present invention.
- FIG. 17A is a schematic diagram illustrating an example of a display image DP in which an overhead image LDP is used and an obstacle icon OBS is further superimposed.
- FIG. 17B is a schematic diagram illustrating an example of a display image DP in which an optimal captured image ATP is used and an obstacle icon OBS is further superimposed.
- FIG. 16 is a schematic diagram showing a table 351 held by the additional image drawing unit 35 shown in FIG. 15.
- A schematic diagram showing a method of determining the degree of approach when a moving obstacle X exists near the target parking frame TPL.
- FIG. 1 is a block diagram showing an overall configuration of a vehicle surrounding image display system 1 according to the first embodiment of the present invention.
- the vehicle surrounding image display system 1 includes, for example, four cameras 2a, 2b, 2c, and 2d, a vehicle surrounding image generation device 3, and a display device 4.
- the cameras 2a to 2d have a solid-state imaging device such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor), for example.
- the cameras 2a to 2d are digital cameras capable of capturing color or monochrome images. These cameras 2a to 2d preferably have a wide-viewing-angle lens such as a fish-eye lens, so that the surroundings of the vehicle ⁇ (see FIG. 2) can be photographed over a wide range.
- the cameras 2a to 2d are attached around the vehicle ⁇ .
- the camera 2a is attached to the center of the front end of the vehicle ⁇ so that its optical axis points to the front of the vehicle and intersects the road surface at a predetermined depression angle.
- the remaining cameras 2b, 2c, and 2d are attached to the left side, rear center, and right side of the vehicle ⁇ toward the left side, rear, and right side of the vehicle.
- the cameras 2b, 2c, and 2d are also attached so that their optical axes intersect with the road surface at a predetermined depression angle.
- FIG. 2 schematically shows a range captured by the cameras 2a to 2d in the peripheral region of the vehicle ⁇ .
- the camera 2a captures the situation of the imaging range CAMa in front of the vehicle ⁇ , and generates image data indicating the captured image CPa.
- the camera 2b captures an imaging range CAMb on the left side of the vehicle ⁇ and generates image data indicating the captured image CPb.
- the camera 2c captures an imaging range CAMc behind the vehicle ⁇ and generates image data indicating the captured image CPc.
- the camera 2d captures an imaging range CAMd on the right side of the vehicle ⁇ , and generates image data indicating the captured image CPd.
- Image data indicating each of the captured images CPa to CPd is transmitted to the vehicle surrounding image generation device 3 (see FIG. 1) via a transmission path.
- Hereinafter, "image data indicating an image" that is processed (generated, transmitted, stored in a buffer, etc.) in the vehicle surrounding image display system 1 is simply referred to as an "image" for simplicity of description.
- the imaging range CAMa overlaps the imaging range CAMb. Similarly, the imaging ranges CAMb and CAMc, CAMc and CAMd, and CAMd and CAMa overlap one another.
- an overlapping portion of the imaging ranges CAMa and CAMb is referred to as an overlap region Fa.
- the overlapping portion of the imaging ranges CAMb and CAMc, the overlapping portion of the imaging ranges CAMc and CAMd, and the overlapping portion of the imaging ranges CAMd and CAMa are referred to as overlap regions Fb, Fc, and Fd, respectively.
- It is desirable that camera parameters representing the positions and orientations of the cameras 2a to 2d and the distortion of their lenses are acquired in advance by actual measurement or the like and held in the vehicle surrounding image generation device 3. Alternatively, these camera parameters may be calculated using a calibration target (marker).
- the vehicle surrounding image generation device 3 is an ECU (Electronic Control Unit), for example, and includes a processor, a nonvolatile memory, and the like. As shown in FIG. 4, the vehicle surrounding image generation device 3 includes the same number of input buffers 31a, 31b, 31c, and 31d as the number of cameras, an overhead image generation unit 32, a parking frame specification unit 33, and a captured image selection unit. 34, an additional image drawing unit 35, and an output buffer 36.
- In the present embodiment, it is assumed that the overhead image generation unit 32, the parking frame specification unit 33, the captured image selection unit 34, and the additional image drawing unit 35 are realized by a processor that executes software stored in advance in the nonvolatile memory. The vehicle surrounding image generation device 3 generates a display image DP that is output by the display device 4 at the subsequent stage.
- a display device 4 is also connected to the vehicle surrounding image generation device 3 via a transmission path.
- the display device 4 is, for example, a liquid crystal display, and displays the display image DP generated by the vehicle surrounding image generation device 3.
- the parking frame specifying unit 33 accepts a parking frame detection start trigger (FIG. 5; step S301). Examples of the start trigger include shift position information and operation information.
- the shift position information is information indicating the shift position of the vehicle. For example, when the shift position information indicates a predetermined shift position (for example, reverse), the parking frame specifying unit 33 determines that a start trigger has been received.
- the operation information is information indicating that a physical button (not shown) or a graphical button (not shown) provided in the vehicle surrounding image display system 1 has been operated by the driver. The parking frame specifying unit 33 determines that a start trigger has been received when, for example, operation information indicating that a parking frame detection button, which is one of these buttons, has been pressed is input.
- the cameras 2a to 2d are connected to input buffers 31a to 31d as shown in FIG.
- the captured images CPa to CPd from the cameras 2a to 2d are periodically stored in the input buffers 31a to 31d, at the latest from the time the parking frame specifying unit 33 receives a start trigger.
- the parking frame specifying unit 33 acquires the captured images CPa to CPd stored in the input buffers 31a to 31d (step S302).
- the parking frame specifying unit 33 performs target parking frame detection processing (step S304).
- As the target parking frame detection process, the method disclosed in Patent Document 1 (Japanese Patent Laid-Open No. 2007-230371) described above can be used.
- the parking frame specifying unit 33 performs well-known image processing on the captured images CPa to CPd acquired in step S302 to detect white lines on the road surface from the captured images CPa to CPd.
- the parking frame specifying unit 33 detects an area surrounded by a white line or an area sandwiched by white lines as a parking area (hereinafter referred to as a parking frame).
- Alternatively, the parking frame identification unit 33 may detect the target parking frame using an active sensor such as a radar, or from a white line region detected by a near-infrared camera that emits near-infrared light.
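The pairing step described above (treating an area sandwiched by white lines as a parking frame) can be sketched as follows. The line-segment representation, the width range, and the angle tolerance are illustrative assumptions, not values taken from the patent.

```python
from dataclasses import dataclass
import math

@dataclass
class Segment:
    # A detected white-line segment on the road surface, expressed in
    # world coordinates: endpoints (x0, y0) and (x1, y1) in metres.
    x0: float
    y0: float
    x1: float
    y1: float

    def direction(self) -> float:
        return math.atan2(self.y1 - self.y0, self.x1 - self.x0)

    def midpoint(self):
        return ((self.x0 + self.x1) / 2, (self.y0 + self.y1) / 2)

def pair_parking_frames(segments, min_w=2.0, max_w=3.5, max_angle=0.1):
    """Pair roughly parallel white-line segments whose spacing matches a
    typical parking-frame width (the 2.0-3.5 m range and 0.1 rad tolerance
    are assumed heuristics)."""
    frames = []
    for i, a in enumerate(segments):
        for b in segments[i + 1:]:
            # Parallelism: directions must agree within max_angle radians.
            da = (a.direction() - b.direction()) % math.pi
            if min(da, math.pi - da) > max_angle:
                continue
            # Spacing: distance between midpoints, measured perpendicular
            # to the shared line direction.
            (ax, ay), (bx, by) = a.midpoint(), b.midpoint()
            ux, uy = math.cos(a.direction()), math.sin(a.direction())
            perp = abs((bx - ax) * -uy + (by - ay) * ux)
            if min_w <= perp <= max_w:
                frames.append((a, b))
    return frames
```

A pair that passes both tests is a candidate parking frame; the candidate closest to the vehicle (or the one chosen by the driver) would then become the target parking frame.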
- Following step S304, the parking frame specifying unit 33 determines whether or not the target parking frame has been successfully detected (step S305).
- If a negative determination is made in step S305, the process flow returns to step S302; the parking frame specifying unit 33 acquires the updated captured images CPa to CPd and performs step S304 again.
- If the determination in step S305 is affirmative, the parking frame specifying unit 33 calculates the relative positional relationship between the host vehicle (that is, the vehicle on which the vehicle surrounding image display system 1 is mounted) and the target parking frame (step S306). Specifically, the relative positional relationship is calculated from the detected world coordinates of the target parking frame and the world coordinates of the host vehicle. For example, the imaging ranges CAMa to CAMd (see FIG. 2) that can be captured by the respective cameras are calculated in advance on the basis of the camera parameters described above, and the calculated values are stored in the aforementioned nonvolatile memory.
- a relative positional relationship of the target parking frame with respect to the host vehicle is calculated.
- As the relative positional relationship, the distance and direction of the target parking frame with respect to the host vehicle, and the world coordinate values of both, are typical.
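A minimal sketch of computing such a relative positional relationship from world coordinates, assuming the vehicle pose is given as a position plus a heading angle (a representation chosen for illustration; the patent only names distance, direction, and world coordinates as typical forms):

```python
import math

def relative_position(vehicle_xy, vehicle_heading, frame_xy):
    """Distance and direction of the target parking frame relative to the
    host vehicle, from the world coordinates of both."""
    dx = frame_xy[0] - vehicle_xy[0]
    dy = frame_xy[1] - vehicle_xy[1]
    distance = math.hypot(dx, dy)
    # Bearing of the frame measured from the vehicle's heading,
    # normalised to [-pi, pi).
    bearing = math.atan2(dy, dx) - vehicle_heading
    bearing = (bearing + math.pi) % (2 * math.pi) - math.pi
    return distance, bearing
```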
- Following step S306, the parking frame specifying unit 33 determines whether a bird's-eye view image is suitable as the image to be generated by the vehicle surrounding image generation device 3 this time (step S307).
- If the determination in step S307 is negative, a captured image is used instead.
- For example, the distance from the host vehicle to the target parking frame is used as the determination criterion in step S307: the system is programmed in advance so that if this distance is less than a predetermined threshold value, that is, if the target parking frame is relatively close, the bird's-eye view image is judged suitable, and if the distance is long, the camera image is judged suitable.
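The decision in step S307 reduces to a simple threshold comparison. The threshold value below is an assumption; the patent only says a predetermined threshold is programmed in advance.

```python
# Assumed threshold in metres; the patent leaves the concrete value open.
NEAR_THRESHOLD_M = 5.0

def choose_view(distance_to_frame_m: float) -> str:
    """Step S307 as described: bird's-eye view when the target parking
    frame is relatively close, camera image when it is far."""
    if distance_to_frame_m < NEAR_THRESHOLD_M:
        return "overhead"   # generate bird's-eye view image (step S308)
    return "camera"         # select optimal captured image (step S309)
```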
- the parking frame specifying unit 33 gives an instruction to generate an overhead image to the overhead image generation unit 32 in step S308.
- the overhead image generation unit 32 acquires the captured images CPa to CPd stored in the input buffers 31a to 31d, and performs geometric conversion on the acquired captured images CPa to CPd.
- Thereby, an overhead image representing the situation when looking down on the surroundings of the host vehicle from a virtual viewpoint preset above the host vehicle is generated (step S308). Since the processing for generating the bird's-eye view image is well known, its detailed description is omitted.
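The well-known conversion the patent refers to is inverse perspective mapping. The following is a minimal sketch under a pinhole model with assumed parameter conventions; a real implementation would additionally correct the fish-eye lens distortion mentioned earlier and blend the four camera images.

```python
import numpy as np

def overhead_from_camera(img, K, R, t, scale=0.01, out_hw=(200, 200)):
    """Sample one camera image onto the road plane (z = 0).
    K is the 3x3 intrinsic matrix; R, t map world points into camera
    coordinates; `scale` is metres per output pixel. These conventions
    are assumptions for this sketch."""
    h_out, w_out = out_hw
    out = np.zeros((h_out, w_out) + img.shape[2:], dtype=img.dtype)
    for v in range(h_out):
        for u in range(w_out):
            # Ground-plane point (x, y, 0) seen at output pixel (u, v),
            # with the virtual viewpoint centred above the origin.
            world = np.array([(u - w_out / 2) * scale,
                              (v - h_out / 2) * scale, 0.0])
            cam = R @ world + t
            if cam[2] <= 0:              # point behind the camera
                continue
            px = K @ (cam / cam[2])      # pinhole projection
            x = int(round(float(px[0])))
            y = int(round(float(px[1])))
            if 0 <= x < img.shape[1] and 0 <= y < img.shape[0]:
                out[v, u] = img[y, x]
    return out
```

Each output pixel is traced back to the ground plane and then projected into the source camera, which is why tall objects smear outward in the result, as discussed with FIG. 10 below.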
- FIG. 6 is a schematic diagram showing an example of the overhead view image generated in step S308.
- In the bird's-eye view image LDP, a model image M⁇ imitating the host vehicle is synthesized at a predetermined position. Around the model image M⁇, a road surface RS (see the hatched portion) and white lines WL indicating several parking frames PL are drawn.
- one parking frame PL is virtually shown by a one-dot chain line, and one other vehicle Va is also shown.
- the overhead image as described above is passed to the additional image drawing unit 35.
- In step S309, based on the relative positional relationship of the target parking frame with respect to the host vehicle (calculated in step S306), the parking frame specifying unit 33 selects, from the cameras 2a to 2d, the camera closest to the target parking frame successfully detected in step S304 as the optimal camera. The parking frame specifying unit 33 then passes information indicating the optimal camera to the captured image selection unit 34. In response, the captured image selection unit 34 acquires, as the optimal captured image, the captured image stored in the input buffer (one of the input buffers 31a to 31d) connected to the optimal camera (one of the cameras 2a to 2d). The captured image selection unit 34 passes the acquired optimal captured image to the additional image drawing unit 35 (step S309).
- FIG. 7 is a schematic diagram showing an example of the optimum photographed image selected in step S309.
- the optimal captured image ATP is an image captured from the viewpoint of the optimal camera; at least the road surface RS (see the hatched portion), the white line WL indicating the target parking frame TPL, and a part of the host vehicle ⁇ are drawn in it.
- the target parking frame TPL is virtually drawn with a one-dot chain line.
- In step S310, the parking frame specifying unit 33 instructs the additional image drawing unit 35 to draw various predetermined additional images.
- At this time, the relative positional relationship of the target parking frame TPL is also passed.
- In response, the additional image drawing unit 35 superimposes the additional images on the input overhead image LDP or optimal captured image ATP and generates a display image DP representing at least the situation of the target parking frame TPL (step S310).
- As the additional images, predetermined ones among the target parking frame image FP, the mask image MKP (see the hatched portion), the icon ICN, and the graphical operation button GB are used.
- the target parking frame image FP is, for example, a rectangular frame graphic for indicating the target parking frame TPL.
- the superimposed position is specified by the relative positional relationship sent from the parking frame specifying unit 33 and the camera parameters stored in advance.
- the mask image MKP is an image for covering an unnecessary part (for example, a part that is not desired to be displayed) in the overhead view image or the optimum captured image.
- the icon ICN is a graphic indicating in which direction the target parking frame TPL is with respect to the host vehicle.
- the operation button GB is operated by the driver; various functions are assigned to such buttons in advance, for example re-detecting the parking frame.
- The mask image MKP, the icon ICN, and the operation button GB are superimposed at predetermined positions on the input overhead image or the input optimal captured image.
- the overhead image or the optimal captured image on which various additional images are superimposed is output to the output buffer 36 (see FIG. 4) as the display image DP.
- the display device 4 receives and displays the display image DP from the output buffer 36 (step S311).
- the processing of the vehicle surrounding image display system 1 shown in FIG. 1 is repeatedly performed.
- 9A and 9B are schematic diagrams showing the positional relationship between the target parking frame and the host vehicle.
- 9A and 9B show an actual target parking frame TPL, the host vehicle ⁇ , and a drawing area LDA (see an area surrounded by a two-dot chain line) drawn as an overhead image.
- In the vehicle surrounding display system 1, a drawing area LDA having a predetermined size is set from the world coordinate positions of the target parking frame TPL and the host vehicle ⁇. When a part or the whole of the target parking frame TPL is included in the drawing area LDA, as shown in FIG. 9A, the parking frame specifying unit 33 determines in step S307 that an overhead image should be generated. On the other hand, when the drawing area LDA does not include any part of the target parking frame TPL, as shown in FIG. 9B, the parking frame specifying unit 33 determines in step S307 that the optimal captured image should be selected.
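The inclusion test above can be sketched as follows, modelling the drawing area LDA as an axis-aligned rectangle in world coordinates (an assumption; the patent only says the LDA has a predetermined size around the host vehicle).

```python
def frame_in_drawing_area(frame_corners, lda_min, lda_max):
    """True if any corner of the target parking frame TPL lies inside the
    drawing area LDA. A corner test covers the common cases; a complete
    test would also check for edges crossing the rectangle without any
    corner falling inside it."""
    (xmin, ymin), (xmax, ymax) = lda_min, lda_max
    return any(xmin <= x <= xmax and ymin <= y <= ymax
               for x, y in frame_corners)
```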
- FIG. 10 is a schematic diagram showing, as an example of such distortion, the apparent falling-over of a three-dimensional object MMS, whose extent is correlated with the distance from the camera 2b.
- the three-dimensional object MMS is drawn in a state of being projected on the road surface RS.
- The mounting height of the camera 2b itself is essentially unchanged. Accordingly, the distortion changes depending on the positional relationship between the three-dimensional object MMS and the camera 2b, and the distortion (length) Lf of the three-dimensional object MMS increases as the distance between the two increases.
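This growth of Lf with distance can be quantified under a pinhole model (an assumption consistent with the behavior described; the patent gives no formula). By similar triangles, a vertical object of height h standing at ground distance d from a camera mounted at height H projects onto the road plane with length Lf = d·h / (H − h):

```python
def projected_length(h, d, camera_height):
    """Length Lf of the 'fallen-over' image of a vertical object of
    height h at ground distance d from a camera at height H, projected
    onto the road plane. The ray through the object's top meets the
    ground at distance d * H / (H - h), so Lf = d * h / (H - h)."""
    H = camera_height
    if not 0 < h < H:
        raise ValueError("model assumes 0 < h < camera height")
    return d * h / (H - h)
```

Since Lf is proportional to d, distant objects smear much more than nearby ones, which is why the overhead image is acceptable only when the target parking frame is close.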
- Conversely, when the target parking frame TPL is close, even if the overhead image is displayed on the display device 4, three-dimensional objects that may exist around the target parking frame TPL are drawn in the overhead image with only small distortion. From the above viewpoint, when the target parking frame TPL exists nearby, the overhead image LDP is used for the display image DP, as shown in FIG. 8A; conversely, when the target parking frame TPL exists far away, the optimal captured image ATP is used for the display image DP, as shown in FIG. 8B.
- FIG. 11A to FIG. 11D are schematic diagrams showing selection criteria for the optimal captured image based on the positional relationship between the target parking frame TPL and each of the cameras 2a to 2d.
- The determination of which camera's captured image to use is made as follows. The parking frame specifying unit 33 holds in advance, as camera parameters, the world coordinate values that define the imaging ranges CAMa to CAMd (see FIG. 2) of the cameras 2a to 2d, and the world coordinate value of the target parking frame TPL is obtained in step S305. The parking frame specifying unit 33 therefore determines which of the imaging ranges CAMa to CAMd the target parking frame TPL falls in, and selects a camera whose imaging range includes the target parking frame TPL as the optimal camera.
- the cameras 2a to 2d are installed with respect to the vehicle ⁇ so that the imaging ranges of adjacent cameras overlap in the overlap areas Fa to Fd.
- However, a part or the whole of the target parking frame TPL may be included in one of these overlap areas Fa to Fd. In that case, the parking frame specifying unit 33 selects the optimal camera as follows.
- In the example of FIG. 12A, the target parking frame TPL (see the dotted portion) is included in the overlap region Fa (see the lattice-shaped hatched portion), so the parking frame specifying unit 33 selects one of the cameras 2a and 2b as the optimal camera.
- the following two methods are exemplified as the selection method.
- In the first method, as shown in FIG. 12A, the distances Da and Db between the point of the target parking frame TPL closest to the vehicle ⁇ and each of the two candidate cameras (here, the cameras 2a and 2b) are obtained, and the camera with the smaller of the two distances Da and Db is selected as the optimal camera.
- In the second method, as shown in FIG. 12B, first, the point of the target parking frame TPL closest to the vehicle ⁇ and the two lines La and Lb connecting that point with the cameras 2a and 2b are obtained. Next, the angle θa formed by the line La and the optical axis Xa of the camera 2a and the angle θb formed by the line Lb and the optical axis Xb of the camera 2b are obtained. The camera with the smaller of the two angles θa and θb is selected as the optimal camera.
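The two selection methods above can be sketched as follows. The camera representation (a world position plus an optical-axis angle in radians) is an assumption made for illustration.

```python
import math

def select_optimal_camera(frame_point, cameras, method="distance"):
    """Choose between candidate cameras whose imaging ranges overlap on
    the target parking frame. `frame_point` is the point of the TPL
    closest to the vehicle; each camera is a dict with a world 'pos'
    and an optical-axis 'axis' angle (radians)."""
    fx, fy = frame_point

    def distance(cam):
        cx, cy = cam["pos"]
        return math.hypot(fx - cx, fy - cy)       # Da, Db in FIG. 12A

    def axis_angle(cam):
        cx, cy = cam["pos"]
        line = math.atan2(fy - cy, fx - cx)       # line La / Lb in FIG. 12B
        diff = abs(line - cam["axis"]) % (2 * math.pi)
        return min(diff, 2 * math.pi - diff)      # angle θa / θb

    key = distance if method == "distance" else axis_angle
    return min(cameras, key=key)
```

The angle criterion favors the camera that sees the frame closest to its image center, where wide-angle lens distortion is smallest; the distance criterion favors the camera with the highest resolution on the frame.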
- As described above, the parking frame specifying unit 33 specifies the target parking frame TPL and then obtains the relative positional relationship of the target parking frame TPL with respect to the host vehicle ⁇. When the target parking frame TPL is relatively far from the host vehicle ⁇, the parking frame specifying unit 33 further selects the camera most suitable for showing the specified target parking frame TPL to the driver, and a display image is created using a captured image from the selected optimal camera. As described with reference to FIG. 10, the captured image displayed on the display device 4 exhibits less distortion of three-dimensional objects around the target parking frame TPL than the overhead image. Thus, according to the vehicle surrounding image display system 1, which selects the optimal camera in accordance with the positional relationship with the target parking frame TPL, the driver can recognize the situation in and around the target parking frame more clearly.
- In the above description, the display image is created using the captured image itself. However, the present invention is not limited to this; the viewpoint of one or more captured images may be converted so that, for example, the target parking frame TPL is viewed from the driver's viewpoint, and the converted image may be used as the display image.
- Also in the above description, the optimal captured image ATP is used for the display image DP when the target parking frame TPL exists far from the host vehicle ⁇, and the overhead image LDP is used as the display image DP when the target parking frame TPL exists nearby.
- However, the present invention is not limited to this: even when the target parking frame TPL exists in the distance, a composite image of both the overhead image LDP and the optimum captured image ATP, as shown in FIG. 13, may be created as the display image DP.
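Creating such a composite display amounts to laying out the overhead image LDP and the optimum captured image ATP together in one display image DP. A trivial grayscale sketch follows; the image sizes and pixel values are arbitrary placeholders, not dimensions from this disclosure:

```python
def compose_display(overhead, captured):
    """Place the overhead image LDP (left) and the optimum captured image ATP
    (right) side by side; rows are padded with black (0) where heights differ."""
    h = max(len(overhead), len(captured))
    w_left, w_right = len(overhead[0]), len(captured[0])
    dp = []
    for y in range(h):
        left = overhead[y] if y < len(overhead) else [0] * w_left
        right = captured[y] if y < len(captured) else [0] * w_right
        dp.append(left + right)  # concatenate the two rows into one display row
    return dp

ldp = [[50] * 80 for _ in range(120)]   # dummy 120x80 overhead image
atp = [[200] * 160 for _ in range(90)]  # dummy 90x160 captured image
dp = compose_display(ldp, atp)
print(len(dp), len(dp[0]))  # prints: 120 240
```

A production system would instead blit both images into a fixed display buffer, but the layout logic is the same.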
- FIG. 14 is a block diagram showing an overall configuration of a vehicle surrounding image display system 1a according to the second embodiment of the present invention.
- The vehicle surrounding image display system 1a of FIG. 14 differs from the vehicle surrounding image display system 1 shown in FIG. 1 in that the vehicle surrounding image generation device 3 is replaced with the vehicle surrounding image generation device 3a. Since the other configurations are the same, the components in FIG. 14 corresponding to those in FIG. 1 are denoted by the same reference numerals below, and detailed descriptions thereof are omitted.
- FIG. 15 is a block diagram showing a detailed configuration of the vehicle surrounding image generation device 3a shown in FIG. 14.
- In terms of functional blocks, the vehicle surrounding image generation device 3a of FIG. 15 differs from the vehicle surrounding image generation device 3 in that an obstacle detection unit 37 is added. Since the other configurations are the same, the components in FIG. 15 corresponding to those in FIG. 4 are denoted by the same reference numerals below, and detailed descriptions thereof are omitted.
- the obstacle detection unit 37 is realized by, for example, a processor that executes software stored in advance in the nonvolatile memory.
- The obstacle detection unit 37 receives the captured images CPa to CPd from the input buffers 31a to 31d, and the relative positional relationship from the parking frame specifying unit 33. It then detects any obstacle (three-dimensional object) that may exist around the target parking frame TPL. When an obstacle is successfully detected, the obstacle detection unit 37 passes the world coordinate value at which the detected obstacle exists to the additional image drawing unit 35.
- Compared with the flowchart of FIG. 5, steps S601, S602, and S603 are added in place of step S310. Since the other steps are the same, the steps corresponding to those in FIG. 5 are given the same step numbers below, and detailed descriptions thereof are omitted.
- First, the obstacle detection unit 37 performs obstacle detection (step S601).
- An obstacle is either a stationary object, such as a wall in the parking lot, an already parked vehicle, or a pillar, or a moving object, such as a walking person, another vehicle, or a bicycle.
- the obstacle detection unit 37 holds the captured images CPa to CPd obtained from the input buffers 31a to 31d for the past several frames.
- The obstacle detection unit 37 detects a moving object by computing a flow indicating movement (a known optical-flow technique) over the past several frames of the captured image CPa.
- The obstacle detection unit 37 also detects a stationary object by applying the stereo principle, such as motion stereo, to two temporally separated frames of the captured image CPa.
- Moving objects and stationary objects are detected in the same manner for the other captured images CPb to CPd.
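The moving-object step can be illustrated with a toy frame-differencing sketch. The disclosure relies on a known optical-flow technique; this simplified stand-in only shows the idea of comparing successive frames of a captured image such as CPa (all frame sizes, pixel values, and thresholds below are assumptions):

```python
def detect_motion(frames, diff_threshold=25, min_pixels=4):
    """Flag pixels that changed between the two most recent frames and
    return a coarse bounding box (x0, y0, x1, y1), or None if nothing moved.
    A real implementation would compute optical flow over several frames."""
    prev, curr = frames[-2], frames[-1]
    moving = [(x, y)
              for y, (row_p, row_c) in enumerate(zip(prev, curr))
              for x, (p, c) in enumerate(zip(row_p, row_c))
              if abs(c - p) > diff_threshold]
    if len(moving) < min_pixels:
        return None  # too few changed pixels: treat as noise
    xs = [x for x, _ in moving]
    ys = [y for _, y in moving]
    return (min(xs), min(ys), max(xs), max(ys))

# Two synthetic 8x8 frames: a bright 2x2 "object" shifts one pixel to the right.
f0 = [[0] * 8 for _ in range(8)]
f1 = [[0] * 8 for _ in range(8)]
for y in (3, 4):
    for x in (2, 3):
        f0[y][x] = 200
    for x in (3, 4):
        f1[y][x] = 200
print(detect_motion([f0, f1]))  # prints: (2, 3, 4, 4)
```

The bounding box would then be mapped to a world coordinate value, as the obstacle detection unit 37 does before handing the result to the additional image drawing unit 35.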
- To reduce the processing load, the obstacle detection unit 37 preferably queries the parking frame specifying unit 33 for which camera was selected as the optimum camera, and performs obstacle detection only on the optimum captured image obtained from that camera.
- Next, the obstacle detection unit 37 obtains a world coordinate value that specifies the position of the detected obstacle. If the obstacle is a moving object, the obstacle detection unit 37 determines, from the obtained world coordinate value and information on the moving direction of the moving object, whether the object is approaching the target parking frame TPL detected in step S304 or the host vehicle. This determination method will be described in detail later. The obstacle detection unit 37 also obtains the moving speed of the moving object.
- The obstacle detection unit 37 then passes the information indicating the world coordinate value, moving direction, and moving speed of the obstacle obtained as described above to the additional image drawing unit 35. The determination process in step S602 thereby ends.
- Based on the information received from the obstacle detection unit 37, the additional image drawing unit 35 superimposes an obstacle icon at the position corresponding to the obstacle in the overhead image LDP or the optimum captured image ATP of the display image DP (step S603).
- The obstacle icon indicates that a stationary object is present near the target parking frame TPL, or that a moving object is approaching the target parking frame TPL or the future path of the host vehicle.
- FIG. 17A is a schematic diagram showing an example of the display image DP that uses the overhead image LDP with the obstacle icon OBS superimposed, and FIG. 17B is a schematic diagram showing an example of the display image DP that uses the optimum captured image ATP with the obstacle icon OBS superimposed. FIGS. 17A and 17B differ from the display images DP shown in FIGS. 8A and 8B only in that the obstacle icon OBS is superimposed, and thus a detailed description is omitted.
- The bird's-eye view image LDP depicts only the portion of the peripheral region of the target parking frame TPL at a short distance from the host vehicle. It is therefore preferable that the obstacle icon OBS be drawn on the display image DP in some form even when the obstacle lies outside the peripheral region of the target parking frame TPL or at a long distance from the host vehicle; for example, the obstacle icon OBS can be drawn on the mask image MKP as shown in FIG. 17A.
- the additional image drawing unit 35 may hold a table 351 as shown in FIG. 18 in the aforementioned nonvolatile memory.
- The table 351 describes, for each degree of approach, an information set consisting of the color of the icon OBS, the moving speed range of the obstacle, and the distance range from the host vehicle to the obstacle:

| Degree of approach | Color of icon OBS | Moving speed range Vth | Distance range Dth |
|---|---|---|---|
| Level 1 | Red (alerts the driver most strongly) | V2 ≤ Vth < V3 | 0 ≤ Dth < D1 |
| Level 2 | Yellow (alerts the driver moderately) | V1 ≤ Vth < V2 | D1 ≤ Dth < D2 |
| Level 3 | Blue (alerts the driver most gently) | 0 ≤ Vth < V1 | D2 ≤ Dth < D3 |
- The additional image drawing unit 35 preferably changes the color of the obstacle icon OBS described above based on the information received from the obstacle detection unit 37. The color of the target parking frame image FP may also be changed to the same color as the obstacle icon OBS.
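The lookup that table 351 performs can be sketched as follows. The threshold values V1 to V3 and D1 to D3 are illustrative placeholders, since the disclosure leaves their concrete values unspecified:

```python
# Illustrative thresholds (speed in m/s, distance in m); not values from the patent.
V1, V2, V3 = 1.0, 3.0, 6.0
D1, D2, D3 = 2.0, 5.0, 10.0

# One row per degree of approach, mirroring table 351:
# (icon color, moving-speed-range test, distance-range test).
TABLE_351 = [
    ("red",    lambda v: V2 <= v < V3, lambda d: 0 <= d < D1),   # level 1
    ("yellow", lambda v: V1 <= v < V2, lambda d: D1 <= d < D2),  # level 2
    ("blue",   lambda v: 0 <= v < V1,  lambda d: D2 <= d < D3),  # level 3
]

def icon_color(speed=None, distance=None):
    """Return the icon color for a moving object (classified by speed)
    or a stationary object (classified by distance), or None if no row matches."""
    for color, speed_ok, dist_ok in TABLE_351:
        if speed is not None and speed_ok(speed):
            return color
        if distance is not None and dist_ok(distance):
            return color
    return None

print(icon_color(speed=4.0))     # level 1 speed range: prints "red"
print(icon_color(distance=6.0))  # level 3 distance range: prints "blue"
```

The same color value can then be reused for the target parking frame image FP, as the text above suggests.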
- FIG. 19 is a schematic diagram illustrating a method for determining the degree of approach when there is an obstacle X that moves near the target parking frame TPL.
- FIG. 19 shows the host vehicle and a vector V indicating the moving speed of the obstacle X.
- As described above, the obstacle detection unit 37 also calculates the vector V indicating the moving speed and moving direction of the obstacle X. From this information, the additional image drawing unit 35 can first determine whether or not the obstacle X is approaching the target parking frame TPL.
- Next, the additional image drawing unit 35 determines the color of the obstacle icon OBS. If the current moving speed of the obstacle X falls within the level 1 moving speed range shown in FIG. 18, the color of the obstacle icon OBS is determined to be red; if it falls within the level 2 range, yellow; and if it falls within the level 3 range, blue. A display image DP such as shown in FIGS. 17A and 17B is then created using the determined color.
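The approach test using the vector V can be sketched as a dot product between V and the direction from the obstacle X toward the target parking frame TPL. This is a minimal 2-D illustration with assumed coordinates, not the patent's exact computation:

```python
def is_approaching(obstacle_pos, velocity, target_pos, eps=1e-9):
    """True if the obstacle's velocity vector V has a positive component
    along the direction toward the target (e.g. the target parking frame TPL)."""
    to_target = (target_pos[0] - obstacle_pos[0],
                 target_pos[1] - obstacle_pos[1])
    # Positive dot product: V points (at least partly) toward the target.
    dot = velocity[0] * to_target[0] + velocity[1] * to_target[1]
    return dot > eps

# Obstacle X at (5, 0) in assumed world coordinates; frame TPL at the origin.
print(is_approaching((5.0, 0.0), (-1.0, 0.0), (0.0, 0.0)))  # moving toward: True
print(is_approaching((5.0, 0.0), (1.0, 0.0), (0.0, 0.0)))   # moving away: False
```

Combined with the speed magnitude, this yields both inputs the additional image drawing unit 35 needs: whether to warn, and at which level.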
- For a stationary object, the additional image drawing unit 35 similarly calculates the distance between the position of the detected stationary object and the target parking frame TPL. The color of the obstacle icon OBS is then determined according to the table 351 of FIG. 18 and superimposed at the corresponding position in the overhead image LDP or the optimum captured image ATP of the display image DP. In this case as well, the target parking frame image FP may be drawn in the same color as the obstacle icon OBS.
- Obstacles may also be detected using an active sensor that emits radar waves or near-infrared rays, such as a radar sensor or a near-infrared camera.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Mechanical Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Transportation (AREA)
- Human Computer Interaction (AREA)
- Closed-Circuit Television Systems (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
Abstract
The present invention relates to a device for generating an image of the surroundings of a vehicle, the device allowing a driver to recognize more clearly the situation inside a target parking frame. A device (3) for generating an image of the surroundings of a vehicle can be connected to a plurality of cameras mounted on the vehicle, and comprises a parking frame specifying unit (33) for specifying a target parking frame for the vehicle; a captured image selection unit (34) for selecting, from among the plurality of cameras, at least one camera suitable for capturing an image of the target parking frame on the basis of the specified target parking frame, and for selecting an image captured by the selected camera; and an additional image drawing unit (35) for generating a display image, representing the situation of the specified target parking frame, using the image captured by the selected camera.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2008-323706 | 2008-12-19 | ||
| JP2008323706A JP2012040883A (ja) | 2008-12-19 | 2008-12-19 | 車両周囲画像生成装置 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2010070920A1 true WO2010070920A1 (fr) | 2010-06-24 |
Family
ID=42268599
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2009/007009 Ceased WO2010070920A1 (fr) | 2008-12-19 | 2009-12-18 | Dispositif de génération d'image des environs d'un véhicule |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2012040883A (fr) |
| WO (1) | WO2010070920A1 (fr) |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2017077650A1 (fr) * | 2015-11-06 | 2017-05-11 | 三菱電機株式会社 | Appareil de traitement d'image, procédé de traitement d'image et programme de traitement d'image |
| CN108140311A (zh) * | 2015-10-22 | 2018-06-08 | 日产自动车株式会社 | 停车辅助信息的显示方法及停车辅助装置 |
| EP3372456A4 (fr) * | 2015-11-02 | 2019-07-10 | Clarion Co., Ltd. | Dispositif de détection d'obstacle |
| US11064151B2 (en) * | 2016-04-26 | 2021-07-13 | Denso Corporation | Display control apparatus |
| US11795340B2 (en) | 2020-09-02 | 2023-10-24 | Terragene S.A. | Compositon sensitive to UV-C radiation and UV-C sterilization or disinfection dosimeter |
Families Citing this family (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR101417399B1 (ko) * | 2012-11-09 | 2014-07-08 | 현대자동차주식회사 | 차량의 주차 위치 표시 장치 및 방법 |
| JP6148848B2 (ja) * | 2012-11-27 | 2017-06-14 | クラリオン株式会社 | 車載画像処理装置 |
| JP2016060312A (ja) * | 2014-09-17 | 2016-04-25 | アルパイン株式会社 | 駐車支援装置 |
| KR102288952B1 (ko) * | 2014-12-18 | 2021-08-12 | 현대모비스 주식회사 | 차량 및 그 제어 방법 |
| KR102288951B1 (ko) * | 2014-12-18 | 2021-08-12 | 현대모비스 주식회사 | 차량 및 그 제어 방법 |
| JP6352797B2 (ja) * | 2014-12-19 | 2018-07-04 | 日立建機株式会社 | 作業機械の周囲監視装置 |
| WO2018105417A1 (fr) | 2016-12-09 | 2018-06-14 | 京セラ株式会社 | Dispositif d'imagerie, dispositif de traitement d'image, système d'affichage et véhicule |
| JP7007438B2 (ja) * | 2020-09-09 | 2022-01-24 | 京セラ株式会社 | 撮像装置、画像処理装置、表示装置、表示システム、および車両 |
| JP7398492B2 (ja) | 2022-03-14 | 2023-12-14 | 本田技研工業株式会社 | 制御装置、制御方法、及び制御プログラム |
Citations (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH11220726A (ja) * | 1998-01-30 | 1999-08-10 | Niles Parts Co Ltd | 車両周囲モニター装置 |
| JP2003149711A (ja) * | 2001-11-08 | 2003-05-21 | Clarion Co Ltd | 車両用モニタ装置 |
| JP2003169323A (ja) * | 2001-11-29 | 2003-06-13 | Clarion Co Ltd | 車両周囲監視装置 |
| JP2003267171A (ja) * | 2002-03-13 | 2003-09-25 | Nissan Motor Co Ltd | 車両後方監視装置 |
| JP2003348574A (ja) * | 2002-05-24 | 2003-12-05 | Nissan Motor Co Ltd | 車両用映像表示装置 |
| JP2004254219A (ja) * | 2003-02-21 | 2004-09-09 | Denso Corp | 車両周辺画像処理装置及びプログラム並びに記録媒体 |
| JP2006027556A (ja) * | 2004-07-21 | 2006-02-02 | Nissan Motor Co Ltd | 車両用周辺監視装置 |
| JP2006131166A (ja) * | 2004-11-09 | 2006-05-25 | Alpine Electronics Inc | 運転支援装置 |
| JP2007098967A (ja) * | 2005-09-30 | 2007-04-19 | Aisin Seiki Co Ltd | 車両周辺監視装置及びセンサユニット |
| JP2007099261A (ja) * | 2005-09-12 | 2007-04-19 | Aisin Aw Co Ltd | 駐車支援方法及び駐車支援装置 |
| JP2007176256A (ja) * | 2005-12-27 | 2007-07-12 | Aisin Aw Co Ltd | 画像表示方法及び運転支援装置 |
| JP2007183877A (ja) * | 2006-01-10 | 2007-07-19 | Nissan Motor Co Ltd | 車両用運転支援装置及び俯瞰映像の表示方法 |
- 2008
  - 2008-12-19 JP JP2008323706A patent/JP2012040883A/ja active Pending
- 2009
  - 2009-12-18 WO PCT/JP2009/007009 patent/WO2010070920A1/fr not_active Ceased
Patent Citations (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH11220726A (ja) * | 1998-01-30 | 1999-08-10 | Niles Parts Co Ltd | 車両周囲モニター装置 |
| JP2003149711A (ja) * | 2001-11-08 | 2003-05-21 | Clarion Co Ltd | 車両用モニタ装置 |
| JP2003169323A (ja) * | 2001-11-29 | 2003-06-13 | Clarion Co Ltd | 車両周囲監視装置 |
| JP2003267171A (ja) * | 2002-03-13 | 2003-09-25 | Nissan Motor Co Ltd | 車両後方監視装置 |
| JP2003348574A (ja) * | 2002-05-24 | 2003-12-05 | Nissan Motor Co Ltd | 車両用映像表示装置 |
| JP2004254219A (ja) * | 2003-02-21 | 2004-09-09 | Denso Corp | 車両周辺画像処理装置及びプログラム並びに記録媒体 |
| JP2006027556A (ja) * | 2004-07-21 | 2006-02-02 | Nissan Motor Co Ltd | 車両用周辺監視装置 |
| JP2006131166A (ja) * | 2004-11-09 | 2006-05-25 | Alpine Electronics Inc | 運転支援装置 |
| JP2007099261A (ja) * | 2005-09-12 | 2007-04-19 | Aisin Aw Co Ltd | 駐車支援方法及び駐車支援装置 |
| JP2007098967A (ja) * | 2005-09-30 | 2007-04-19 | Aisin Seiki Co Ltd | 車両周辺監視装置及びセンサユニット |
| JP2007176256A (ja) * | 2005-12-27 | 2007-07-12 | Aisin Aw Co Ltd | 画像表示方法及び運転支援装置 |
| JP2007183877A (ja) * | 2006-01-10 | 2007-07-19 | Nissan Motor Co Ltd | 車両用運転支援装置及び俯瞰映像の表示方法 |
Cited By (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3367364A4 (fr) * | 2015-10-22 | 2018-10-24 | Nissan Motor Co., Ltd. | Procédé d'affichage d'informations d'aide au stationnement et dispositif d'aide au stationnement |
| CN108140311B (zh) * | 2015-10-22 | 2021-08-24 | 日产自动车株式会社 | 停车辅助信息的显示方法及停车辅助装置 |
| US10366611B2 (en) | 2015-10-22 | 2019-07-30 | Nissan Motor Co., Ltd. | Parking support information display method and parking support device |
| CN108140311A (zh) * | 2015-10-22 | 2018-06-08 | 日产自动车株式会社 | 停车辅助信息的显示方法及停车辅助装置 |
| EP3372456A4 (fr) * | 2015-11-02 | 2019-07-10 | Clarion Co., Ltd. | Dispositif de détection d'obstacle |
| GB2556797B (en) * | 2015-11-06 | 2018-10-24 | Mitsubishi Electric Corp | Image processing apparatus, image processing method, and image processing program |
| WO2017077650A1 (fr) * | 2015-11-06 | 2017-05-11 | 三菱電機株式会社 | Appareil de traitement d'image, procédé de traitement d'image et programme de traitement d'image |
| GB2556797A (en) * | 2015-11-06 | 2018-06-06 | Mitsubishi Electric Corp | Image processing apparatus, image processing method, and image processing program |
| US10417743B2 (en) | 2015-11-06 | 2019-09-17 | Mitsubishi Electric Corporation | Image processing device, image processing method and computer readable medium |
| JPWO2017077650A1 (ja) * | 2015-11-06 | 2017-12-07 | 三菱電機株式会社 | 画像処理装置、画像処理方法及び画像処理プログラム |
| US11064151B2 (en) * | 2016-04-26 | 2021-07-13 | Denso Corporation | Display control apparatus |
| US11750768B2 (en) | 2016-04-26 | 2023-09-05 | Denso Corporation | Display control apparatus |
| US11795340B2 (en) | 2020-09-02 | 2023-10-24 | Terragene S.A. | Compositon sensitive to UV-C radiation and UV-C sterilization or disinfection dosimeter |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2012040883A (ja) | 2012-03-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2010070920A1 (fr) | Dispositif de génération d'image des environs d'un véhicule | |
| JP4969269B2 (ja) | 画像処理装置 | |
| CN103988499B (zh) | 车辆周边监视装置 | |
| JP5503660B2 (ja) | 運転支援表示装置 | |
| US8130270B2 (en) | Vehicle-mounted image capturing apparatus | |
| JP5516998B2 (ja) | 画像生成装置 | |
| JP4744823B2 (ja) | 周辺監視装置および俯瞰画像表示方法 | |
| JP4497133B2 (ja) | 運転支援方法及び運転支援装置 | |
| JP5953824B2 (ja) | 車両用後方視界支援装置及び車両用後方視界支援方法 | |
| JP6014433B2 (ja) | 画像処理装置、画像処理方法、及び、画像処理システム | |
| JP5495071B2 (ja) | 車両周辺監視装置 | |
| JP2009524171A (ja) | 複数の画像を結合して鳥瞰図画像にする方法 | |
| KR20120118073A (ko) | 차량 주변 감시 장치 | |
| JP2008174212A (ja) | 運転支援方法及び運転支援装置 | |
| JP7467402B2 (ja) | 画像処理システム、移動装置、画像処理方法、およびコンピュータプログラム | |
| JP7500527B2 (ja) | 画像処理システム、移動装置、画像処理方法、およびコンピュータプログラム | |
| CN112351242A (zh) | 图像处理装置以及图像处理方法 | |
| WO2016129552A1 (fr) | Dispositif de réglage de paramètre de caméra | |
| CN118413752A (zh) | 图像处理系统、可移动设备、图像处理方法和存储介质 | |
| JP4945315B2 (ja) | 運転支援システム及び車両 | |
| JP2011077806A (ja) | 車両周辺監視装置 | |
| JP2008148114A (ja) | 運転支援装置 | |
| JP2007249814A (ja) | 画像処理装置及び画像処理プログラム | |
| JP5436056B2 (ja) | 障害物検出装置および障害物検出方法 | |
| KR20220097656A (ko) | 운전자 보조 장치, 차량 및 그 제어 방법 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09833227; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 09833227; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: JP |