WO2017110417A1 - Vehicle image acquisition device, control device, vehicle provided with the vehicle image acquisition device or the control device, and vehicle image acquisition method - Google Patents
- Publication number
- WO2017110417A1 (PCT/JP2016/085814; JP2016085814W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- imaging
- target distance
- light
- image acquisition
- region
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/484—Transmitters
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
- G01S17/18—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein range gates are used
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/4868—Controlling received signal intensity or exposure of sensor
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
Definitions
- the present disclosure relates to a vehicle image acquisition device, a control device, a vehicle provided with a vehicle image acquisition device or a control device, and a vehicle image acquisition method.
- Patent Document 1 discloses a vehicular distance image data generation device in which pulsed light is projected in front of the host vehicle at a predetermined period and the reflected light from the target distance is imaged at an imaging timing set according to the target distance; distance image data representing the distance to an object for each pixel is generated based on the luminance of the same pixel in the plurality of captured images obtained with different target distances.
- Patent Document 2 discloses a vehicular visibility status determination apparatus that, in order to determine the presence or absence of fog or the like in the field of view, determines the visibility status outside the vehicle based on the luminance gradient of a non-irradiated region in an image captured by an in-vehicle camera while an illumination device irradiates the outside of the vehicle.
- In such a device, a distance range to be photographed (for example, 0 m to 200 m ahead of the host vehicle) is photographed with a predetermined distance resolution. The desired distance range and distance resolution are determined by the light emission time of the pulsed light, the imaging (exposure) time of the reflected light, and the delay time from the light emission start time to the exposure start time.
- the vehicular visibility situation determination apparatus described in Patent Document 2 only determines the presence or absence of fog, and cannot acquire information on the depth of fog.
- A first object of the present disclosure is to provide a vehicle image acquisition device, a control device, a vehicle equipped with the vehicle image acquisition device or the control device, and a vehicle image acquisition method capable of preventing acquisition of an image of an unnecessary target distance region.
- A second object of the present disclosure is to provide a vehicle image acquisition device, a control device, and a vehicle image acquisition method that can acquire detailed visibility information, particularly in bad weather.
- A third object of the present disclosure is to provide a vehicle image acquisition device, a control device, a vehicle including the vehicle image acquisition device or the control device, and a vehicle image acquisition method capable of imaging a near region and a far region of the host vehicle with equal contrast.
- In order to achieve the first object, a vehicle image acquisition device of the present disclosure includes: a light emitting unit that emits pulsed light in a predetermined direction; an image acquisition unit that captures reflected light returning from the target distance region at an imaging timing set according to the target distance region and acquires a plurality of captured images having different target distance regions; and a timing control unit that controls the light emission period of the pulsed light and the imaging timing. The timing control unit sets the light emission interval time to be longer than the delay time, measured from the light emission start time of the light emitting unit to the imaging start time of the image acquisition unit, required for imaging the longest distance region in which the reflected light can be imaged among the target distance regions.
- the delay time necessary for imaging the longest distance area may be determined from the emission intensity and diffusion angle of the pulsed light and the sensitivity of the image acquisition unit.
- the light emitting unit may decrease the light emission intensity when shooting in the short distance area and increase the light emission intensity when shooting in the long distance area.
- The control device of the present disclosure controls a vehicle image acquisition device including a light emitting unit that emits pulsed light in a predetermined direction and an image acquisition unit that captures reflected light returning from the target distance region at an imaging timing set according to the target distance region and acquires a plurality of captured images having different target distance regions. The control device sets the light emission interval time to be longer than the delay time, measured from the light emission start time of the light emitting unit to the imaging start time of the image acquisition unit, required for imaging the longest distance region in which the reflected light can be imaged among the target distance regions.
- a vehicle according to the present disclosure includes the vehicle image acquisition device or the control device described above.
- The vehicle image acquisition method of the present disclosure is an image acquisition method for a vehicle that acquires a plurality of captured images having different target distance regions by imaging the reflected light of pulsed light emitted in a predetermined direction while changing the imaging timing. The light emission interval time indicating the light emission period of the pulsed light is set to be longer than the delay time, measured from the time when the pulsed light is emitted to the time when the reflected light is imaged, required for imaging the longest distance region in which the reflected light can be imaged among the target distance regions.
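The constraint above can be illustrated with a short numeric sketch: the minimum light emission interval follows from the round-trip time of flight to the longest imageable distance region. This is not from the patent; the 200 m example distance, the function name, and the optional margin are assumptions for illustration.

```python
# Hedged sketch: minimum light emission interval tP, assuming the
# longest imageable region is a known distance ahead of the vehicle.

C = 3.0e8  # speed of light in m/s (approximation used in this document)

def min_emission_interval(max_distance_m, margin_s=0.0):
    """tP must exceed the delay tDz, i.e. the round-trip time of
    flight to the longest distance region; an optional safety margin
    can be added on top."""
    t_dz = 2.0 * max_distance_m / C  # round-trip time of flight
    return t_dz + margin_s

# For a longest region 200 m ahead, tP must exceed about 1.33 microseconds.
tp_min = min_emission_interval(200.0)
```

With tP chosen above this value, the exposure for one target distance region can no longer pick up reflected light from the previous emission.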
- A vehicle image acquisition device of the present disclosure includes: a light emitting unit that emits pulsed light in a predetermined direction; an image acquisition unit that captures reflected light returning from the target distance region at an imaging timing set according to the target distance region and acquires a plurality of captured images having different target distance regions; and a timing control unit that controls the light emission period of the pulsed light and the imaging timing. The image acquisition unit acquires visibility information by determining darkness with respect to each target distance region from the plurality of captured images and measuring the target distance regions that are not visible.
- the darkness may be determined by providing a threshold for the brightness of each captured image.
- With this configuration, the depth of fog can be determined by a simple method.
- the image acquisition unit may be capable of transmitting the visibility information to an integrated control unit that controls driving of the vehicle.
- the field-of-view information acquired when fog or the like occurs can be used for vehicle operation control.
- The control device of the present disclosure controls a vehicle image acquisition device including a light emitting unit that emits pulsed light in a predetermined direction and an image acquisition unit that captures reflected light returning from the target distance region at an imaging timing set according to the target distance region and acquires a plurality of captured images having different target distance regions. The image acquisition unit is controlled so as to acquire visibility information by determining darkness with respect to each target distance region from the plurality of captured images and measuring the target distance regions that are not visible.
- The vehicle of the present disclosure includes: the vehicle image acquisition device or control device described above; and an integrated control unit capable of communicating with the image acquisition unit or the control device. The integrated control unit controls the traveling speed of the vehicle or notifies the driver based on the visibility information.
- the visibility information acquired at the time of occurrence of fog or the like can be used for safe driving or automatic driving of the vehicle.
- The vehicle image acquisition method of the present disclosure is an image acquisition method for a vehicle that acquires a plurality of captured images having different target distance regions by imaging the reflected light of pulsed light emitted in a predetermined direction while changing the imaging timing. Visibility information is acquired by determining darkness with respect to each target distance region from the plurality of captured images and measuring the target distance regions that are not visible.
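A minimal sketch of the visibility determination just described, under the assumption that each captured frame corresponds to one target distance region and that "darkness" is judged by comparing each frame's mean brightness against a threshold, as the text suggests. The function name, brightness values, and threshold below are invented for illustration.

```python
def farthest_visible_region(mean_brightness_per_region, threshold):
    """Return the index of the farthest target distance region whose
    captured image is at least as bright as the threshold (-1 if none),
    i.e. the visibility limit reported as visibility information."""
    visible = -1
    for i, brightness in enumerate(mean_brightness_per_region):
        if brightness >= threshold:
            visible = i
    return visible

# In fog, brightness falls off with distance: regions 0-2 are bright
# enough here, so the visibility limit is region index 2.
fog_profile = [180, 140, 90, 20, 5]
assert farthest_visible_region(fog_profile, threshold=50) == 2
```

The index of the first invisible region, combined with the known distance of each region, would give the depth of the fog.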
- The vehicle image acquisition device of the present disclosure includes: a light emitting unit that emits pulsed light in a predetermined direction; an image acquisition unit that captures reflected light returning from the target distance region at an imaging timing set according to the target distance region and acquires a plurality of captured images having different target distance regions; and a timing control unit that controls the light emission period of the pulsed light and the imaging timing. The light emitting unit is controlled so that the emission intensity of the pulsed light when imaging a far region of the target distance regions is higher than the emission intensity when imaging a nearby region.
- the light emission intensity may be linearly changeable according to the distance of the target distance region.
- an image having a uniform contrast can be acquired over the entire range of the target distance region.
- A vehicle image acquisition device according to another example of the present disclosure includes: a light emitting unit that emits pulsed light in a predetermined direction; an image acquisition unit that captures reflected light returning from the target distance region at an imaging timing set according to the target distance region and acquires a plurality of captured images having different target distance regions; and a timing control unit that controls the light emission period of the pulsed light and the imaging timing. The light emitting unit is controlled so that the light emission time of the pulsed light when imaging a far region of the target distance regions is longer than the light emission time when imaging a nearby region.
- the light emission time may be linearly changeable according to the distance of the target distance region.
- a uniform contrast image can be acquired over the entire range of the target distance region.
- In order to achieve the third object, a vehicle image acquisition device according to still another example of the present disclosure includes: a light emitting unit that emits pulsed light in a predetermined direction; an image acquisition unit that captures reflected light returning from the target distance region at an imaging timing set according to the target distance region and acquires a plurality of captured images having different target distance regions; and a timing control unit that controls the light emission period of the pulsed light and the imaging timing. The light emitting unit and the image acquisition unit are controlled so that the number of times the pulsed light is emitted and the number of times the reflected light is imaged when imaging a far region of the target distance regions are larger than the number of times of emission and imaging when imaging a nearby region.
- the number of times of light emission and the number of times of imaging may be linearly changeable according to the distance of the target distance area.
- an image having a uniform contrast can be acquired over the entire range of the target distance region.
- The control device of the present disclosure controls a vehicle image acquisition device including a light emitting unit that emits pulsed light in a predetermined direction and an image acquisition unit that captures reflected light returning from the target distance region at an imaging timing set according to the target distance region and acquires a plurality of captured images having different target distance regions. The control device performs at least one of: controlling the emission intensity so that the emission intensity of the pulsed light when imaging a far region of the target distance regions is higher than the emission intensity when imaging a nearby region; controlling the light emission time so that the light emission time of the pulsed light when imaging the far region is longer than the light emission time when imaging the nearby region; and controlling the number of times of light emission and imaging so that the number of times the pulsed light is emitted and the number of times the reflected light is imaged when imaging the far region are larger than when imaging the nearby region.
- The vehicle of the present disclosure includes: any of the vehicle image acquisition devices or control devices described above; and a display unit capable of displaying a composite image obtained by combining a plurality of captured images acquired by the image acquisition unit.
- The vehicle image acquisition method of the present disclosure is an image acquisition method for a vehicle that acquires a plurality of captured images having different target distance regions by imaging the reflected light of pulsed light emitted in a predetermined direction while changing the imaging timing. The method includes at least one of: a step of controlling the emission intensity so that the emission intensity of the pulsed light when imaging a far region of the target distance regions is higher than the emission intensity when imaging a nearby region; a step of controlling the light emission time so that the light emission time of the pulsed light when imaging the far region is longer than the light emission time when imaging the nearby region; and a step of controlling the number of times of light emission and imaging so that the number of times the pulsed light is emitted and the number of times the reflected light is imaged when imaging the far region are larger than when imaging the nearby region.
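The three controls above all vary an emission parameter with target distance, and the text later notes that each "may be linearly changeable". The sketch below shows one way such a linear schedule could look; the endpoint values and function name are assumptions for illustration, not values from the patent.

```python
# Hedged sketch: a linear schedule for an emission parameter
# (intensity, emission time, or number of emissions) over the target
# distance range, clamped at the near and far endpoints.

def linear_emission_parameter(distance_m, near_m, far_m, near_val, far_val):
    """Linearly interpolate an emission parameter between its value at
    the nearest region (near_val) and the farthest region (far_val)."""
    if distance_m <= near_m:
        return near_val
    if distance_m >= far_m:
        return far_val
    frac = (distance_m - near_m) / (far_m - near_m)
    return near_val + frac * (far_val - near_val)

# Example: intensity rises from 1.0 at 0 m to 2.0 at 200 m,
# so a region 100 m ahead is imaged at intensity 1.5.
mid_intensity = linear_emission_parameter(100.0, 0.0, 200.0, 1.0, 2.0)
```

The same function could schedule the emission time or the emission/exposure count, since the text applies the same linear idea to all three.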
- According to the present disclosure, it is possible to provide a vehicle image acquisition device, a control device, a vehicle including the vehicle image acquisition device or the control device, and a vehicle image acquisition method capable of preventing acquisition of an image of an unnecessary target distance region.
- It is also possible to provide a vehicle image acquisition device, a control device, and a vehicle image acquisition method that can acquire detailed visibility information, particularly in bad weather.
- It is further possible to provide a vehicle image acquisition device, a control device, a vehicle equipped with the control device, and a vehicle image acquisition method capable of imaging a near region and a far region of the host vehicle with equal contrast.
- The drawings include: a timing chart showing the light emission interval times and imaging timings according to Example 1; a diagram showing the light emission period, imaging timing, and captured images according to Example 2; a graph showing the relationship between the distance from the host vehicle and the brightness of the captured image, which changes according to the fog density, according to Example 2; an image diagram of a captured image according to a prior art example obtained by illuminating and imaging the area in front of the vehicle; a timing chart of the light emission period and imaging cycle according to Example 3, particularly showing an example in which the light emission intensity is changed; and an image diagram of a composite image obtained by combining a plurality of captured images.
- FIG. 1 is a block diagram showing a configuration of an obstacle detection apparatus according to this embodiment to which a vehicle image acquisition apparatus is applied.
- FIG. 2 is a schematic diagram showing a temporal relationship between the operation of the light emitting unit (light emission operation) and the operation of the gate (camera gate operation) when imaging each target distance region.
- an obstacle detection device 1 provided in a vehicle V includes an image acquisition device 2 and an integrated control unit 100 that can communicate with the image acquisition device 2.
- the integrated control unit 100 functions as a vehicle ECU that controls the driving of the vehicle V.
- The obstacle detection device 1 is further provided with an image processing unit 3, an object recognition processing unit 4, and a determination unit 10.
- the image acquisition device 2 includes a light emitting unit 5, an objective lens 6, a light multiplication unit 7, a high-speed camera (image acquisition unit) 8, and a timing controller (timing control unit) 9.
- The light emitting unit 5 is, for example, a near-infrared LED arranged at the front end of the vehicle V. As shown in FIG. 2, the light emitting unit 5 emits pulsed light in a predetermined direction (for example, in front of the vehicle V) for a predetermined light emission time tL (for example, 5 ns) in accordance with the pulse signal output from the timing controller 9.
- The light emission period tP of the pulsed light emitted from the light emitting unit 5 is set to an interval of, for example, 10 μs or less.
- the objective lens 6 is, for example, an optical system that is set to have an angle of view that can capture a predetermined range in front of the vehicle V, and receives reflected light from an object.
- the objective lens 6 may be disposed close to the light emitting unit 5 or may be disposed apart from the light emitting unit 5.
- the photomultiplier 7 includes a gate 7a and an image intensifier 7b.
- the gate 7a opens and closes in response to an opening / closing command signal from the timing controller 9.
- the opening time (gate time) tG of the gate 7a is set to 5 ns, which is the same as the light emission time tL.
- The gate time tG is proportional to the imaging target length (imaging target depth) of each region (target distance region) in the entire imaging range from region 1 to region n; the longer the gate time tG, the longer the imaging target length of each region. In this embodiment, the imaging target length is obtained from "speed of light (about 3 × 10^8 m/s) × gate time (5 ns)" and is 1.5 m.
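The gate-time relation above can be checked numerically. The small sketch below simply evaluates the document's own formula ("speed of light × gate time") with the 5 ns gate time from the text; the function name is an assumption for illustration.

```python
# Hedged sketch: imaging target length (depth) of one region, per the
# relation stated in the text for this embodiment.

C = 3.0e8  # speed of light in m/s, as approximated in the text

def imaging_target_length(gate_time_s):
    """Imaging target length of one region: speed of light x gate time."""
    return C * gate_time_s

length_m = imaging_target_length(5e-9)  # 5 ns gate -> 1.5 m
```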
- The image intensifier 7b is a device that temporarily converts extremely weak light (such as reflected light from an object) into electrons, electrically amplifies them, and converts them back into a fluorescent image, thereby multiplying the amount of light so that an image with contrast can be viewed. The light amplified by the image intensifier 7b is guided to the image sensor of the high-speed camera 8.
- the high-speed camera 8 captures an image emitted from the light multiplying unit 7 in response to a command signal from the timing controller 9 and outputs the acquired captured image to the image processing unit 3.
- In this embodiment, a camera having a resolution of 640 × 480 (horizontal × vertical), luminance values of 1 to 255 (256 levels), and a frame rate of 100 fps or higher is used.
- The timing controller 9 controls the imaging timing by setting the delay time tD (tDn and tDn+1 in FIG. 2) from the light emission start time of the light emitting unit 5 to the opening of the gate 7a, so that the image captured by the high-speed camera 8 reflects the timing of the reflected light returning from the target distance region, which is the intended imaging region. That is, the delay time tD is a value that determines the distance (imaging target distance) from the vehicle V to the target distance region.
- The timing controller 9 changes the imaging range of the high-speed camera 8 toward the far side of the vehicle V by lengthening the delay time tD by a predetermined interval (for example, 10 ns) so that the target distance region moves continuously forward (farther away) from the vehicle V.
- the timing controller 9 starts the imaging operation of the high-speed camera 8 immediately before the gate 7a is opened, and ends the imaging operation after the gate 7a is completely closed.
- The timing controller 9 controls the light emitting unit 5 and the high-speed camera 8 so that light emission and exposure are performed a plurality of times for each set target distance region (region 1, region 2, ..., region n). The light received by the high-speed camera 8 is converted into electric charge, which is accumulated by repeating light emission and exposure a plurality of times.
- One captured image obtained every predetermined charge accumulation time is called a frame.
- the high-speed camera 8 may acquire one captured image (one frame) for each target distance area, or may acquire a plurality of captured images (several frames) in each target distance area. In this way, the high-speed camera 8 acquires a plurality of captured images having different target distance areas, and outputs the acquired plurality of captured images to the image processing unit 3.
- the image processing unit 3 generates distance image data representing the distance to the object (target object) for each pixel based on the luminance of the same pixel in the captured image of the entire imaging region captured by the high-speed camera 8.
- the obtained distance image data is output to the object recognition processing unit 4.
- the object recognition processing unit 4 identifies an object included in the distance image data.
- a well-known technique such as pattern matching can be used as an object identification method.
- the determination unit 10 determines the relationship (distance, direction, etc.) between the object (person, car, sign, etc.) identified by the object recognition processing unit 4 and the host vehicle (vehicle V).
- The timing controller 9 controls the imaging timing of the high-speed camera 8 by setting the delay time tD so that the image captured by the high-speed camera 8 reflects the timing of the reflected light returning from the predetermined target distance region. Since the time for the light emitted from the light emitting unit 5 to return from the target distance region is the round-trip time over the distance (imaging target distance) between the vehicle V and the target distance region, the delay time tD can be obtained from the imaging target distance and the speed of light.
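The round-trip relation described above can be sketched as a pair of conversions between delay time and imaging target distance. The 10 ns increment is the example value from the text; the function names are assumptions for illustration.

```python
# Hedged sketch: converting between the delay time tD and the imaging
# target distance using the round-trip time of flight.

C = 3.0e8  # speed of light in m/s

def delay_for_distance(distance_m):
    """Delay tD from emission start to gate opening: the round-trip
    time of flight over the imaging target distance."""
    return 2.0 * distance_m / C

def distance_for_delay(delay_s):
    """Inverse relation: imaging target distance for a given tD."""
    return C * delay_s / 2.0

# Increasing tD by the 10 ns example step moves the target distance
# region 1.5 m farther from the vehicle.
step_m = distance_for_delay(10e-9)
```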
- When an object is present in the target distance region, the luminance value data of the pixels corresponding to the position of the object are affected by the reflected light and show higher values than the luminance value data of the other pixels. Thereby, the distance to the object existing in the target distance region can be obtained based on the luminance value data of each pixel.
- FIG. 3 shows a situation where four objects A to D exist at different positions in front of the vehicle V.
- the object A is a person wearing an umbrella
- the object B is a motorcycle on the opposite lane side
- the object C is a tree on the sidewalk side
- the object D is a vehicle (opposite vehicle) on the opposite lane side.
- The distances from the vehicle V to the objects are in the relationship A < B < C < D.
- In this embodiment, parts of the imaging regions are overlapped so that the reflected light from one object appears in the pixels of the captured images of a plurality of continuous imaging regions. That is, as shown in FIG. 4, the increment of the imaging target distance is set so that the imaging region changes while partially overlapping.
- FIG. 5 shows temporal luminance changes of pixels corresponding to each object.
- The luminance value of the same pixel in a plurality of consecutive captured images gradually increases, peaks at the positions of the objects A to D, and then gradually decreases, exhibiting a triangular-wave characteristic. Because the temporal luminance change of a pixel forms this triangular-wave shape, the imaging region corresponding to the peak of the triangular wave can be determined.
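A hedged sketch of the triangular-wave evaluation described above: the luminance of one pixel across consecutive (partially overlapping) imaging regions rises to a peak at the region containing the object and falls again, so the peak's region index locates the object. The function name and the luminance profile below are invented examples.

```python
def peak_region(luminances):
    """Index of the captured image (target distance region) where the
    pixel's luminance peaks, i.e. the region containing the object."""
    return max(range(len(luminances)), key=lambda i: luminances[i])

# A triangular-wave-like luminance profile peaking at region index 3.
profile = [10, 40, 90, 130, 80, 30, 10]
assert peak_region(profile) == 3
```

The known imaging target distance of the peak region then gives the distance to the object for that pixel.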
- the obstacle detection device 1 including the image acquisition device 2 can be used for light distribution control of a so-called AHB (automatic high beam) system or ADB (adaptive driving beam) system.
- The distance of each light spot in the image acquired by the camera sensor is obtained from the captured images acquired by the image acquisition device 2, and whether or not a light spot is a vehicle can be determined from the distance, brightness, and shape (the light spot and its surroundings) of each light spot, and from their time-series changes. By using the image acquisition device 2 in combination with another camera sensor in this way, distant vehicles in particular can be detected with high accuracy and at high speed, and the light distribution control of the AHB system or the ADB system can be suitably performed.
- FIG. 6 is a diagram for explaining the relationship between the light emission / exposure time and the distance resolution.
- FIG. 6A shows the distance resolution when the pulse width (light emission time) of the pulsed light and the gate time (exposure time) of the high-speed camera are relatively short.
- FIG. 6B shows the distance resolution when the pulse width (light emission time) of the pulsed light and the gate time (exposure time) of the high-speed camera are longer than the pulse width and gate time of (A), respectively.
- 6C and 6D show the relationship between the light emission / exposure time and the imaging target distance.
- The imaging target distance L is obtained from "speed of light × delay time tD (time tA in FIGS. 6C and 6D) / 2".
- FIG. 7 is a diagram for explaining a problem when the light emission interval time is shortened.
- FIG. 7 shows a first light emission tL1, a second light emission tL2, and a third light emission tL3, together with a first exposure tG1, a second exposure tG2, and a third exposure tG3 for exposing the reflected light of each of the first to third light emissions. If the light emission period (light emission interval time) tP of the pulsed light is made too short, the reflected light from the first light emission tL1 is received not only by the first exposure tG1 but also by the second exposure tG2, as shown in FIG. 7. Similarly, the reflected light from the second light emission tL2 is received not only by the second exposure tG2 but also by the third exposure tG3. That is, if the light emission interval time tP is too short, not only the reflected light from the light emission for the desired target distance but also the reflected light from the light emission for the immediately preceding target distance is imaged, and a captured image of an unnecessary target distance may be acquired.
- The inventors of the present application comprehensively considered the above circumstances and found that the acquisition of a captured image of an unnecessary target distance region can be prevented by appropriately setting the light emission interval time tP in relation to the delay time tD. This method is described in detail below.
- FIG. 8 is a timing chart illustrating the light emission interval time and imaging timing according to the first embodiment.
- As shown in FIG. 8, the delay times tD1 to tDz become gradually longer as the imaging target distance increases from region 1 to region n.
- The timing controller 9 sets the light emission interval time tP so that it is longer than the longest delay time tDz, which is required for imaging region n, the longest-distance region of the target distance regions from which reflected light can be imaged.
- For example, when the target distance region ranges from 0 to 200 m from the vehicle V,
- the light emission interval time tP is set to be longer than the longest delay time tDz necessary for imaging an area 200 m away from the vehicle V.
- The longest delay time tDz is determined from the emission intensity of the pulsed light emitted from the light emitting unit 5, the diffusion angle of the pulsed light, and the sensitivity of the image sensor of the high-speed camera 8.
- The pulsed light from the light emitting unit 5 is diffused over predetermined horizontal and vertical angles toward the front of the light emitting unit 5, and is therefore attenuated at least in proportion to the square of the distance from the light emitting unit 5.
- The high-speed camera 8 includes an image sensor (imaging device) such as a CCD (Charge-Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor.
- The image sensor accumulates the electric charge generated by the light incident during each exposure operation, converts it into an electric signal, and outputs it to the image processing unit 10. From the attenuation of the pulsed light according to the distance from the vehicle V and the sensitivity of the image sensor of the high-speed camera 8, the distance beyond which no charge can be accumulated in the image sensor can be calculated. In this example, the flight time of the pulsed light over this calculated distance at which charge can no longer be accumulated is set as the longest delay time tDz.
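As a rough sketch (in Python, with an assumed maximum imageable distance as input), the longest delay time tDz described above, and the lower bound it places on the light emission interval time tP, can be computed as follows:

```python
C = 3.0e8  # approximate speed of light in m/s, as used elsewhere in the document

def longest_delay_time(max_imageable_distance_m: float) -> float:
    """Round-trip flight time tDz of the pulsed light over the farthest
    distance at which the image sensor can still accumulate charge."""
    return 2.0 * max_imageable_distance_m / C

def min_emission_interval(max_imageable_distance_m: float) -> float:
    """The emission interval time tP must exceed tDz so that each exposure
    receives reflected light only from the immediately preceding emission."""
    return longest_delay_time(max_imageable_distance_m)

# Example from the text: target distance region out to 200 m from the vehicle.
tDz = longest_delay_time(200.0)
# tP is then set to any value strictly longer than tDz (about 1.33 microseconds here).
```

The function names and the use of a single maximum distance are illustrative assumptions; in the described apparatus that distance follows from the emission intensity, the diffusion angle, and the sensor sensitivity.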
- In the present embodiment, the target distance region is divided into a plurality of imaging regions 1 to n, and captured images of the imaging regions 1 to n are acquired while the delay times tD1 to tDz are varied.
- Because the amount of reflected light gradually decreases as the imaging region moves away from the vehicle V, the captured image of an object in a far region becomes darker than the captured image of an object in a nearby region.
- In Example 1, with the light emission interval time tP set longer than the longest delay time tDz, the pulsed light emission from the light emitting unit 5 is controlled so that the emission intensity when imaging a far region of the target distance region is higher than the emission intensity when imaging a nearby region (see FIG. 8). By gradually increasing the emission intensity as the target distance region becomes farther, the amount of reflected light when imaging each region can be made uniform. It is preferable that the emission intensity of the pulsed light can be changed linearly so that it gradually increases as the imaging target distance moves away from the vehicle V.
- As described above, when the target distance region ranges from 0 to 200 m from the vehicle V, the light emission interval time tP is set to be longer than the longest delay time tDz necessary for imaging an area 200 m from the vehicle V. Therefore, if the emission intensity of the pulsed light is higher than necessary, not only the reflected light from the emission immediately before an exposure but also the reflected light from the emission before that may be captured in a single exposure. For this reason, it is desirable to irradiate the pulsed light at an emission intensity that allows only the reflected light from the emission immediately before the exposure to be imaged.
- For example, when imaging a short-distance region (e.g., about 10 m from the vehicle V), the pulsed light from the light emitting unit 5 is set to an emission intensity at which an area 210 m from the vehicle V cannot be imaged. When imaging a long-distance region (e.g., about 190 m from the vehicle V), the pulsed light from the light emitting unit 5 is set to an emission intensity that is higher than the emission intensity for the short-distance region but at which an area 390 m from the vehicle V cannot be imaged.
- As described above, in Example 1, the timing controller 9 sets the light emission interval time tP so that it is longer than the longest delay time tDz required for imaging the longest-distance region (e.g., region n in FIG. 8) of the target distance regions from which reflected light can be imaged.
- According to this configuration, only the reflected light from the emission for the desired target distance is imaged, not the reflected light from the emission for the target distance immediately before it, so acquisition of captured images of unnecessary target distance regions can be prevented. Thereby, mixing of noise can be suppressed, and more accurate distance information can be acquired.
- The longest delay time tDz is preferably determined from the emission intensity and diffusion angle of the pulsed light and the sensitivity of the image sensor of the high-speed camera 8.
- By using these parameters for the calculation, the longest delay time tDz can be easily calculated.
- Furthermore, the light emitting unit 5 weakens the emission intensity of the pulsed light when imaging a short-distance region and strengthens the emission intensity when imaging a long-distance region. According to this configuration, the amount of reflected light when imaging each target distance region can be made uniform, and the difference in luminance of the pixels corresponding to the positions of objects present in the captured image of each region can be reduced. This prevents the image of a long-distance region from becoming darker than that of a nearby region.
- FIG. 9 is a diagram illustrating a light emission period, an imaging timing, and a captured image according to the second embodiment.
- FIG. 10 is a graph illustrating the relationship between the brightness of a captured image that changes according to the fog density and the distance from the host vehicle according to the second embodiment.
- the imaging target distance L is obtained from “light speed ⁇ delay time tD / 2”. Therefore, by gradually changing the delay time tD, captured images corresponding to different imaging target distances can be acquired.
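The relation between delay time and imaging target distance stated above can be sketched as follows (a minimal Python illustration; the function names are assumptions, the formula is Equation (1) of the text):

```python
C = 3.0e8  # approximate speed of light in m/s

def imaging_target_distance(delay_time_s: float) -> float:
    """L = light speed x delay time tD / 2 (Equation (1))."""
    return C * delay_time_s / 2.0

def delay_for_distance(distance_m: float) -> float:
    """Inverse relation: the delay time tD needed to image a region
    at the given distance from the vehicle."""
    return 2.0 * distance_m / C

# Gradually changing the delay time yields images of successively farther regions:
delays = [delay_for_distance(d) for d in (25.0, 50.0, 100.0)]
```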
- As shown in FIG. 9, the image processing unit 3 acquires a plurality of images 1 to n captured at different imaging timings (exposures 1 to n), and determines the darkness of each of the acquired images 1 to n. The images 1 to n gradually become darker (that is, their brightness (luminance) decreases) as the distance from the vehicle V increases.
- The darkness of each of the images 1 to n is determined as follows. As shown in FIG. 10, when there is no fog or the fog is thin, the amount of reflected light gradually decreases as the distance from the vehicle V increases (that is, as the round-trip time of the pulsed light increases), and the brightness (luminance) of the captured image decreases; that is, the images gradually darken as shown in FIG. 9. On the other hand, when the fog is thick, the amount of reflected light drops sharply beyond a certain distance from the vehicle V, and the brightness of the captured image falls (the darkness increases rapidly). From this relationship between the fog depth and the change in brightness of the captured images, the visibility from the vehicle V can be obtained using the plurality of captured images acquired by the high-speed camera 8.
- For example, the image processing unit 3 determines the darkness of a captured image by applying a threshold value to the luminance values of the acquired captured images.
- Here, the image processing unit 3 compares, for example, the actual luminance value of the captured image corresponding to a distance from the vehicle V with the maximum luminance value (maximum brightness) of a captured image at that distance.
- The maximum luminance value is, for example, the maximum luminance value that can be expected when there is no fog and the weather is clear.
- When the actual luminance value of the captured image is, for example, 50% or less of the maximum luminance value, the image processing unit 3 determines that the distance corresponding to that captured image is a distance invisible from the vehicle V.
- The image processing unit 3 acquires visibility information including this invisible distance and transmits the visibility information to the integrated control unit 100 (FIG. 1) that controls the driving of the vehicle V.
- Alternatively, the high-speed camera 8, which is the image acquisition unit, may acquire the visibility information from the plurality of captured images and transmit it directly to the integrated control unit 100 without going through the image processing unit 3.
- For example, the integrated control unit 100 can calculate a speed limit for the vehicle V based on the visibility information received from the image processing unit 3 and can control the travel speed based on that limit. Alternatively, the integrated control unit 100 may notify the driver of the vehicle V of the speed limit as a safe speed.
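A minimal sketch of the invisible-distance determination described above (assumed function name and data layout; the 50% threshold is the example value from the text):

```python
def invisible_distance(images, clear_max_luminance, threshold_ratio=0.5):
    """Return the first distance whose captured image is judged invisible.

    `images` maps target distance (m) -> measured luminance of the image
    captured for that region; `clear_max_luminance` maps distance -> the
    maximum luminance expected at that distance in clear, fog-free weather.
    An image at or below threshold_ratio of the clear-weather maximum marks
    that distance as invisible from the vehicle.
    """
    for distance in sorted(images):
        if images[distance] <= threshold_ratio * clear_max_luminance[distance]:
            return distance  # visibility information: invisible from here on
    return None  # all regions visible

# In thick fog the luminance drops sharply beyond a certain distance:
foggy = {10: 200, 50: 150, 100: 40, 150: 10}
clear = {10: 220, 50: 200, 100: 180, 150: 160}
print(invisible_distance(foggy, clear))  # -> 100
```

The returned distance would then be packaged into the visibility information sent to the integrated control unit.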
- As described above, in Example 2, the image processing unit 3 determines the darkness with respect to the target distance regions from the plurality of captured images captured by the high-speed camera 8 and acquires visibility information by measuring the target distance regions that are not visible. According to this configuration, it is possible to acquire visibility information during bad weather, in particular information regarding the depth of fog when fog occurs.
- The darkness of a captured image is preferably determined by applying a threshold value to the brightness of the captured image. According to this configuration, the depth of fog can be determined by a simple method.
- Furthermore, the image processing unit 3 (or the high-speed camera 8) can transmit the visibility information to the integrated control unit 100 that controls the driving of the vehicle V.
- Thereby, the visibility information acquired when fog or the like occurs can be utilized for the driving control of the vehicle V.
- Furthermore, the integrated control unit 100 controls the traveling speed of the vehicle V or notifies the driver based on the visibility information received from the image processing unit 3 (or the high-speed camera 8). According to this configuration, the visibility information acquired when fog or the like occurs can be used for safe driving or automatic driving of the vehicle V.
- FIG. 11 is an image diagram of a captured image according to a conventional example when the front of the vehicle is imaged with light irradiation.
- In FIG. 11, a person M1 stands near the front of the vehicle, and a person M2 stands far away.
- In this case, when imaging is performed with a vehicle-mounted camera using, for example, nighttime illumination light as in the conventional art, the amount of reflected light from the region near the vehicle is large, so the image of the nearby person M1 has high luminance and is captured brightly.
- On the other hand, the amount of reflected light from the far region of the vehicle is small, so the image of the far person M2 has low luminance and is captured darkly. That is, the difference in luminance between a nearby object and a distant object is large and the contrast is high, so the visibility of the distant object is poor.
- The inventor of the present application, considering the above situation comprehensively, found that imaging by a method that equalizes the luminance of the pixels in the captured image of the region near the vehicle V and in the captured image of the far region reduces the difference in contrast and improves the visibility of objects in the far region.
- Below, a method for imaging the near region and the far region of the vehicle V with uniform contrast is described in detail.
- FIG. 12 is a timing chart of the light emission period and the imaging period according to the third embodiment, and particularly shows an example in which the light emission intensity changes.
- In Example 3, the light emitting unit 5 is controlled so that the emission intensity of the pulsed light when imaging a far region of the target distance region is higher than the emission intensity when imaging a nearby region.
- Specifically, the emission intensity of the pulsed light can be changed linearly so that it gradually increases as the imaging target distance moves away from the vehicle V.
- The emission intensity in region 1 (a region around 10 m from the vehicle V) is, for example, 100 lm (lumens),
- and the emission intensity in region n (a region around 100 m from the vehicle V) is, for example, 1000 lm.
- By gradually increasing the emission intensity according to the distance of the target distance region in this way, the difference in luminance of the pixels corresponding to the positions of objects present in the captured image of each region is reduced.
- FIG. 13 is an image diagram of a neighborhood image and a far image according to the third embodiment, and a synthesized image obtained by synthesizing the neighborhood image and the far image.
- For example, as shown in FIG. 13, in the near image obtained by imaging the nearby region, the amount of reflected light from the nearby region is large, as in the conventional example of FIG. 11, so the nearby person M1 is imaged brightly
- (at this time, the reflected light from the far person M2 is not imaged).
- In the far image obtained by imaging the far region, pulsed light of higher emission intensity than for the nearby region is irradiated, so the amount of reflected light from the person M2 is larger than in the conventional example of FIG. 11.
- As a result, the far person M2 is also imaged as brightly as the person M1 in the near image (at this time, the reflected light from the nearby person M1 is not imaged).
- The image processing unit 3 combines the near image and the far image, obtained by imaging the reflected light of the pulsed light whose emission intensity is gradually increased according to the imaging target distance in this way, and generates
- the composite image (distance image data) shown in FIG. 13.
- In the composite image, the nearby person M1 and the far person M2 have substantially the same luminance.
- As described above, in Example 3, the light emitting unit 5 is controlled so that the emission intensity of the pulsed light when imaging a far region of the target distance region is higher than the emission intensity when imaging a nearby region. Therefore, distance image data (a composite image) with a small difference in pixel luminance between a nearby object and a distant object can be acquired. The near region and the far region can thus be imaged with uniform contrast, and a good image can be acquired.
- Moreover, the emission intensity of the pulsed light can be changed linearly according to the distance of the target distance region. According to this configuration, an image having uniform contrast can be acquired over the entire range of the target distance region.
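The linear intensity schedule described above can be sketched as follows (assumed function name; the endpoint values are the 100 lm / 1000 lm examples given for regions 1 and n):

```python
def emission_intensity(distance_m, near=(10.0, 100.0), far=(100.0, 1000.0)):
    """Linearly interpolate the pulsed-light emission intensity (lm) between
    a near region (e.g. ~10 m, 100 lm) and a far region (e.g. ~100 m, 1000 lm)."""
    (d0, i0), (d1, i1) = near, far
    t = (distance_m - d0) / (d1 - d0)
    return i0 + t * (i1 - i0)

print(emission_intensity(10.0))   # -> 100.0
print(emission_intensity(55.0))   # -> 550.0 (midway between the two regions)
print(emission_intensity(100.0))  # -> 1000.0
```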
- The composite image of FIG. 13 generated as described above can be displayed on various devices provided in the vehicle V,
- such as the display unit of a car navigation system, a display unit in the meter panel, or a display unit mounted on part of the room mirror.
- Displaying the composite image at a position easy for the driver of the vehicle V to see can contribute to improving safety at night or in rainy weather.
- Next, Example 4 of the present embodiment will be described with reference to FIG. 15.
- FIG. 15 is a timing chart of the light emission period and the imaging period according to the fourth embodiment, and particularly shows an example in which the light emission time changes.
- In Example 4, the light emitting unit 5 is controlled so that the emission time of the pulsed light when imaging a far region of the target distance region is longer than the emission time when imaging a nearby region.
- Specifically, the emission time of the pulsed light can be changed linearly so that it gradually lengthens as the imaging target distance moves away from the vehicle V.
- For example, the emission time (time tL in FIG. 2) in region 1 (a region around 10 m from the vehicle V) is 10 μs,
- and the emission time in region n (a region around 100 m from the vehicle V) is 20 μs.
- The image processing unit 3 combines the near image and the far image acquired with the variable emission time to generate a composite image. Therefore, in Example 4 as well, a composite image having uniform contrast with a small difference in pixel luminance between a nearby object and a distant object can be acquired.
- Example 5 of the present embodiment will be described with reference to FIG.
- FIG. 16 is a timing chart of the light emission cycle and the imaging cycle according to the fifth embodiment, and particularly shows an example in which the number of times of light emission and the number of times of imaging change.
- In Example 5, the light emitting unit 5 is controlled so that the number of emissions of the pulsed light when imaging a far region of the target distance region is larger than the number of emissions when imaging a nearby region,
- and, in accordance with the number of emissions, the number of times the gate 7a is opened (the number of imaging operations) when imaging the far region is controlled to be larger than the number of gate openings when imaging the nearby region.
- Specifically, the number of emissions and the number of imaging operations of the pulsed light can be changed linearly so that they gradually increase as the imaging target distance moves away from the vehicle V.
- The number of emission/imaging operations per frame in region 1 (a region around 10 m from the vehicle V) is, for example, 100, and the number of emission/imaging operations per frame in region n (a region around 100 m from the vehicle V) is, for example, 10,000.
- The image processing unit 3 combines the near image and the far image acquired by varying the number of emission/imaging operations to generate a composite image. Therefore, in Example 5 as well, a composite image having uniform contrast with a small difference in pixel luminance between a nearby object and a distant object can be acquired.
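A sketch of the per-frame emission/imaging count schedule described above (assumed function name; the endpoint counts are the Example 5 figures of 100 at about 10 m and 10,000 at about 100 m):

```python
def emissions_per_frame(distance_m, near=(10.0, 100), far=(100.0, 10_000)):
    """Linearly interpolate the per-frame emission/imaging count between a
    near region (~10 m, 100 operations) and a far region (~100 m, 10,000)."""
    (d0, n0), (d1, n1) = near, far
    t = (distance_m - d0) / (d1 - d0)
    return round(n0 + t * (n1 - n0))

print(emissions_per_frame(10.0))   # -> 100
print(emissions_per_frame(100.0))  # -> 10000
```

The same gate 7a is opened once per emission, so the imaging count follows the emission count directly.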
- In the above embodiments, the imaging target length can be set appropriately according to the performance of the high-speed camera 8 and the image processing unit 3.
- In the above embodiments, the high-speed camera 8 is configured to function as the image acquisition unit, but the invention is not limited to this example.
- For example, the image processing unit 3 may have the function of the image acquisition unit, or a separate memory for storing captured images may be provided, as the image acquisition unit, between the high-speed camera 8 and the image processing unit 3.
- In the above embodiments, the light multiplying unit 7 (gate 7a, image intensifier 7b) is provided between the objective lens 6 and the high-speed camera 8, but the invention is not limited to this example. For example, a plurality of captured images can be acquired by performing gating at predetermined imaging timings within the high-speed camera 8, without providing the light multiplying unit 7.
- In the above embodiments, object recognition is performed by generating distance image data with the image processing unit 3,
- but object recognition may instead be performed directly from the captured images of the individual target distances captured by the high-speed camera 8.
- In the above embodiments, information relating to the depth of fog is acquired as the visibility information,
- but visibility information during bad weather such as rain or snow may be acquired in addition to fog.
- In the above embodiments, configurations in which the distance image data is generated by separately changing the emission intensity, the emission time, and the number of emissions of the pulsed light are illustrated,
- but at least two of the emission intensity, the emission time, and the number of emissions may be changed in combination according to the imaging target distance.
Abstract
Description
a light emitting unit that emits pulsed light in a predetermined direction;
an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region, and acquires a plurality of captured images of different target distance regions; and
a timing control unit that controls the emission period of the pulsed light and the imaging timing,
are provided,
wherein the timing control unit sets the light emission interval time so that it is longer than the delay time, which is the time from the start of emission by the light emitting unit to the start of imaging by the image acquisition unit, necessary for imaging the longest-distance region, among the target distance regions, from which the reflected light can be imaged.
A control device for controlling a vehicle image acquisition apparatus including a light emitting unit that emits pulsed light in a predetermined direction and an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region and acquires a plurality of captured images of different target distance regions,
wherein the control device sets the light emission interval time so that it is longer than the delay time, which is the time from the start of emission by the light emitting unit to the start of imaging by the image acquisition unit, necessary for imaging the longest-distance region, among the target distance regions, from which the reflected light can be imaged.
A vehicle image acquisition method for acquiring a plurality of captured images of different target distance regions by imaging the reflected light of pulsed light emitted in a predetermined direction while varying the imaging timing,
wherein the light emission interval time indicating the emission period of the pulsed light is set so as to be longer than the delay time, which is the time from the start of emission of the pulsed light to the start of imaging of the reflected light, necessary for imaging the longest-distance region, among the target distance regions, from which the reflected light can be imaged.
a light emitting unit that emits pulsed light in a predetermined direction;
an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region, and acquires a plurality of captured images of different target distance regions; and
a timing control unit that controls the emission period of the pulsed light and the imaging timing,
are provided,
wherein the image acquisition unit acquires visibility information by determining the darkness with respect to the target distance regions from the plurality of captured images and measuring the target distance regions that cannot be visually recognized.
A control device for controlling a vehicle image acquisition apparatus including a light emitting unit that emits pulsed light in a predetermined direction and an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region and acquires a plurality of captured images of different target distance regions,
wherein the control device controls the image acquisition unit so as to acquire visibility information by determining the darkness with respect to the target distance regions from the plurality of captured images and measuring the target distance regions that cannot be visually recognized.
A vehicle including the vehicle image acquisition apparatus or control device described above, and
an integrated control unit capable of communicating with the image acquisition unit or the control device,
wherein the integrated control unit controls the traveling speed of the vehicle or notifies the driver based on the visibility information.
A vehicle image acquisition method for acquiring a plurality of captured images of different target distance regions by imaging the reflected light of pulsed light emitted in a predetermined direction while varying the imaging timing,
wherein visibility information is acquired by determining the darkness with respect to the target distance regions from the plurality of captured images and measuring the target distance regions that cannot be visually recognized.
a light emitting unit that emits pulsed light in a predetermined direction;
an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region, and acquires a plurality of captured images of different target distance regions; and
a timing control unit that controls the emission period of the pulsed light and the imaging timing,
are provided,
wherein the light emitting unit is controlled so that the emission intensity of the pulsed light when imaging a far region of the target distance regions is higher than the emission intensity when imaging a nearby region.
a light emitting unit that emits pulsed light in a predetermined direction;
an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region, and acquires a plurality of captured images of different target distance regions; and
a timing control unit that controls the emission period of the pulsed light and the imaging timing,
are provided,
wherein the light emitting unit is controlled so that the emission time of the pulsed light when imaging a far region of the target distance regions is longer than the emission time when imaging a nearby region.
a light emitting unit that emits pulsed light in a predetermined direction;
an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region, and acquires a plurality of captured images of different target distance regions; and
a timing control unit that controls the emission period of the pulsed light and the imaging timing,
are provided,
wherein the light emitting unit and the imaging unit are controlled so that the number of emissions of the pulsed light and the number of imaging operations of the reflected light when imaging a far region of the target distance regions are larger than the number of emissions and the number of imaging operations when imaging a nearby region.
A control device for controlling a vehicle image acquisition apparatus including a light emitting unit that emits pulsed light in a predetermined direction and an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region and acquires a plurality of captured images of different target distance regions,
wherein the control device performs at least one of: controlling the emission intensity so that the emission intensity of the pulsed light when imaging a far region of the target distance regions is higher than the emission intensity when imaging a nearby region; controlling the emission time so that the emission time of the pulsed light when imaging the far region is longer than the emission time when imaging the nearby region; and controlling the number of emissions and the number of imaging operations so that the number of emissions of the pulsed light and the number of imaging operations of the reflected light when imaging the far region are larger than the number of emissions and the number of imaging operations when imaging the nearby region.
A vehicle including any of the vehicle image acquisition apparatuses or control devices described above, and
a display unit capable of displaying a composite image obtained by combining the plurality of captured images acquired by the image acquisition unit.
A vehicle image acquisition method for acquiring a plurality of captured images of different target distance regions by imaging the reflected light of pulsed light emitted in a predetermined direction while varying the imaging timing, the method performing at least one of: a step of controlling the emission intensity so that the emission intensity of the pulsed light when imaging a far region of the target distance regions is higher than the emission intensity when imaging a nearby region; a step of controlling the emission time so that the emission time of the pulsed light when imaging the far region is longer than the emission time when imaging the nearby region; and a step of controlling the number of emissions and the number of imaging operations so that the number of emissions of the pulsed light and the number of imaging operations of the reflected light when imaging the far region are larger than the number of emissions and the number of imaging operations when imaging the nearby region.
The gate 7a opens and closes in response to open/close command signals from the timing controller 9. In the present embodiment, the opening time (gate time) tG of the gate 7a is set to 5 ns, the same as the emission time tL. The gate time tG is proportional to the imaging target length (imaging target depth) of each region (target distance region) in all imaging regions from region 1 to region n. The longer the gate time tG, the longer the imaging target length of each region. The imaging target length is obtained from "light speed × gate time tG"; in the present embodiment, since the gate time tG = 5 ns, the imaging target length is "light speed (about 3×10^8 m/s) × gate time (5 ns)" = 1.5 m.
The image intensifier 7b is a device that once converts extremely weak light (such as reflected light from objects) into electrons, electrically amplifies them, and converts them back into a fluorescent image, thereby multiplying the amount of light so that an image with contrast can be viewed. The light amplified by the image intensifier 7b is guided to the image sensor of the high-speed camera 8.
Imaging target distance = light speed (about 3×10^8 m/s) × delay time tD / 2   … Equation (1)
[Image acquisition operation]
The timing controller 9 sets the delay time tD so that the captured image captured by the high-speed camera 8 corresponds to the timing of the reflected light returning from the predetermined target distance region, and controls the imaging timing of the high-speed camera 8. When an object exists in the target distance region, the time for the light emitted from the light emitting unit 5 to return from the target distance region is the time for the light to make a round trip over the distance between the vehicle V and the target distance region (the imaging target distance), so the delay time tD can be obtained from the imaging target distance and the speed of light.
At this time, in the present embodiment, parts of the imaging regions are overlapped so that the reflected light from a single object is reflected in the pixels of the captured images of a plurality of consecutive imaging regions. That is, as shown in FIG. 4, when imaging while continuously changing the imaging target distance as B1 → B2 → B3 → …, the increment of the imaging target distance (B2 − B1) is made shorter than the imaging target length A of each imaging region, so that the imaging regions change while partially overlapping.
By overlapping parts of the imaging regions, as shown in FIG. 5, the luminance value of the same pixel in a plurality of consecutive captured images gradually increases, peaks at the position of each of the objects A to D, and then gradually decreases, exhibiting a triangular-wave characteristic. In this way, by including the reflected light from a single object in a plurality of captured images, the temporal luminance change of a pixel becomes triangular, so detection accuracy can be increased by taking the imaging region corresponding to the peak of the triangular wave as the distance from the vehicle V to each object (subject) A to D for that pixel.
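The peak-finding step described above can be sketched as follows (assumed function name and toy data; in the apparatus the profile comes from the same pixel across consecutive overlapping regions):

```python
def pixel_distance(luminances, region_distances):
    """For one pixel, overlapping regions give a triangular luminance profile
    across consecutive captured images; the region distance at the peak is
    taken as the distance to the object seen by that pixel."""
    peak_index = max(range(len(luminances)), key=luminances.__getitem__)
    return region_distances[peak_index]

# Luminance of the same pixel across consecutive, partially overlapping regions:
profile = [10, 40, 90, 60, 20]
distances = [20.0, 21.0, 22.0, 23.0, 24.0]
print(pixel_distance(profile, distances))  # -> 22.0
```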
FIG. 6 is a diagram for explaining the relationship between the emission/exposure time and the distance resolution. FIG. 6(A) shows the distance resolution when the pulse width (emission time) of the pulsed light and the gate time (exposure time) of the high-speed camera are relatively short. FIG. 6(B) shows the distance resolution when the pulse width (emission time) and the gate time (exposure time) are longer than those in (A). FIGS. 6(C) and 6(D) show the relationship between the emission/exposure time and the imaging target distance.
As described above, the imaging target distance L is obtained from "light speed × delay time tD (time tA in FIGS. 6(C) and (D)) / 2". That is, the time tA1 from the end of emission of the pulsed light to the start of exposure corresponds to the distance L1, and the time tA2 from the start of emission of the pulsed light to the end of exposure corresponds to the distance L2. Therefore, the shorter the emission time and the exposure time, as in FIG. 6(A), the shorter the imaging target length (L2 − L1), as shown in FIG. 6(C); that is, the higher the distance resolution. Conversely, the longer the emission time and the exposure time, as in FIG. 6(B), the longer the imaging target length (L2 − L1), as shown in FIG. 6(D); that is, the lower the distance resolution. Therefore, the shorter the emission time and the exposure time, the finer the target distance resolution and the higher the accuracy of distance detection.
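The L1/L2 relationship above can be checked numerically (assumed function name; tA1 and tA2 follow the definitions in FIGS. 6(C) and (D)):

```python
C = 3.0e8  # approximate speed of light in m/s

def imaging_target_length(emission_time_s, gate_time_s, delay_time_s):
    """Length of the imaged slice [L1, L2]: tA1 runs from the end of emission
    to the start of exposure (distance L1), tA2 from the start of emission to
    the end of exposure (distance L2). Shorter emission/exposure times give a
    shorter slice, i.e. finer distance resolution."""
    tA1 = delay_time_s - emission_time_s
    tA2 = delay_time_s + gate_time_s
    L1 = C * tA1 / 2.0
    L2 = C * tA2 / 2.0
    return L2 - L1

# With tL = tG = 5 ns (the embodiment's values) the slice is 1.5 m;
# doubling both times doubles it, i.e. resolution degrades:
print(imaging_target_length(5e-9, 5e-9, 1e-6))   # -> 1.5
print(imaging_target_length(10e-9, 10e-9, 1e-6)) # -> 3.0
```

Note that the slice length depends only on tL + tG, not on the delay time tD, which is consistent with the fixed 1.5 m imaging target length stated earlier for tG = 5 ns.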
Incidentally, in order to capture as bright an image as possible (with high luminance) within one frame, it is desirable to repeat emission and exposure many times. For this purpose, one might consider making the emission period tP of the pulsed light shown in FIG. 2 (and the imaging period that follows it) as short as possible. However, if the emission period (light emission interval time) tP of the pulsed light is made too short, as shown in FIG. 7, the reflected light from the first emission tL1 is received not only by the first exposure tG1 but also by the second exposure tG2. Similarly, the reflected light from the second emission tL2 is received not only by the second exposure tG2 but also by the third exposure tG3. That is, if the light emission interval time tP is made too short, not only the reflected light from the emission for the desired target distance but also the reflected light from the emission for the target distance immediately before it is imaged, so a captured image of an unintended target distance may be acquired.
In Example 1, as shown in FIG. 8, the delay times tD1 to tDz become gradually longer as the imaging target distance increases from region 1 to region n. Here, the timing controller 9 sets the light emission interval time tP so that it is longer than the longest delay time tDz necessary for imaging region n, the longest-distance region of the target distance regions from which the reflected light can be imaged. For example, when the target distance region ranges from 0 to 200 m from the vehicle V, the light emission interval time tP is set longer than the longest delay time tDz necessary for imaging an area 200 m from the vehicle V.
Next, Example 2 of the present embodiment will be described with reference to FIGS. 9 and 10.
FIG. 9 is a diagram showing the emission period, the imaging timing, and the captured images according to Example 2. FIG. 10 is a graph showing the relationship between the brightness of the captured images, which changes according to the fog density, and the distance from the host vehicle, according to Example 2.
As shown in FIG. 10, when there is no fog or the fog is thin, the amount of reflected light gradually decreases as the distance from the vehicle V increases (that is, as the round-trip time of the pulsed light increases), and the brightness (luminance) of the captured image decreases; that is, the images gradually darken as shown in FIG. 9. On the other hand, when the fog is thick, the amount of reflected light drops sharply beyond a certain distance from the vehicle V, and the brightness of the captured image falls (the darkness increases rapidly). From this relationship between the fog depth and the change in brightness of the captured images, the visibility from the vehicle V can be obtained using the plurality of captured images acquired by the high-speed camera 8.
The image processing unit 3 determines the darkness of a captured image, for example, by applying a threshold value to the luminance values of the acquired captured images. Here, the image processing unit 3 compares, for example, the actual luminance value of the captured image corresponding to a distance from the vehicle V with the maximum luminance value (maximum brightness) of a captured image at that distance. The maximum luminance value is, for example, the maximum luminance value that can be expected when there is no fog and the weather is clear. When the actual luminance value of the captured image is, for example, 50% or less of the maximum luminance value, the image processing unit 3 determines that the distance corresponding to that captured image is a distance invisible from the vehicle V. The image processing unit 3 acquires visibility information including this invisible distance and transmits the visibility information to the integrated control unit 100 (FIG. 1) that controls the driving of the vehicle V. Alternatively, the high-speed camera 8, which is the image acquisition unit, may acquire the visibility information from the plurality of captured images and transmit it directly to the integrated control unit 100 without going through the image processing unit 3.
Next, Example 3 of the present embodiment will be described with reference to FIGS. 11 to 14.
FIG. 11 is an image diagram of a captured image according to a conventional example obtained when the front of the vehicle is imaged under light irradiation.
As shown in FIG. 11, a person M1 stands near the front of the vehicle, and a person M2 stands far away. In this case, when imaging is performed with a vehicle-mounted camera using, for example, nighttime illumination light as in the conventional art, the amount of reflected light from the region near the vehicle is large, so the image of the nearby person M1 has high luminance and is captured brightly. On the other hand, the amount of reflected light from the far region of the vehicle is small, so the image of the far person M2 has low luminance and is captured darkly. That is, the difference in luminance between a nearby object and a distant object is large and the contrast is high, so the visibility of the distant object is poor.
In Example 3, the light emitting unit 5 is controlled so that the emission intensity of the pulsed light when imaging a far region of the target distance region is higher than the emission intensity when imaging a nearby region. Specifically, the emission intensity of the pulsed light can be changed linearly so that it gradually increases as the imaging target distance moves away from the vehicle V. The emission intensity in region 1 (a region around 10 m from the vehicle V) is, for example, 100 lm (lumens), and the emission intensity in region n (a region around 100 m from the vehicle V) is, for example, 1000 lm. By gradually increasing the emission intensity according to the distance of the target distance region (the imaging target distance) in this way, the difference in luminance of the pixels corresponding to the positions of objects present in the captured image of each region is reduced.
For example, as shown in FIG. 13, in the near image obtained by imaging the nearby region, the amount of reflected light from the nearby region is large, as in the conventional example of FIG. 11, so the nearby person M1 is imaged brightly (at this time, the reflected light from the far person M2 is not imaged). In the far image obtained by imaging the far region, pulsed light of higher emission intensity than for the nearby region is irradiated, so the amount of reflected light from the person M2 is larger than in the conventional example of FIG. 11. As a result, the far person M2 is also imaged as brightly as the person M1 in the near image (at this time, the reflected light from the nearby person M1 is not imaged). The image processing unit 3 combines the near image and the far image, obtained by imaging the reflected light of the pulsed light whose emission intensity is gradually increased according to the imaging target distance in this way, and generates the composite image (distance image data) shown in FIG. 13. In the composite image, the nearby person M1 and the far person M2 have substantially the same luminance.
Next, Example 4 of the present embodiment will be described with reference to FIG. 15.
FIG. 15 is a timing chart of the emission period and the imaging period according to Example 4, showing in particular an example in which the emission time changes.
In Example 4, the light emitting unit 5 is controlled so that the emission time of the pulsed light when imaging a far region of the target distance region is longer than the emission time when imaging a nearby region. Specifically, the emission time of the pulsed light can be changed linearly so that it gradually lengthens as the imaging target distance moves away from the vehicle V. For example, the emission time (time tL in FIG. 2) in region 1 (a region around 10 m from the vehicle V) is 10 μs, and the emission time in region n (a region around 100 m from the vehicle V) is 20 μs.
Next, Example 5 of the present embodiment will be described with reference to FIG. 16.
FIG. 16 is a timing chart of the emission period and the imaging period according to Example 5, showing in particular an example in which the number of emissions and the number of imaging operations change.
In Example 5, the light emitting unit 5 is controlled so that the number of emissions of the pulsed light when imaging a far region of the target distance region is larger than the number of emissions when imaging a nearby region, and, in accordance with the number of emissions, the number of times the gate 7a is opened (the number of imaging operations) when imaging the far region is also controlled to be larger than the number of gate openings when imaging the nearby region. Specifically, the number of emissions and the number of imaging operations of the pulsed light can be changed linearly so that they gradually increase as the imaging target distance moves away from the vehicle V. The number of emission/imaging operations per frame in region 1 (a region around 10 m from the vehicle V) is, for example, 100, and the number of emission/imaging operations per frame in region n (a region around 100 m from the vehicle V) is, for example, 10,000.
Claims (21)
- A vehicle image acquisition apparatus comprising: a light emitting unit that emits pulsed light in a predetermined direction; an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region, and acquires a plurality of captured images of different target distance regions; and a timing control unit that controls the emission period of the pulsed light and the imaging timing, wherein the timing control unit sets the light emission interval time so that it is longer than the delay time, which is the time from the start of emission by the light emitting unit to the start of imaging by the image acquisition unit, necessary for imaging the longest-distance region, among the target distance regions, from which the reflected light can be imaged.
- The vehicle image acquisition apparatus according to claim 1, wherein the delay time necessary for imaging the longest-distance region is determined from the emission intensity and diffusion angle of the pulsed light and the sensitivity of the image acquisition unit.
- The vehicle image acquisition apparatus according to claim 2, wherein the light emitting unit weakens the emission intensity when imaging shorter-distance regions of the target distance regions and strengthens the emission intensity when imaging longer-distance regions.
- A control device for controlling a vehicle image acquisition apparatus including a light emitting unit that emits pulsed light in a predetermined direction and an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region and acquires a plurality of captured images of different target distance regions, wherein the control device sets the light emission interval time so that it is longer than the delay time, which is the time from the start of emission by the light emitting unit to the start of imaging by the image acquisition unit, necessary for imaging the longest-distance region, among the target distance regions, from which the reflected light can be imaged.
- A vehicle comprising the vehicle image acquisition apparatus according to any one of claims 1 to 3 or the control device according to claim 4.
- A vehicle image acquisition method for acquiring a plurality of captured images of different target distance regions by imaging the reflected light of pulsed light emitted in a predetermined direction while varying the imaging timing, wherein the light emission interval time indicating the emission period of the pulsed light is set so as to be longer than the delay time, which is the time from the start of emission of the pulsed light to the start of imaging of the reflected light, necessary for imaging the longest-distance region, among the target distance regions, from which the reflected light can be imaged.
- A vehicle image acquisition apparatus comprising: a light emitting unit that emits pulsed light in a predetermined direction; an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region, and acquires a plurality of captured images of different target distance regions; and a timing control unit that controls the emission period of the pulsed light and the imaging timing, wherein the image acquisition unit acquires visibility information by determining the darkness with respect to the target distance regions from the plurality of captured images and measuring the target distance regions that cannot be visually recognized.
- The vehicle image acquisition apparatus according to claim 7, wherein the darkness is determined by applying a threshold value to the luminance of each captured image.
- The vehicle image acquisition apparatus according to claim 7 or 8, wherein the image acquisition unit is capable of transmitting the visibility information to an integrated control unit that controls the driving of the vehicle.
- A control device for controlling a vehicle image acquisition apparatus including a light emitting unit that emits pulsed light in a predetermined direction and an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region and acquires a plurality of captured images of different target distance regions, wherein the control device controls the image acquisition unit so as to acquire visibility information by determining the darkness with respect to the target distance regions from the plurality of captured images and measuring the target distance regions that cannot be visually recognized.
- A vehicle comprising the vehicle image acquisition apparatus according to any one of claims 7 to 9 or the control device according to claim 10, and an integrated control unit capable of communicating with the image acquisition unit or the control device, wherein the integrated control unit controls the traveling speed of the vehicle or notifies the driver based on the visibility information.
- A vehicle image acquisition method for acquiring a plurality of captured images of different target distance regions by imaging the reflected light of pulsed light emitted in a predetermined direction while varying the imaging timing, wherein visibility information is acquired by determining the darkness with respect to the target distance regions from the plurality of captured images and measuring the target distance regions that cannot be visually recognized.
- A vehicle image acquisition apparatus comprising: a light emitting unit that emits pulsed light in a predetermined direction; an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region, and acquires a plurality of captured images of different target distance regions; and a timing control unit that controls the emission period of the pulsed light and the imaging timing, wherein the light emitting unit is controlled so that the emission intensity of the pulsed light when imaging a far region of the target distance regions is higher than the emission intensity when imaging a nearby region.
- The vehicle image acquisition apparatus according to claim 13, wherein the emission intensity can be changed linearly according to the distance of the target distance region.
- A vehicle image acquisition apparatus comprising: a light emitting unit that emits pulsed light in a predetermined direction; an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region, and acquires a plurality of captured images of different target distance regions; and a timing control unit that controls the emission period of the pulsed light and the imaging timing, wherein the light emitting unit is controlled so that the emission time of the pulsed light when imaging a far region of the target distance regions is longer than the emission time when imaging a nearby region.
- The vehicle image acquisition apparatus according to claim 15, wherein the emission time can be changed linearly according to the distance of the target distance region.
- A vehicle image acquisition apparatus comprising: a light emitting unit that emits pulsed light in a predetermined direction; an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region, and acquires a plurality of captured images of different target distance regions; and a timing control unit that controls the emission period of the pulsed light and the imaging timing, wherein the light emitting unit and the imaging unit are controlled so that the number of emissions of the pulsed light and the number of imaging operations of the reflected light when imaging a far region of the target distance regions are larger than the number of emissions and the number of imaging operations when imaging a nearby region.
- The vehicle image acquisition apparatus according to claim 17, wherein the number of emissions and the number of imaging operations can be changed linearly according to the distance of the target distance region.
- A control device for controlling a vehicle image acquisition apparatus including a light emitting unit that emits pulsed light in a predetermined direction and an image acquisition unit that images the reflected light returning from a target distance region at imaging timings set according to the target distance region and acquires a plurality of captured images of different target distance regions, wherein the control device performs at least one of: controlling the emission intensity so that the emission intensity of the pulsed light when imaging a far region of the target distance regions is higher than the emission intensity when imaging a nearby region; controlling the emission time so that the emission time of the pulsed light when imaging the far region is longer than the emission time when imaging the nearby region; and controlling the number of emissions and the number of imaging operations so that the number of emissions of the pulsed light and the number of imaging operations of the reflected light when imaging the far region are larger than the number of emissions and the number of imaging operations when imaging the nearby region.
- A vehicle comprising the vehicle image acquisition apparatus according to any one of claims 13 to 18 or the control device according to claim 19, and a display unit capable of displaying a composite image obtained by combining the plurality of captured images acquired by the image acquisition unit.
- A vehicle image acquisition method for acquiring a plurality of captured images of different target distance regions by imaging the reflected light of pulsed light emitted in a predetermined direction while varying the imaging timing, the method performing at least one of: a step of controlling the emission intensity so that the emission intensity of the pulsed light when imaging a far region of the target distance regions is higher than the emission intensity when imaging a nearby region; a step of controlling the emission time so that the emission time of the pulsed light when imaging the far region is longer than the emission time when imaging the nearby region; and a step of controlling the number of emissions and the number of imaging operations so that the number of emissions of the pulsed light and the number of imaging operations of the reflected light when imaging the far region are larger than the number of emissions and the number of imaging operations when imaging the nearby region.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP16878310.8A EP3396413A4 (en) | 2015-12-21 | 2016-12-01 | VEHICLE APPARATUS FOR VEHICLES, CONTROL APPARATUS, VEHICLE EQUIPPED WITH THE IMAGE RECORDING DEVICE FOR VEHICLES AND THE CONTROL DEVICE IMAGE FORMULATION OF VEHICLES |
| CN201680075214.2A CN108431630A (zh) | 2015-12-21 | 2016-12-01 | 车辆用图像获取装置、控制装置、包括了车辆用图像获取装置或控制装置的车辆和车辆用图像获取方法 |
| JP2017557836A JP6851985B2 (ja) | 2015-12-21 | 2016-12-01 | 車両用画像取得装置、制御装置、車両用画像取得装置または制御装置を備えた車両および車両用画像取得方法 |
| US16/065,074 US11194023B2 (en) | 2015-12-21 | 2016-12-01 | Image acquiring apparatus for vehicle, control device, vehicle having image acquiring apparatus for vehicle or control device, and image acquiring method for vehicle |
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2015248826 | 2015-12-21 | ||
| JP2015-248826 | 2015-12-21 | ||
| JP2015248828 | 2015-12-21 | ||
| JP2015-248827 | 2015-12-21 | ||
| JP2015-248828 | 2015-12-21 | ||
| JP2015248827 | 2015-12-21 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017110417A1 true WO2017110417A1 (ja) | 2017-06-29 |
Family
ID=59090090
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2016/085814 Ceased WO2017110417A1 (ja) | 2015-12-21 | 2016-12-01 | 車両用画像取得装置、制御装置、車両用画像取得装置または制御装置を備えた車両および車両用画像取得方法 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US11194023B2 (ja) |
| EP (1) | EP3396413A4 (ja) |
| JP (1) | JP6851985B2 (ja) |
| CN (1) | CN108431630A (ja) |
| WO (1) | WO2017110417A1 (ja) |
Cited By (60)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2019125607A1 (en) | 2017-12-22 | 2019-06-27 | Waymo Llc | Systems and methods for adaptive range coverage using lidar |
| WO2019165130A1 (en) * | 2018-02-21 | 2019-08-29 | Innovusion Ireland Limited | Lidar detection systems and methods with high repetition rate to observe far objects |
| WO2020149140A1 (ja) * | 2019-01-17 | 2020-07-23 | 株式会社小糸製作所 | 車載用イメージング装置、車両用灯具、自動車 |
| JPWO2020184447A1 (ja) * | 2019-03-11 | 2020-09-17 | ||
| WO2022014416A1 (ja) | 2020-07-14 | 2022-01-20 | 株式会社小糸製作所 | ゲーティングカメラ、車両用センシングシステム、車両用灯具 |
| JP7031771B1 (ja) | 2021-03-23 | 2022-03-08 | 株式会社リコー | 撮像装置、撮像方法および情報処理装置 |
| US11289873B2 (en) | 2018-04-09 | 2022-03-29 | Innovusion Ireland Limited | LiDAR systems and methods for exercising precise control of a fiber laser |
| US20220214434A1 (en) * | 2019-09-26 | 2022-07-07 | Koito Manufacturing Co., Ltd. | Gating camera |
| WO2022163721A1 (ja) | 2021-01-27 | 2022-08-04 | 株式会社小糸製作所 | ゲーティングカメラ、車両用センシングシステム、車両用灯具 |
| US11422234B2 (en) | 2018-02-23 | 2022-08-23 | Innovusion, Inc. | Distributed lidar systems |
| US11422267B1 (en) | 2021-02-18 | 2022-08-23 | Innovusion, Inc. | Dual shaft axial flux motor for optical scanners |
| JPWO2022196779A1 (ja) * | 2021-03-17 | 2022-09-22 | ||
| US11460554B2 (en) | 2017-10-19 | 2022-10-04 | Innovusion, Inc. | LiDAR with large dynamic range |
| US11493601B2 (en) | 2017-12-22 | 2022-11-08 | Innovusion, Inc. | High density LIDAR scanning |
| WO2023276223A1 (ja) * | 2021-06-30 | 2023-01-05 | ソニーセミコンダクタソリューションズ株式会社 | 測距装置、測距方法及び制御装置 |
| US11555895B2 (en) | 2021-04-20 | 2023-01-17 | Innovusion, Inc. | Dynamic compensation to polygon and motor tolerance using galvo control profile |
| US11567182B2 (en) | 2018-03-09 | 2023-01-31 | Innovusion, Inc. | LiDAR safety systems and methods |
| JPWO2023013777A1 (ja) * | 2021-08-05 | 2023-02-09 | ||
| WO2023013776A1 (ja) | 2021-08-05 | 2023-02-09 | 株式会社小糸製作所 | ゲーティングカメラ、車両用センシングシステム、車両用灯具 |
| US11579258B1 (en) | 2018-08-30 | 2023-02-14 | Innovusion, Inc. | Solid state pulse steering in lidar systems |
| US11579300B1 (en) | 2018-08-21 | 2023-02-14 | Innovusion, Inc. | Dual lens receive path for LiDAR system |
| US11604279B2 (en) | 2017-01-05 | 2023-03-14 | Innovusion, Inc. | MEMS beam steering and fisheye receiving lens for LiDAR system |
| US11609336B1 (en) | 2018-08-21 | 2023-03-21 | Innovusion, Inc. | Refraction compensation for use in LiDAR systems |
| US11614526B1 (en) | 2018-08-24 | 2023-03-28 | Innovusion, Inc. | Virtual windows for LIDAR safety systems and methods |
| US11614521B2 (en) | 2021-04-21 | 2023-03-28 | Innovusion, Inc. | LiDAR scanner with pivot prism and mirror |
| US11624806B2 (en) | 2021-05-12 | 2023-04-11 | Innovusion, Inc. | Systems and apparatuses for mitigating LiDAR noise, vibration, and harshness |
| US11644543B2 (en) | 2018-11-14 | 2023-05-09 | Innovusion, Inc. | LiDAR systems and methods that use a multi-facet mirror |
| US11662440B2 (en) | 2021-05-21 | 2023-05-30 | Innovusion, Inc. | Movement profiles for smart scanning using galvonometer mirror inside LiDAR scanner |
| US11662439B2 (en) | 2021-04-22 | 2023-05-30 | Innovusion, Inc. | Compact LiDAR design with high resolution and ultra-wide field of view |
| US11675050B2 (en) | 2018-01-09 | 2023-06-13 | Innovusion, Inc. | LiDAR detection systems and methods |
| US11675053B2 (en) | 2018-06-15 | 2023-06-13 | Innovusion, Inc. | LiDAR systems and methods for focusing on ranges of interest |
| US11675055B2 (en) | 2019-01-10 | 2023-06-13 | Innovusion, Inc. | LiDAR systems and methods with beam steering and wide angle signal detection |
| US11762065B2 (en) | 2019-02-11 | 2023-09-19 | Innovusion, Inc. | Multiple beam generation from a single source beam for use with a lidar system |
| US11768294B2 (en) | 2021-07-09 | 2023-09-26 | Innovusion, Inc. | Compact lidar systems for vehicle contour fitting |
| US11782132B2 (en) | 2016-12-31 | 2023-10-10 | Innovusion, Inc. | 2D scanning high precision LiDAR using combination of rotating concave mirror and beam steering devices |
| JP2023147111A (ja) * | 2022-03-29 | 2023-10-12 | ミラクシアエッジテクノロジー株式会社 | 測距装置及び測距方法 |
| US11789128B2 (en) | 2021-03-01 | 2023-10-17 | Innovusion, Inc. | Fiber-based transmitter and receiver channels of light detection and ranging systems |
| US11789132B2 (en) | 2018-04-09 | 2023-10-17 | Innovusion, Inc. | Compensation circuitry for lidar receiver systems and method of use thereof |
| US11796645B1 (en) | 2018-08-24 | 2023-10-24 | Innovusion, Inc. | Systems and methods for tuning filters for use in lidar systems |
| US11808888B2 (en) | 2018-02-23 | 2023-11-07 | Innovusion, Inc. | Multi-wavelength pulse steering in LiDAR systems |
| WO2023224077A1 (ja) | 2022-05-18 | 2023-11-23 | 株式会社小糸製作所 | ToFカメラ、車両用センシングシステム、および車両用灯具 |
| US11860316B1 (en) | 2018-08-21 | 2024-01-02 | Innovusion, Inc. | Systems and method for debris and water obfuscation compensation for use in LiDAR systems |
| WO2024004639A1 (ja) * | 2022-06-29 | 2024-01-04 | ソニーセミコンダクタソリューションズ株式会社 | 受光装置、情報処理装置、測距装置、情報処理方法 |
| US11871130B2 (en) | 2022-03-25 | 2024-01-09 | Innovusion, Inc. | Compact perception device |
| US11927696B2 (en) | 2018-02-21 | 2024-03-12 | Innovusion, Inc. | LiDAR systems with fiber optic coupling |
| US11947047B2 (en) | 2017-01-05 | 2024-04-02 | Seyond, Inc. | Method and system for encoding and decoding LiDAR |
| US11953601B2 (en) | 2016-12-30 | 2024-04-09 | Seyond, Inc. | Multiwavelength lidar design |
| US11965980B2 (en) | 2018-01-09 | 2024-04-23 | Innovusion, Inc. | Lidar detection systems and methods that use multi-plane mirrors |
| US11977185B1 (en) | 2019-04-04 | 2024-05-07 | Seyond, Inc. | Variable angle polygon for use with a LiDAR system |
| US11988773B2 (en) | 2018-02-23 | 2024-05-21 | Innovusion, Inc. | 2-dimensional steering system for lidar systems |
| US12038534B2 (en) | 2021-11-24 | 2024-07-16 | Innovusion (suzhou) Co., Ltd. | Motor for on-vehicle lidar, on-vehicle lidar, and vehicle |
| US12050288B2 (en) | 2017-01-05 | 2024-07-30 | Seyond, Inc. | High resolution LiDAR using high frequency pulse firing |
| US12061289B2 (en) | 2021-02-16 | 2024-08-13 | Innovusion, Inc. | Attaching a glass mirror to a rotating metal motor frame |
| US12072447B2 (en) | 2021-04-22 | 2024-08-27 | Seyond, Inc. | Compact LiDAR design with high resolution and ultra-wide field of view |
| US12204033B2 (en) | 2022-03-25 | 2025-01-21 | Seyond, Inc. | Multimodal detection with integrated sensors |
| US12250469B2 (en) | 2020-04-02 | 2025-03-11 | Koito Manufacturing Co., Ltd. | Gating camera, sensing system for vehicle, and lighting unit for vehicle |
| US12298399B2 (en) | 2018-02-22 | 2025-05-13 | Seyond, Inc. | Receive path for LiDAR system |
| US12313788B1 (en) | 2018-10-09 | 2025-05-27 | Seyond, Inc. | Ultrashort pulses in LiDAR systems |
| US12468017B2 (en) | 2021-10-15 | 2025-11-11 | Seyond, Inc. | Integrated mirror motor galvanometer |
| EP4685517A1 (en) | 2024-07-25 | 2026-01-28 | Canon Kabushiki Kaisha | Image capturing apparatus and image capturing method |
Families Citing this family (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR101946941B1 (ko) * | 2016-06-13 | 2019-04-29 | 엘지전자 주식회사 | 야간 영상표시 장치 |
| US10752218B2 (en) * | 2018-02-22 | 2020-08-25 | Ford Global Technologies, Llc | Camera with cleaning system |
| US11510297B2 (en) | 2018-12-24 | 2022-11-22 | Beijing Voyager Technology Co., Ltd. | Adaptive power control for pulsed laser diodes |
| US10826269B2 (en) | 2018-12-24 | 2020-11-03 | Beijing Voyager Technology Co., Ltd. | Multi-pulse generation for pulsed laser diodes using low-side drivers |
| CN109884588B (zh) * | 2019-01-16 | 2020-11-17 | 北京大学 | 一种脉冲序列的距离度量方法及系统 |
| JP7550382B2 (ja) * | 2019-11-15 | 2024-09-13 | パナソニックIpマネジメント株式会社 | センシングデバイスおよび情報処理装置 |
| JP7450237B2 (ja) * | 2020-03-31 | 2024-03-15 | パナソニックIpマネジメント株式会社 | 情報処理システム、センサシステム、情報処理方法、及びプログラム |
| JP7412254B2 (ja) * | 2020-04-02 | 2024-01-12 | 三菱電機株式会社 | 物体認識装置および物体認識方法 |
| US20230204781A1 (en) * | 2021-12-28 | 2023-06-29 | Nio Technology (Anhui) Co., Ltd. | Time of flight cameras using passive image sensors and existing light sources |
| CN115790517A (zh) * | 2022-12-01 | 2023-03-14 | 梅赛德斯-奔驰集团股份公司 | 用于借助车辆大灯测距的方法和系统 |
Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPS59117981U (ja) * | 1983-01-31 | 1984-08-09 | 日産自動車株式会社 | 車両用光レ−ダ装置 |
| JPH04215089A (ja) * | 1990-02-24 | 1992-08-05 | Eltro Gmbh Ges Strahlungstech | 見通し距離の検出方法 |
| JPH09257927A (ja) * | 1996-03-18 | 1997-10-03 | Nissan Motor Co Ltd | 車両用レーダ装置 |
| JPH09274076A (ja) * | 1996-04-04 | 1997-10-21 | Denso Corp | 反射測定装置及びこれを利用した車間距離制御装置 |
| JP2004157061A (ja) * | 2002-11-08 | 2004-06-03 | Nikon-Trimble Co Ltd | 距離測定装置 |
| JP2010054461A (ja) * | 2008-08-29 | 2010-03-11 | Calsonic Kansei Corp | 車両用距離画像データ生成装置 |
| JP2010066221A (ja) * | 2008-09-12 | 2010-03-25 | Calsonic Kansei Corp | 車両用距離画像データ生成装置 |
| JP2013096905A (ja) * | 2011-11-02 | 2013-05-20 | Denso Corp | 測距装置 |
| WO2014178376A1 (ja) * | 2013-04-30 | 2014-11-06 | 三菱電機株式会社 | レーザレーダ装置 |
Family Cites Families (80)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US3895388A (en) | 1973-10-25 | 1975-07-15 | Ibm | Adaptive illuminator |
| JPS56164969A (en) | 1980-05-23 | 1981-12-18 | Koden Electronics Co Ltd | Reflection searching device |
| JPS59198377A (ja) | 1983-04-27 | 1984-11-10 | Nippon Soken Inc | 車両用障害物検知装置 |
| US5029009A (en) | 1989-05-08 | 1991-07-02 | Kaman Aerospace Corporation | Imaging camera with adaptive range gating |
| JPH0743451A (ja) | 1993-08-02 | 1995-02-14 | Oki Electric Ind Co Ltd | レーダ映像作成装置 |
| JPH07325152A (ja) | 1994-05-31 | 1995-12-12 | Nikon Corp | 距離測定装置 |
| JPH0865690A (ja) | 1994-08-24 | 1996-03-08 | Sony Tektronix Corp | カラー静止画像撮影装置 |
| JPH10132932A (ja) | 1996-10-28 | 1998-05-22 | Unyusho Daiichi Kowan Kensetsukyoku | 3原色水中レーザー視認装置 |
| KR100268048B1 (ko) | 1996-10-28 | 2000-11-01 | 고바야시 마사키 | 수중레이저영상장치 |
| JP2000172995A (ja) | 1998-12-04 | 2000-06-23 | Sumitomo Electric Ind Ltd | 物体検出装置 |
| US6311020B1 (en) | 1998-12-21 | 2001-10-30 | Olympus Optical Co., Ltd. | Camera having autofocusing function and self timer function |
| KR100823047B1 (ko) | 2000-10-02 | 2008-04-18 | 가부시키가이샤 한도오따이 에네루기 켄큐쇼 | 자기발광 장치 및 그 구동 방법 |
| JP2002139304A (ja) | 2000-10-30 | 2002-05-17 | Honda Motor Co Ltd | 距離測定装置、及び距離測定方法 |
| JP2002131016A (ja) | 2000-10-27 | 2002-05-09 | Honda Motor Co Ltd | 距離測定装置、及び距離測定方法 |
| JP4530571B2 (ja) | 2001-04-16 | 2010-08-25 | Hoya株式会社 | 3次元画像検出装置 |
| US6730913B2 (en) * | 2002-02-21 | 2004-05-04 | Ford Global Technologies, Llc | Active night vision system for vehicles employing short-pulse laser illumination and a gated camera for image capture |
| WO2005076037A1 (en) * | 2004-02-04 | 2005-08-18 | Elbit Systems Ltd. | Gated imaging |
| KR100726142B1 (ko) | 2004-02-18 | 2007-06-13 | 마쯔시다덴기산교 가부시키가이샤 | 화상 보정 방법 및 화상 보정 장치 |
| JP4490716B2 (ja) | 2004-03-26 | 2010-06-30 | アイシン精機株式会社 | 車載カメラ用補助照明装置 |
| JP4509704B2 (ja) | 2004-09-03 | 2010-07-21 | 株式会社小糸製作所 | 車両用灯具の点灯制御回路 |
| JP4379728B2 (ja) | 2005-01-31 | 2009-12-09 | カシオ計算機株式会社 | 撮像装置及びそのプログラム |
| JP4478599B2 (ja) | 2005-03-22 | 2010-06-09 | キヤノン株式会社 | 撮像装置 |
| JP2007232498A (ja) | 2006-02-28 | 2007-09-13 | Hitachi Ltd | 障害物検知システム |
| JP4321540B2 (ja) | 2006-03-30 | 2009-08-26 | 株式会社豊田中央研究所 | 物体検出装置 |
| JP4730267B2 (ja) | 2006-07-04 | 2011-07-20 | 株式会社デンソー | 車両用視界状況判定装置 |
| JP2008070999A (ja) | 2006-09-13 | 2008-03-27 | Hitachi Ltd | 車両の障害物検出装置及びそれを搭載した自動車 |
| JP2008166412A (ja) | 2006-12-27 | 2008-07-17 | Koito Mfg Co Ltd | 発光素子駆動回路及び車両用灯具 |
| GB0701869D0 (en) | 2007-01-31 | 2007-03-14 | Cambridge Consultants | Adaptive radar |
| US7667824B1 (en) | 2007-02-06 | 2010-02-23 | Alpha Technology, LLC | Range gated shearography systems and related methods |
| JP2008298741A (ja) | 2007-06-04 | 2008-12-11 | Toyota Central R&D Labs Inc | 距離計測装置及び距離計測方法 |
| JP2009031165A (ja) | 2007-07-27 | 2009-02-12 | Toyota Motor Corp | パルスレーダ装置 |
| JP5092613B2 (ja) | 2007-08-06 | 2012-12-05 | 日産自動車株式会社 | 距離計測方法および装置、ならびに距離計測装置を備えた車両 |
| JP2009092555A (ja) | 2007-10-10 | 2009-04-30 | Denso Corp | パルスレーダ装置 |
| JP4359710B2 (ja) | 2008-02-04 | 2009-11-04 | 本田技研工業株式会社 | 車両周辺監視装置、車両、車両周辺監視用プログラム、車両周辺監視方法 |
| JP5552212B2 (ja) | 2008-02-14 | 2014-07-16 | トヨタ自動車株式会社 | レーダー装置 |
| JP2009257983A (ja) | 2008-04-18 | 2009-11-05 | Calsonic Kansei Corp | 車両用距離画像データ生成装置および車両用距離画像データの生成方法 |
| JP2009257981A (ja) * | 2008-04-18 | 2009-11-05 | Calsonic Kansei Corp | 車両用距離画像データ生成装置 |
| JP2009258015A (ja) | 2008-04-18 | 2009-11-05 | Calsonic Kansei Corp | 車両用距離画像データ生成装置および車両用距離画像データの生成方法 |
| CN101324749B (zh) | 2008-07-24 | 2010-07-28 | 上海交通大学 | 在纹理平面上进行投影显示的方法 |
| JP5146674B2 (ja) | 2008-08-22 | 2013-02-20 | トヨタ自動車株式会社 | レーダ干渉回避装置、及びレーダ干渉回避方法 |
| JP2010061304A (ja) | 2008-09-02 | 2010-03-18 | Calsonic Kansei Corp | 車両用距離画像データ生成装置 |
| JP5457684B2 (ja) | 2009-01-20 | 2014-04-02 | 株式会社小糸製作所 | 車両用灯具の点灯制御装置 |
| JP2010170449A (ja) | 2009-01-26 | 2010-08-05 | Calsonic Kansei Corp | 車両用距離画像データ生成装置及び方法 |
| JP2010212042A (ja) | 2009-03-10 | 2010-09-24 | Calsonic Kansei Corp | Led照明装置 |
| JP2010256291A (ja) | 2009-04-28 | 2010-11-11 | Toyota Motor Corp | 距離画像撮影装置 |
| JP2011013950A (ja) | 2009-07-02 | 2011-01-20 | Furuno Electric Co Ltd | 擬似乱数出力装置、送信装置及び探知機 |
| CN101644887A (zh) | 2009-08-28 | 2010-02-10 | 中国工程物理研究院流体物理研究所 | 一种门控型像增强器曝光时间的测量方法及其测量系统 |
| DE102009045600B4 (de) | 2009-10-12 | 2021-11-04 | pmdtechnologies ag | Kamerasystem |
| KR101608867B1 (ko) | 2009-10-15 | 2016-04-04 | 삼성전자주식회사 | 광학 조립체 및 이를 구비한 촬상 장치 |
| JP2011136651A (ja) | 2009-12-28 | 2011-07-14 | Koito Mfg Co Ltd | 車両用灯具システム |
| EP2544449B1 (en) | 2010-03-01 | 2016-03-16 | Honda Motor Co., Ltd. | Vehicle perimeter monitoring device |
| CA2792050C (en) | 2010-03-02 | 2017-08-15 | Elbit Systems Ltd. | Image gated camera for detecting objects in a marine environment |
| WO2011115142A1 (ja) | 2010-03-19 | 2011-09-22 | Okiセミコンダクタ株式会社 | 画像処理装置、方法、プログラム及び記録媒体 |
| JP5523954B2 (ja) | 2010-06-30 | 2014-06-18 | 富士通テン株式会社 | 画像表示システム |
| KR101753312B1 (ko) | 2010-09-17 | 2017-07-03 | 삼성전자주식회사 | 뎁스 영상 생성 장치 및 방법 |
| US8681255B2 (en) | 2010-09-28 | 2014-03-25 | Microsoft Corporation | Integrated low power depth camera and projection device |
| JP2012165090A (ja) | 2011-02-04 | 2012-08-30 | Sanyo Electric Co Ltd | 撮像装置およびその制御方法 |
| US20120249781A1 (en) | 2011-04-04 | 2012-10-04 | Richard Vollmerhausen | Method consisting of pulsing a laser communicating with a gated-sensor so as to reduce speckle, reduce scintillation, improve laser beam uniformity and improve eye safety in laser range gated imagery |
| JP5760220B2 (ja) | 2011-04-11 | 2015-08-05 | オプテックス株式会社 | 距離画像カメラおよびこれを用いた対象物の面形状認識方法 |
| CN102737389A (zh) | 2011-04-13 | 2012-10-17 | 南京大学 | 一种针对颜色编码结构光扫描系统的自适应反射率校正方法 |
| US9182491B2 (en) | 2011-05-06 | 2015-11-10 | Waikatolink Limited | Selective distance range imaging |
| CN202305416U (zh) | 2011-09-29 | 2012-07-04 | 杭州力弘电子有限公司 | 智能图像传感器系统 |
| CN104041022B (zh) | 2012-01-17 | 2016-06-29 | 本田技研工业株式会社 | 图像处理装置 |
| US9720089B2 (en) | 2012-01-23 | 2017-08-01 | Microsoft Technology Licensing, Llc | 3D zoom imager |
| JP5511863B2 (ja) | 2012-02-03 | 2014-06-04 | 三菱電機株式会社 | レーダ装置とその制御方法 |
| US9723233B2 (en) | 2012-04-18 | 2017-08-01 | Brightway Vision Ltd. | Controllable gated sensor |
| WO2013179280A1 (en) | 2012-05-29 | 2013-12-05 | Brightway Vision Ltd. | Gated imaging using an adaptive depth of field |
| US20150125032A1 (en) | 2012-06-13 | 2015-05-07 | Panasonic Intellectual Property Management Co., Ltd. | Object detection device |
| CN104380166B (zh) | 2012-07-12 | 2016-06-08 | 奥林巴斯株式会社 | 摄像装置 |
| JP6161262B2 (ja) | 2012-11-19 | 2017-07-12 | 株式会社ミツトヨ | 画像測定機のled照明方法及び装置 |
| WO2014097539A1 (ja) | 2012-12-20 | 2014-06-26 | パナソニック株式会社 | 3次元測定装置および3次元測定方法 |
| JP6094252B2 (ja) | 2013-02-20 | 2017-03-15 | 株式会社デンソー | 道路標識認識装置 |
| US9110169B2 (en) | 2013-03-08 | 2015-08-18 | Advanced Scientific Concepts, Inc. | LADAR enabled impact mitigation system |
| IL227265A0 (en) | 2013-06-30 | 2013-12-31 | Brightway Vision Ltd | Smart flash for the camera |
| US10203399B2 (en) | 2013-11-12 | 2019-02-12 | Big Sky Financial Corporation | Methods and apparatus for array based LiDAR systems with reduced interference |
| CN103744078B (zh) | 2013-12-30 | 2016-04-13 | 中国科学技术大学 | 一种基于不同码速随机跳频的微波凝视关联成像装置 |
| JP6320050B2 (ja) | 2014-01-17 | 2018-05-09 | オムロンオートモーティブエレクトロニクス株式会社 | レーザレーダ装置 |
| US9866208B2 (en) | 2015-06-15 | 2018-01-09 | Microsoft Technology Lincensing, LLC | Precision measurements and calibrations for timing generators |
| US10408922B2 (en) | 2015-07-10 | 2019-09-10 | Ams Sensors Singapore Pte. Ltd. | Optoelectronic module with low- and high-power illumination modes |
| US10912516B2 (en) | 2015-12-07 | 2021-02-09 | Panasonic Corporation | Living body information measurement device, living body information measurement method, and storage medium storing program |
2016
- 2016-12-01 WO PCT/JP2016/085814 patent/WO2017110417A1/ja not_active Ceased
- 2016-12-01 CN CN201680075214.2A patent/CN108431630A/zh active Pending
- 2016-12-01 EP EP16878310.8A patent/EP3396413A4/en not_active Withdrawn
- 2016-12-01 US US16/065,074 patent/US11194023B2/en active Active
- 2016-12-01 JP JP2017557836A patent/JP6851985B2/ja active Active
Patent Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPS59117981U (ja) * | 1983-01-31 | 1984-08-09 | 日産自動車株式会社 | 車両用光レ−ダ装置 |
| JPH04215089A (ja) * | 1990-02-24 | 1992-08-05 | Eltro Gmbh Ges Strahlungstech | 見通し距離の検出方法 |
| JPH09257927A (ja) * | 1996-03-18 | 1997-10-03 | Nissan Motor Co Ltd | 車両用レーダ装置 |
| JPH09274076A (ja) * | 1996-04-04 | 1997-10-21 | Denso Corp | 反射測定装置及びこれを利用した車間距離制御装置 |
| JP2004157061A (ja) * | 2002-11-08 | 2004-06-03 | Nikon-Trimble Co Ltd | 距離測定装置 |
| JP2010054461A (ja) * | 2008-08-29 | 2010-03-11 | Calsonic Kansei Corp | 車両用距離画像データ生成装置 |
| JP2010066221A (ja) * | 2008-09-12 | 2010-03-25 | Calsonic Kansei Corp | 車両用距離画像データ生成装置 |
| JP2013096905A (ja) * | 2011-11-02 | 2013-05-20 | Denso Corp | 測距装置 |
| WO2014178376A1 (ja) * | 2013-04-30 | 2014-11-06 | 三菱電機株式会社 | レーザレーダ装置 |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3396413A4 * |
Cited By (100)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11953601B2 (en) | 2016-12-30 | 2024-04-09 | Seyond, Inc. | Multiwavelength lidar design |
| US11782132B2 (en) | 2016-12-31 | 2023-10-10 | Innovusion, Inc. | 2D scanning high precision LiDAR using combination of rotating concave mirror and beam steering devices |
| US11899134B2 (en) | 2016-12-31 | 2024-02-13 | Innovusion, Inc. | 2D scanning high precision lidar using combination of rotating concave mirror and beam steering devices |
| US12276755B2 (en) | 2016-12-31 | 2025-04-15 | Seyond, Inc. | 2D scanning high precision LiDAR using combination of rotating concave mirror and beam steering devices |
| US11977183B2 (en) | 2016-12-31 | 2024-05-07 | Seyond, Inc. | 2D scanning high precision LiDAR using combination of rotating concave mirror and beam steering devices |
| US12248095B2 (en) | 2016-12-31 | 2025-03-11 | Seyond, Inc. | 2D scanning high precision LiDAR using combination of rotating concave mirror and beam steering devices |
| US12241999B2 (en) | 2016-12-31 | 2025-03-04 | Seyond, Inc. | 2D scanning high precision LiDAR using combination of rotating concave mirror and beam steering devices |
| US11782131B2 (en) | 2016-12-31 | 2023-10-10 | Innovusion, Inc. | 2D scanning high precision LiDAR using combination of rotating concave mirror and beam steering devices |
| US11947047B2 (en) | 2017-01-05 | 2024-04-02 | Seyond, Inc. | Method and system for encoding and decoding LiDAR |
| US11604279B2 (en) | 2017-01-05 | 2023-03-14 | Innovusion, Inc. | MEMS beam steering and fisheye receiving lens for LiDAR system |
| US12050288B2 (en) | 2017-01-05 | 2024-07-30 | Seyond, Inc. | High resolution LiDAR using high frequency pulse firing |
| US11460554B2 (en) | 2017-10-19 | 2022-10-04 | Innovusion, Inc. | LiDAR with large dynamic range |
| US11340339B2 (en) | 2017-12-22 | 2022-05-24 | Waymo Llc | Systems and methods for adaptive range coverage using LIDAR |
| CN111512183A (zh) * | 2017-12-22 | 2020-08-07 | 伟摩有限责任公司 | 使用lidar进行自适应范围覆盖的系统和方法 |
| US12189058B2 (en) | 2017-12-22 | 2025-01-07 | Seyond, Inc. | High resolution LiDAR using high frequency pulse firing |
| EP3707531A4 (en) * | 2017-12-22 | 2021-08-11 | Waymo LLC | SYSTEMS AND METHODS FOR ADAPTIVE AREA COVERAGE USING LIDAR |
| US11841464B2 (en) | 2017-12-22 | 2023-12-12 | Waymo Llc | Systems and methods for adaptive range coverage using LIDAR |
| CN111512183B (zh) * | 2017-12-22 | 2023-12-12 | 伟摩有限责任公司 | 使用lidar进行自适应范围覆盖的系统和方法 |
| US12222452B2 (en) | 2017-12-22 | 2025-02-11 | Waymo Llc | Systems and methods for adaptive range coverage using LIDAR |
| WO2019125607A1 (en) | 2017-12-22 | 2019-06-27 | Waymo Llc | Systems and methods for adaptive range coverage using lidar |
| US11493601B2 (en) | 2017-12-22 | 2022-11-08 | Innovusion, Inc. | High density LIDAR scanning |
| US12078755B2 (en) | 2018-01-09 | 2024-09-03 | Seyond, Inc. | LiDAR detection systems and methods that use multi-plane mirrors |
| US11977184B2 (en) | 2018-01-09 | 2024-05-07 | Seyond, Inc. | LiDAR detection systems and methods that use multi-plane mirrors |
| US11965980B2 (en) | 2018-01-09 | 2024-04-23 | Innovusion, Inc. | Lidar detection systems and methods that use multi-plane mirrors |
| US11675050B2 (en) | 2018-01-09 | 2023-06-13 | Innovusion, Inc. | LiDAR detection systems and methods |
| US11927696B2 (en) | 2018-02-21 | 2024-03-12 | Innovusion, Inc. | LiDAR systems with fiber optic coupling |
| WO2019165130A1 (en) * | 2018-02-21 | 2019-08-29 | Innovusion Ireland Limited | Lidar detection systems and methods with high repetition rate to observe far objects |
| US11391823B2 (en) | 2018-02-21 | 2022-07-19 | Innovusion, Inc. | LiDAR detection systems and methods with high repetition rate to observe far objects |
| US11782138B2 (en) | 2018-02-21 | 2023-10-10 | Innovusion, Inc. | LiDAR detection systems and methods with high repetition rate to observe far objects |
| US12298399B2 (en) | 2018-02-22 | 2025-05-13 | Seyond, Inc. | Receive path for LiDAR system |
| US11988773B2 (en) | 2018-02-23 | 2024-05-21 | Innovusion, Inc. | 2-dimensional steering system for lidar systems |
| US11808888B2 (en) | 2018-02-23 | 2023-11-07 | Innovusion, Inc. | Multi-wavelength pulse steering in LiDAR systems |
| US12085673B2 (en) | 2018-02-23 | 2024-09-10 | Seyond, Inc. | Distributed LiDAR systems |
| US11422234B2 (en) | 2018-02-23 | 2022-08-23 | Innovusion, Inc. | Distributed lidar systems |
| US11567182B2 (en) | 2018-03-09 | 2023-01-31 | Innovusion, Inc. | LiDAR safety systems and methods |
| US12032100B2 (en) | 2018-03-09 | 2024-07-09 | Seyond, Inc. | Lidar safety systems and methods |
| US11789132B2 (en) | 2018-04-09 | 2023-10-17 | Innovusion, Inc. | Compensation circuitry for lidar receiver systems and method of use thereof |
| US11569632B2 (en) | 2018-04-09 | 2023-01-31 | Innovusion, Inc. | Lidar systems and methods for exercising precise control of a fiber laser |
| US12529773B2 (en) | 2018-04-09 | 2026-01-20 | Seyond, Inc. | Compensation circuitry for LiDAR receiver systems and method of use thereof |
| US11289873B2 (en) | 2018-04-09 | 2022-03-29 | Innovusion Ireland Limited | LiDAR systems and methods for exercising precise control of a fiber laser |
| US11860313B2 (en) | 2018-06-15 | 2024-01-02 | Innovusion, Inc. | LiDAR systems and methods for focusing on ranges of interest |
| US11675053B2 (en) | 2018-06-15 | 2023-06-13 | Innovusion, Inc. | LiDAR systems and methods for focusing on ranges of interest |
| US12276759B2 (en) | 2018-06-15 | 2025-04-15 | Seyond, Inc. | LiDAR systems and methods for focusing on ranges of interest |
| US11860316B1 (en) | 2018-08-21 | 2024-01-02 | Innovusion, Inc. | Systems and method for debris and water obfuscation compensation for use in LiDAR systems |
| US11609336B1 (en) | 2018-08-21 | 2023-03-21 | Innovusion, Inc. | Refraction compensation for use in LiDAR systems |
| US11579300B1 (en) | 2018-08-21 | 2023-02-14 | Innovusion, Inc. | Dual lens receive path for LiDAR system |
| US12050269B2 (en) | 2018-08-21 | 2024-07-30 | Seyond, Inc. | Dual lens receive path for LiDAR system |
| US11796645B1 (en) | 2018-08-24 | 2023-10-24 | Innovusion, Inc. | Systems and methods for tuning filters for use in lidar systems |
| US11614526B1 (en) | 2018-08-24 | 2023-03-28 | Innovusion, Inc. | Virtual windows for LIDAR safety systems and methods |
| US11940570B2 (en) | 2018-08-24 | 2024-03-26 | Seyond, Inc. | Virtual windows for LiDAR safety systems and methods |
| US11914076B2 (en) | 2018-08-30 | 2024-02-27 | Innovusion, Inc. | Solid state pulse steering in LiDAR systems |
| US11579258B1 (en) | 2018-08-30 | 2023-02-14 | Innovusion, Inc. | Solid state pulse steering in lidar systems |
| US12313788B1 (en) | 2018-10-09 | 2025-05-27 | Seyond, Inc. | Ultrashort pulses in LiDAR systems |
| US11644543B2 (en) | 2018-11-14 | 2023-05-09 | Innovusion, Inc. | LiDAR systems and methods that use a multi-facet mirror |
| US11686824B2 (en) | 2018-11-14 | 2023-06-27 | Innovusion, Inc. | LiDAR systems that use a multi-facet mirror |
| US11675055B2 (en) | 2019-01-10 | 2023-06-13 | Innovusion, Inc. | LiDAR systems and methods with beam steering and wide angle signal detection |
| US12158545B2 (en) | 2019-01-10 | 2024-12-03 | Seyond, Inc. | Lidar systems and methods with beam steering and wide angle signal detection |
| US12392901B2 (en) | 2019-01-17 | 2025-08-19 | Koito Manufacturing Co., Ltd. | In-vehicle imaging apparatus |
| JP7463297B2 (ja) | 2019-01-17 | 2024-04-08 | 株式会社小糸製作所 | 車載用イメージング装置、車両用灯具、自動車 |
| JPWO2020149140A1 (ja) * | 2019-01-17 | 2021-12-02 | 株式会社小糸製作所 | 車載用イメージング装置、車両用灯具、自動車 |
| WO2020149140A1 (ja) * | 2019-01-17 | 2020-07-23 | 株式会社小糸製作所 | 車載用イメージング装置、車両用灯具、自動車 |
| US11762065B2 (en) | 2019-02-11 | 2023-09-19 | Innovusion, Inc. | Multiple beam generation from a single source beam for use with a lidar system |
| JPWO2020184447A1 (ja) * | 2019-03-11 | 2020-09-17 | ||
| US12262101B2 (en) | 2019-03-11 | 2025-03-25 | Koito Manufacturing Co., Ltd. | Gating camera |
| WO2020184447A1 (ja) | 2019-03-11 | 2020-09-17 | 株式会社小糸製作所 | ゲーティングカメラ、自動車、車両用灯具、物体識別システム、演算処理装置、物体識別方法、画像表示システム、検査方法、撮像装置、画像処理装置 |
| US11977185B1 (en) | 2019-04-04 | 2024-05-07 | Seyond, Inc. | Variable angle polygon for use with a LiDAR system |
| US12135394B2 (en) * | 2019-09-26 | 2024-11-05 | Koito Manufacturing Co., Ltd. | Gating camera |
| US20220214434A1 (en) * | 2019-09-26 | 2022-07-07 | Koito Manufacturing Co., Ltd. | Gating camera |
| US12250469B2 (en) | 2020-04-02 | 2025-03-11 | Koito Manufacturing Co., Ltd. | Gating camera, sensing system for vehicle, and lighting unit for vehicle |
| WO2022014416A1 (ja) | 2020-07-14 | 2022-01-20 | 株式会社小糸製作所 | ゲーティングカメラ、車両用センシングシステム、車両用灯具 |
| WO2022163721A1 (ja) | 2021-01-27 | 2022-08-04 | 株式会社小糸製作所 | ゲーティングカメラ、車両用センシングシステム、車両用灯具 |
| US12061289B2 (en) | 2021-02-16 | 2024-08-13 | Innovusion, Inc. | Attaching a glass mirror to a rotating metal motor frame |
| US11567213B2 (en) | 2021-02-18 | 2023-01-31 | Innovusion, Inc. | Dual shaft axial flux motor for optical scanners |
| US11422267B1 (en) | 2021-02-18 | 2022-08-23 | Innovusion, Inc. | Dual shaft axial flux motor for optical scanners |
| US11789128B2 (en) | 2021-03-01 | 2023-10-17 | Innovusion, Inc. | Fiber-based transmitter and receiver channels of light detection and ranging systems |
| JPWO2022196779A1 (ja) * | 2021-03-17 | 2022-09-22 | ||
| JP2022147087A (ja) * | 2021-03-23 | 2022-10-06 | 株式会社リコー | 撮像装置、撮像方法および情報処理装置 |
| JP7031771B1 (ja) | 2021-03-23 | 2022-03-08 | 株式会社リコー | 撮像装置、撮像方法および情報処理装置 |
| US11555895B2 (en) | 2021-04-20 | 2023-01-17 | Innovusion, Inc. | Dynamic compensation to polygon and motor tolerance using galvo control profile |
| US12146988B2 (en) | 2021-04-20 | 2024-11-19 | Innovusion, Inc. | Dynamic compensation to polygon and motor tolerance using galvo control profile |
| US11614521B2 (en) | 2021-04-21 | 2023-03-28 | Innovusion, Inc. | LiDAR scanner with pivot prism and mirror |
| US12072447B2 (en) | 2021-04-22 | 2024-08-27 | Seyond, Inc. | Compact LiDAR design with high resolution and ultra-wide field of view |
| US11662439B2 (en) | 2021-04-22 | 2023-05-30 | Innovusion, Inc. | Compact LiDAR design with high resolution and ultra-wide field of view |
| US11624806B2 (en) | 2021-05-12 | 2023-04-11 | Innovusion, Inc. | Systems and apparatuses for mitigating LiDAR noise, vibration, and harshness |
| US11662440B2 (en) | 2021-05-21 | 2023-05-30 | Innovusion, Inc. | Movement profiles for smart scanning using galvonometer mirror inside LiDAR scanner |
| WO2023276223A1 (ja) * | 2021-06-30 | 2023-01-05 | ソニーセミコンダクタソリューションズ株式会社 | 測距装置、測距方法及び制御装置 |
| US11768294B2 (en) | 2021-07-09 | 2023-09-26 | Innovusion, Inc. | Compact lidar systems for vehicle contour fitting |
| JPWO2023013777A1 (ja) * | 2021-08-05 | 2023-02-09 | ||
| WO2023013777A1 (ja) | 2021-08-05 | 2023-02-09 | 株式会社小糸製作所 | ゲーティングカメラ、車両用センシングシステム、車両用灯具 |
| WO2023013776A1 (ja) | 2021-08-05 | 2023-02-09 | 株式会社小糸製作所 | ゲーティングカメラ、車両用センシングシステム、車両用灯具 |
| JP7804684B2 (ja) | 2021-08-05 | 2026-01-22 | 株式会社小糸製作所 | ゲーティングカメラ、車両用センシングシステム、車両用灯具 |
| US12375819B2 (en) | 2021-08-05 | 2025-07-29 | Koito Manufacturing Co., Ltd. | Gating camera, sensing system for vehicle, vehicle lamp |
| US12468017B2 (en) | 2021-10-15 | 2025-11-11 | Seyond, Inc. | Integrated mirror motor galvanometer |
| US12038534B2 (en) | 2021-11-24 | 2024-07-16 | Innovusion (suzhou) Co., Ltd. | Motor for on-vehicle lidar, on-vehicle lidar, and vehicle |
| US11871130B2 (en) | 2022-03-25 | 2024-01-09 | Innovusion, Inc. | Compact perception device |
| US12204033B2 (en) | 2022-03-25 | 2025-01-21 | Seyond, Inc. | Multimodal detection with integrated sensors |
| JP2023147111A (ja) * | 2022-03-29 | 2023-10-12 | ミラクシアエッジテクノロジー株式会社 | 測距装置及び測距方法 |
| WO2023224077A1 (ja) | 2022-05-18 | 2023-11-23 | 株式会社小糸製作所 | ToFカメラ、車両用センシングシステム、および車両用灯具 |
| WO2024004639A1 (ja) * | 2022-06-29 | 2024-01-04 | ソニーセミコンダクタソリューションズ株式会社 | 受光装置、情報処理装置、測距装置、情報処理方法 |
| EP4685517A1 (en) | 2024-07-25 | 2026-01-28 | Canon Kabushiki Kaisha | Image capturing apparatus and image capturing method |
Also Published As
| Publication number | Publication date |
|---|---|
| US20190004150A1 (en) | 2019-01-03 |
| EP3396413A4 (en) | 2019-08-21 |
| US11194023B2 (en) | 2021-12-07 |
| EP3396413A1 (en) | 2018-10-31 |
| CN108431630A (zh) | 2018-08-21 |
| JPWO2017110417A1 (ja) | 2018-10-04 |
| JP6851985B2 (ja) | 2021-03-31 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6851985B2 (ja) | 車両用画像取得装置、制御装置、車両用画像取得装置または制御装置を備えた車両および車両用画像取得方法 | |
| JP6868570B2 (ja) | 車両用画像取得装置、制御装置、車両用画像取得装置または制御装置を備えた車両および車両用画像取得方法 | |
| JP6471528B2 (ja) | 物体認識装置、物体認識方法 | |
| JP6766072B2 (ja) | 車両用センサおよびそれを備えた車両 | |
| JP6766071B2 (ja) | 車両用画像取得装置およびそれを備えた車両 | |
| CN100472322C (zh) | 前方摄影装置 | |
| JP2009257981A (ja) | 車両用距離画像データ生成装置 | |
| JP6851986B2 (ja) | 車両用画像取得装置、制御装置、車両用画像取得装置または制御装置を備えた車両および車両用画像取得方法 | |
| JP7594539B2 (ja) | ゲーティングカメラ、自動車、車両用灯具、画像処理装置、画像処理方法 | |
| JP2009257983A (ja) | 車両用距離画像データ生成装置および車両用距離画像データの生成方法 | |
| WO2020175117A1 (ja) | 測距装置、測距方法、並びにプログラム | |
| JP6942637B2 (ja) | 車両用画像取得装置、および車両用画像取得装置を備えた車両 | |
| WO2020175118A1 (ja) | 測距装置、測距方法、並びにプログラム | |
| WO2020137318A1 (ja) | 測定装置、測距装置および測定方法 | |
| JP2009257982A (ja) | 車両用距離画像データ生成装置 | |
| JP2010054461A (ja) | 車両用距離画像データ生成装置 | |
| JP2009282906A (ja) | 車両用距離画像データ生成装置 | |
| JP2010071704A (ja) | 車両用距離画像データ生成装置及び方法 | |
| JP2009258015A (ja) | 車両用距離画像データ生成装置および車両用距離画像データの生成方法 | |
| US20230302987A1 (en) | Method for Object Tracking at Least One Object, Control Device for Carrying Out a Method of This Kind, Object Tracking Device Having a Control Device of This Kind and Motor Vehicle Having an Object Tracking Device of This Kind | |
| JP2009259070A (ja) | 車両用距離画像データ生成装置 | |
| JP2022175591A (ja) | 測距装置及び測距システム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16878310 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2017557836 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2016878310 Country of ref document: EP |
|
| ENP | Entry into the national phase |
Ref document number: 2016878310 Country of ref document: EP Effective date: 20180723 |