WO2021029270A1 - Measuring device and distance measuring device - Google Patents
Measuring device and distance measuring device
- Publication number
- WO2021029270A1 (PCT/JP2020/029766)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- distance
- measured
- light
- measuring device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
- G01C3/085—Use of electric radiation detectors with electronic parallax measurement
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4808—Evaluating distance, position or velocity data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
- G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4811—Constructional features, e.g. arrangements of optical elements common to transmitter and receiver
- G01S7/4813—Housing arrangements
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4816—Constructional features, e.g. arrangements of optical elements of receivers alone
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/4865—Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/4865—Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
- G01S7/4866—Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak by fitting a model or function to the received signal
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/10—Machine learning using kernel methods, e.g. support vector machines [SVM]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/865—Combination of radar systems with lidar systems
Definitions
- the present disclosure relates to a measuring device and a ranging device.
- As a distance measuring method for measuring the distance to an object to be measured using light, a method called the direct ToF (direct Time of Flight) method is known.
- In the direct ToF method, the distance to the object to be measured is obtained based on the time from the emission timing, at which the light source emits light, to the reception timing, at which the reflected light reflected by the object to be measured is received by the light receiving element.
- More specifically, the time from the emission timing to the light receiving timing at which the light is received by the light receiving element is measured, and time information indicating the measured time is stored in a memory.
- This measurement is executed a plurality of times, and a histogram is created based on the plurality of pieces of time information stored in the memory by the multiple measurements. The distance to the object to be measured is obtained based on this histogram.
- The present disclosure proposes a measuring device and a distance measuring device that can reduce the processing time and the labor of parameter adjustment while improving accuracy.
- a measuring device includes a plurality of light receiving elements and a recognition unit.
- the plurality of light receiving elements are arranged on the first substrate and output a signal when the light emitted from the light source and reflected by the object to be measured is received.
- the recognition unit is arranged on a second substrate different from the first substrate, and recognizes information on the distance to the object to be measured by using a machine learning model based on the output signals of the plurality of light receiving elements.
- the processing time and the labor of parameter adjustment can be reduced, and the accuracy can be improved.
- the effects described here are not necessarily limited, and may be any of the effects described in the present disclosure.
- Brief description of the drawings (fragment): a figure showing another example of the distribution of the time information of a pixel; a figure showing a use example of the measuring device to which the embodiment and modifications are applicable; a block diagram showing a schematic configuration example of a vehicle control system, an example of a mobile body control system to which the technique of the present disclosure can be applied; and a figure showing an example of the installation positions of the imaging unit.
- The direct ToF method is a technique in which light emitted from a light source is reflected by the object to be measured, the reflected light is received by a light receiving element, and distance measurement is performed based on the time difference between the emission timing and the reception timing.
- FIG. 1 is a diagram schematically showing an example of distance measurement by the direct ToF method.
- the distance measuring device 300a includes a light source unit 301a and a light receiving unit 302a.
- the light source unit 301a is, for example, a laser diode, and is driven so as to emit laser light in a pulsed manner.
- the light emitted from the light source unit 301a is reflected by the object to be measured 303a and is received by the light receiving unit 302a as reflected light.
- the light receiving unit 302a includes a light receiving element that converts light into an electric signal by photoelectric conversion, and outputs a signal corresponding to the received light.
- The time when the light source unit 301a emits light is denoted time t_em, and the time when the light receiving unit 302a receives the reflected light reflected by the object to be measured 303a (the light receiving timing) is denoted time t_re.
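- (The text of equation (1), referenced several times below, does not survive in this extract. Assuming it is the standard direct-ToF relation, with c the speed of light, it would read:)

$$D = \frac{c}{2}\,(t_{re} - t_{em}) \qquad \text{(1)}$$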
- the distance measuring device 300a repeats the above-mentioned processing a plurality of times.
- The light receiving unit 302a may include a plurality of light receiving elements, and the distance D may be calculated based on each light receiving timing at which the reflected light is received by each light receiving element.
- The ranging device 300a classifies the time t_m from the emission timing t_em to the light receiving timing at which the light is received by the light receiving unit 302a (called the light receiving time t_m) into classes (bins), and generates a histogram.
- the light received by the light receiving unit 302a during the light receiving time t m is not limited to the reflected light emitted by the light source unit 301a and reflected by the object 303a.
- the ambient light around the ranging device 300a (light receiving unit 302a) is also received by the light receiving unit 302a.
- FIG. 2 is a diagram showing an example histogram based on the time when the light receiving unit 302a receives light.
- the horizontal axis indicates the bin and the vertical axis indicates the frequency for each bin.
- The bins are obtained by classifying the light receiving time t_m for each predetermined unit time d. Specifically, bin #0 covers 0 ≤ t_m < d, bin #1 covers d ≤ t_m < 2×d, bin #2 covers 2×d ≤ t_m < 3×d, ..., and bin #(N−2) covers (N−2)×d ≤ t_m < (N−1)×d.
- the distance measuring device 300a counts the number of times the light receiving time t m is acquired based on the bins to obtain the frequency 310a for each bin, and generates a histogram.
- The light receiving unit 302a also receives light other than the reflected light of the light emitted from the light source unit 301a; an example is the above-mentioned ambient light.
- the portion indicated by the range 311a in the histogram includes the ambient light component due to the ambient light.
- the ambient light is light that is randomly incident on the light receiving unit 302a and becomes noise with respect to the reflected light of interest.
- the target reflected light is light received according to a specific distance, and appears as an active light component 312a in the histogram.
- the bin corresponding to the frequency of the peak in the active light component 312a is the bin corresponding to the distance D of the object to be measured 303a.
- The distance measuring device 300a obtains a representative time of that bin (for example, the central time of the bin) as the time t_re described above, and can calculate the distance D to the object to be measured 303a according to equation (1) described above. In this way, by using a plurality of light reception results, appropriate distance measurement that is robust to random noise can be performed.
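- As a concrete illustration of this conventional pipeline, the following sketch bins repeated light-receiving times into a histogram and converts the peak bin into a distance. It is a minimal reconstruction for illustration, not the patent's implementation; the function name, bin count, and toy data are assumptions.

```python
import numpy as np

C = 299_792_458.0  # speed of light [m/s]

def histogram_distance(t_m, d, num_bins):
    """Classic direct-ToF estimate: bin the light-receiving times t_m
    (seconds after emission) by unit time d, take the peak bin's central
    time as t_re - t_em, and apply equation (1)."""
    counts, edges = np.histogram(t_m, bins=num_bins, range=(0.0, num_bins * d))
    peak = np.argmax(counts)                          # active light component
    t_flight = 0.5 * (edges[peak] + edges[peak + 1])  # central time of the bin
    return C / 2.0 * t_flight

# Toy data: uniform ambient-light noise plus reflections from ~15 m away.
rng = np.random.default_rng(0)
t_m = np.concatenate([rng.uniform(0.0, 200e-9, 500),    # ambient light
                      rng.normal(100e-9, 1e-9, 300)])   # reflected light
print(histogram_distance(t_m, d=1e-9, num_bins=200))    # ≈ 15.0 m
```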
- In the distance measuring method using the above-mentioned histogram, a plurality of light reception results must be acquired before the time t_re, which is the light receiving timing, can be detected, so the detection process takes time. Further, a memory for storing the light receiving times t_m is required to generate the histogram, which increases the circuit scale of the device. Moreover, in order to properly detect the time t_re from the histogram, an appropriate threshold value must be set for each generated histogram, so filtering and threshold setting processes become necessary and the processing load becomes large. In addition, depending on the required distance accuracy, a filter coefficient that can reliably remove noise and a threshold value for appropriately detecting the time t_re are required, so the development man-hours for setting these also increase.
- The distance measuring method using the histogram is therefore desired to be improved from the viewpoints of processing time, processing load, and memory reduction. Accordingly, in the technique of the present disclosure, the distance to the object to be measured is recognized from the light receiving timing (time t_re) of the light receiving unit by using a machine learning model, without generating a histogram.
- FIG. 3 is a diagram for explaining an outline of the distance measuring method according to the embodiment of the present disclosure.
- the distance measuring method shown in FIG. 3 is executed by the measuring device 1 (not shown in FIG. 3). Further, the measuring device 1 has a light receiving unit 302 having a plurality of pixels (light receiving elements).
- the light receiving unit 302 of the measuring device 1 outputs each light receiving timing at which the reflected light from the object to be measured is received for each light receiving element (step S1).
- The measuring device 1 recognizes the time t_re corresponding to the distance to the object to be measured by using a machine learning model based on the light receiving timing distribution 310 (step S2). Specifically, the measuring device 1 trains the machine learning model M in advance by using the distribution of the light receiving timings of the plurality of light receiving elements as input data and the time t_re corresponding to the distance to the object to be measured as correct answer data.
- The measuring device 1 inputs the light receiving timings output by the light receiving unit 302 in step S1 to the machine learning model M as input data, and acquires the output result of the machine learning model M as the time t_re corresponding to the distance to the object to be measured.
- The measuring device 1 generates data including the distance to the object to be measured obtained from the acquired time t_re (step S3).
- Since the measuring device 1 recognizes the time t_re corresponding to the distance to the object to be measured by using a machine learning model that takes the light receiving timings as input, it does not need to generate a histogram, and the time required to measure the distance can be shortened. Further, the processing required for generating the histogram becomes unnecessary, so the measuring device 1 can reduce the processing load for measuring the distance. Furthermore, the memory for generating the histogram becomes unnecessary, so the circuit scale of the measuring device 1 can be reduced.
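- A minimal sketch of this histogram-free flow is given below, assuming PyTorch and an arbitrary small network; the pixel count, layer sizes, and function names are illustrative, and the real model M would be trained as described above.

```python
import torch
import torch.nn as nn

NUM_PIXELS = 1024  # assumed size of the light receiving unit 302

# Stand-in for machine learning model M: per-pixel light-receiving
# timings in, time t_re out, with no intermediate histogram or memory.
model_M = nn.Sequential(
    nn.Linear(NUM_PIXELS, 256), nn.ReLU(),
    nn.Linear(256, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

def recognize_t_re(timings: torch.Tensor) -> float:
    """Step S2: feed the step-S1 timings directly to model M and read
    out the time t_re corresponding to the distance."""
    with torch.no_grad():
        return model_M(timings).item()

t_re = recognize_t_re(torch.rand(NUM_PIXELS))  # step S1 output would go here
print(t_re)
```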
- FIG. 4 is a block diagram showing an example of the configuration of the distance measuring system according to the embodiment of the present disclosure.
- the distance measuring system 6 shown in FIG. 4 includes a measuring device 1, a light source 2, a storage device 7, a control device 4, an optical system 5, and an imaging device 8.
- the light source 2 is, for example, a laser diode, and is driven so as to emit laser light in a pulsed manner.
- A VCSEL (Vertical Cavity Surface Emitting Laser) may be used as the light source 2. Alternatively, a configuration in which laser diodes are arrayed in a line and the laser light emitted from the laser diode array is scanned in a direction perpendicular to the line may be applied.
- the measuring device 1 includes a plurality of light receiving elements.
- the plurality of light receiving elements are arranged in a two-dimensional lattice, for example, to form a light receiving surface.
- The measuring device 1 recognizes the time t_re corresponding to the distance to the object to be measured by using machine learning based on the output signals of the plurality of light receiving elements, and generates distance data.
- the calculated distance data is stored in, for example, the storage device 7.
- the optical system 5 guides light incident from the outside to a light receiving surface included in the measuring device 1.
- the device including the light source 2 and the measuring device 1 is also referred to as a distance measuring device.
- the image pickup device 8 is, for example, an RGB camera that captures an RGB image of the subject space in which the object to be measured exists.
- the control device 4 controls the overall operation of the ranging system 6.
- the control device 4 supplies the measurement device 1 with a light emission trigger signal that is a trigger for causing the light source 2 to emit light.
- The measuring device 1 causes the light source 2 to emit light at a timing based on the light emission trigger signal, and stores the time t_em indicating the light emission timing.
- the control device 4 sets a pattern at the time of distance measurement for the measuring device 1 in response to an instruction from the outside, for example.
- The control device 4 controls switching of the machine learning model M used by the measuring device 1, based on the RGB image captured by the imaging device 8.
- the measuring device 1 selects, for example, a machine learning model M according to the type of the object to be measured according to the control of the control device 4.
- FIG. 5 is a block diagram showing an example of the configuration of the measuring device 1 according to the embodiment of the present disclosure.
- The measuring device 1 includes a pixel array unit 100, a distance measuring processing unit 101, a pixel control unit 102, an overall control unit 103, a clock generation unit 104, a light emission control unit 105, an interface (I/F) unit 106, and a model storage unit 107.
- the overall control unit 103 controls the overall operation of the measuring device 1 according to, for example, a program incorporated in advance. Further, the overall control unit 103 can also execute control according to an external control signal supplied from the outside.
- the clock generation unit 104 generates one or more clock signals used in the measuring device 1 based on the reference clock signal supplied from the outside.
- the light emission control unit 105 generates a light emission control signal indicating the light emission timing according to the light emission trigger signal supplied from the outside.
- the light emission control signal is supplied to the light source 2 and also to the distance measuring processing unit 101.
- The pixel array unit 100 is a light receiving unit including a plurality of pixels 10 arranged in a two-dimensional lattice.
- the operation of each pixel 10 is controlled by the pixel control unit 102 according to the instruction of the overall control unit 103.
- the pixel control unit 102 controls each pixel 10 so that the output signal of each pixel 10 is read out at once.
- the pixel 10 has, for example, a SPAD (Single Photon Avalanche Diode) element as a light receiving element.
- the distance measuring processing unit 101 measures the distance D to the object to be measured based on the output signal read from each pixel 10.
- the distance measuring processing unit 101 includes a conversion unit 110, a recognition unit 111, a data generation unit 112, and a model switching unit 113.
- the conversion unit 110 converts the output signal supplied from the pixel array unit 100 into digital information.
- Each output signal supplied from the pixel array unit 100 is output corresponding to the timing at which light is received by the light receiving element included in the corresponding pixel 10.
- the conversion unit 110 converts the supplied output signal into time information of a digital value indicating the timing.
- the output signals of all the pixels 10 of the pixel array unit 100 are input to the conversion unit 110.
- the conversion unit 110 converts all the output signals into time information and outputs the time information to the recognition unit 111.
- the recognition unit 111 includes a DNN (Deep Neural Network) 111a, which is an example of a machine learning model.
- The DNN 111a is a multi-layered algorithm modeled on a human cranial neural network, designed by machine learning to recognize, from the time information corresponding to the output signals of all the pixels 10 (hereinafter also referred to as the time information of each pixel 10), the time information corresponding to the distance D to the object to be measured (hereinafter also referred to as the time information of the object to be measured).
- The DNN 111a is an example; the recognition unit 111 may use a model (learner) of any type, such as a regression model like an SVM, as the machine learning model.
- the recognition unit 111 recognizes the time information of the object to be measured by inputting the time information of each pixel 10 input from the conversion unit 110 into the DNN 111a and executing the DNN process. Then, the recognition unit 111 outputs the DNN result output from the DNN 111a to the data generation unit 112 as the recognition result.
- The data generation unit 112 generates output data from the DNN result input from the recognition unit 111 and outputs it to the storage device 7. Specifically, the data generation unit 112 calculates the distance D to the object to be measured based on the time information of the object to be measured recognized by the recognition unit 111, generates the output data, and outputs it to the I/F unit 106.
- the model switching unit 113 switches the machine learning model of the recognition unit 111.
- the model switching unit 113 reads out the model stored in the model storage unit 107 and supplies it to the DNN 111a, for example, based on an instruction from the control device 4.
- the output data generated by the data generation unit 112 is supplied to the I / F unit 106.
- The I/F unit 106 is, for example, a MIPI (Mobile Industry Processor Interface), and outputs the output data to, for example, the storage device 7.
- the I / F unit 106 may output the output data to the control device 4 or an external device (not shown).
- FIG. 6 is a schematic view showing an example of the laminated structure of the measuring device 1 according to the embodiment of the present disclosure.
- The measuring device 1 has a first substrate P1 on which the pixel array unit 100 is arranged, a second substrate P2 on which the conversion unit 110 is arranged, and a third substrate P3 on which the recognition unit 111 is arranged.
- the pixel array unit 100 is arranged on the first substrate P1.
- Alternatively, only the light receiving elements of the pixels 10 of the pixel array unit 100 may be arranged on the first substrate P1, and the other circuit configurations included in the pixels 10 may be arranged on a substrate other than the first substrate P1, such as the second substrate P2.
- a conversion unit 110 is arranged on the second substrate P2.
- The conversion unit 110 has a time-to-digital conversion circuit 110a for each pixel 10; a time-to-digital conversion circuit corresponding to each pixel 10 is arranged on the second substrate P2.
- The time-to-digital conversion circuit (TDC) 110a measures the time at which the output signal from the pixel 10 is supplied, and converts the measured time into time information as a digital value.
- the TDC 110a is a circuit that generates one time information for one pixel 10, and the conversion unit 110 has, for example, the same number of TDC 110a as the number of pixels 10.
- the TDC 110a includes, for example, a counter that counts the time from the irradiation timing when the light source 2 irradiates light to the light reception timing when the pixel 10 receives light.
- the counter starts time measurement (counting) in synchronization with the light emission control signal supplied from the light emission control unit 105.
- the counter ends the time measurement according to the inversion timing of the output signal supplied from the pixel 10.
- the TDC 110a outputs the time information obtained by converting the count number from the start to the end of the time measurement by the counter into a digital value to the recognition unit 111.
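- The TDC behavior described above can be pictured with the toy model below; it is a behavioral sketch only (class name and clock rate are assumptions), not the circuit on the second substrate P2.

```python
class TdcSketch:
    """One TDC 110a as a counter: started by the light emission control
    signal, stopped by the inversion of the pixel 10 output signal."""
    def __init__(self, clock_hz):
        self.clock_hz = clock_hz
        self.count = 0
        self.running = False

    def on_emission(self):            # light emission control signal arrives
        self.count, self.running = 0, True

    def tick(self):                   # one period of the counting clock
        if self.running:
            self.count += 1

    def on_pixel_output(self):        # output-signal inversion from pixel 10
        self.running = False
        return self.count             # digital time information (counts)

tdc = TdcSketch(clock_hz=1e9)         # assumed 1 GHz counting clock
tdc.on_emission()
for _ in range(100):                  # reflected photon detected 100 ticks later
    tdc.tick()
counts = tdc.on_pixel_output()
print(counts, "counts =", counts / tdc.clock_hz, "s")
```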
- the recognition unit 111 is arranged on the third substrate P3.
- a logic circuit other than the recognition unit 111 such as a data generation unit 112 or an overall control unit 103, may be arranged on the third substrate P3.
- logic circuits such as the data generation unit 112, the model switching unit 113, and the overall control unit 103 may be arranged on another board (not shown).
- Time information is supplied from all TDC 110a to the recognition unit 111 arranged on the third substrate P3.
- the recognition unit 111 recognizes the time information of the object to be measured by recognizing the time information of each pixel 10 supplied from all the TDC 110a as an input by the machine learning model.
- In this way, the pixel array unit 100, the conversion unit 110, and the recognition unit 111 are arranged on different substrates (the first to third substrates P1 to P3), and the measuring device 1 measures the light receiving times of all the pixels 10 at once.
- the recognition unit 111 can recognize the time information of the object to be measured based on the output signals of all the pixels 10. Therefore, the measuring device 1 can detect the distance D to the object to be measured without generating the histogram, and the processing and the memory for generating the histogram become unnecessary.
- the present invention is not limited to this.
- the pixel array unit 100, the conversion unit 110, and the recognition unit 111 may be arranged on one substrate.
- the case where the output signals of all the pixels 10 are read at once is shown as an example, but the reading method of the pixels 10 is not limited to this.
- the output signal may be read out for each pixel 10 in a predetermined area.
- the output signal may be read out for each pixel 10, row or column.
- the measuring device 1 may have a memory for storing the output signal (or the time information obtained by converting the output signal into time) read for each pixel 10.
- the recognition unit 111 is input with the time information corresponding to the output signals of all the pixels 10.
- The recognition unit 111 recognizes the time information of the object to be measured by inputting the time information of each pixel 10 into the DNN 111a and executing the DNN process.
- FIG. 7 is a diagram for explaining an outline of the neural network.
- the neural network 40 is composed of three types of layers, an input layer 41, an intermediate layer 42, and an output layer 43, and has a network structure in which nodes included in each layer are connected by a link.
- the circle in FIG. 7 corresponds to a node, and the arrow corresponds to a link.
- In the neural network, operations at the nodes and weighting at the links are performed in order from the input layer 41 to the intermediate layer 42 and from the intermediate layer 42 to the output layer 43, and output data is output from the output layer 43.
- Neural networks having a predetermined number of layers or more are also called DNNs (Deep Neural Networks) or deep learning.
- the neural network 40 shown in FIG. 7 is only an example, and any network configuration may be used as long as a desired function can be realized.
- In FIG. 7, the case where the output layer 43 has one node is shown for simplicity, but, for example, in the case of a classification model, the output layer 43 may have a plurality of nodes (for example, the number of classes to be classified).
- neural networks can approximate arbitrary functions.
- The neural network can learn a network structure that matches the teacher data by using a calculation method such as backpropagation. Therefore, by constructing the model as a neural network, the model is freed from the expressive-ability constraint of being designed within a range that humans can understand.
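- As a minimal concrete instance of such training, the sketch below fits a small input-intermediate-output network to toy teacher data with backpropagation; the architecture and target function are arbitrary assumptions, chosen only to make the mechanism visible.

```python
import torch
import torch.nn as nn

# Input layer -> intermediate layer -> output layer, as in FIG. 7.
net = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.SGD(net.parameters(), lr=0.05)
loss_fn = nn.MSELoss()

x = torch.randn(64, 8)                # toy inputs
teacher = x.sum(dim=1, keepdim=True)  # toy teacher data (any target works)

for _ in range(200):
    optimizer.zero_grad()
    loss = loss_fn(net(x), teacher)
    loss.backward()                   # backpropagation through the links
    optimizer.step()                  # weight update

print(loss.item())                    # loss shrinks as the structure is learned
```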
- the machine learning model used by the recognition unit 111 for recognizing time information is not limited to the DNN 111a, and may be configured by various networks.
- the machine learning model may be a model (learner) of any format, such as a regression model such as SVM (Support Vector Machine).
- the machine learning model may be various regression models such as a non-linear regression model and a linear regression model.
- FIGS. 8 and 9 are diagrams for explaining an example of a machine learning model.
- the light emitted from the light source 2 is reflected by the object to be measured and received by the measuring device 1. At this time, how much reflected light reaches the measuring device 1 changes depending on the type of the object to be measured.
- When the object to be measured is a car, most of the irradiation light is reflected by, for example, a body portion formed of a flat surface and received by the measuring device 1. Therefore, compared with the pixels 10 that receive only ambient light, more pixels 10 receive the reflected light in the vicinity of the light receiving timing t_c corresponding to the distance to the object to be measured.
- That is, the light receiving timing distribution is such that the number of pixels that receive the reflected light increases in a predetermined range including the light receiving timing t_c.
- In FIGS. 8 and 9, the vertical axis shows the frequency for each bin and the horizontal axis shows the bin (time t), representing the relationship between the light receiving timing of each pixel and the number of pixels. That is, the graph shown in FIG. 8 shows at which times many pixels received the reflected light.
- When the object to be measured is a tree, a part of the irradiation light is diffusely reflected on the surface of the tree, so the reflected light reaching the measuring device 1 is less than in the case of a car. Therefore, as shown in FIG. 9, compared with the case where the object to be measured is a car, the number of pixels that receive the reflected light in the vicinity of the light receiving timing t_t corresponding to the distance to the object to be measured decreases.
- In this way, the distribution of the light receiving timing changes not only with the distance D to the object to be measured but also with the type of the object to be measured. Therefore, in the embodiment of the present disclosure, a machine learning model is trained for each type of object to be measured, such as a car, a building, a road, a pedestrian, and a tree. This makes it possible to improve the recognition accuracy of the time information of the object to be measured.
- The machine learning model is constructed in advance by supervised learning, using as teacher data the distribution of the light receiving timings associated with the time information of the object to be measured, for each type of object to be measured. It is assumed that the constructed machine learning models are stored in the model storage unit 107 in advance.
- the recognition unit 111 switches the machine learning model used for recognizing the time information by reading the machine learning model from the model storage unit 107 based on the instruction of the model switching unit 113.
- a machine learning model may be constructed for each scene in the subject space.
- a machine learning model may be constructed for each hour such as day or night, for each weather such as rainy or sunny weather, or for each place such as a country, region, urban area, or mountainous area.
- For example, the ambient light other than the reflected light received by each pixel 10 differs between day and night. In rainy weather, the light receiving timing of the pixels 10 changes due to reflection by raindrops. Road conditions, signs, and the surrounding environment also differ between countries, regions, urban areas, and mountainous areas.
- By constructing a machine learning model for each scene in the subject space, it is possible to improve the recognition accuracy of the time information of the object to be measured in that subject space. Further, a machine learning model may be constructed for each combination of a scene and a type of object to be measured, for example, for recognizing the distance D to a car in an urban area in the daytime.
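- One way to organize such per-scene, per-type models is a simple lookup table, as sketched below; the keys, file names, and loader are hypothetical stand-ins for the model storage unit 107 and model switching unit 113.

```python
# Hypothetical model store: one trained model per (scene, object type).
MODEL_STORE = {
    ("daytime", "urban",    "car"):        "m_car_day_urban.pt",
    ("night",   "urban",    "pedestrian"): "m_ped_night_urban.pt",
    ("rain",    "mountain", "tree"):       "m_tree_rain_mtn.pt",
}

def switch_model(time_of_day, place, target_type):
    """Pick the model trained for this scene/object combination."""
    key = (time_of_day, place, target_type)
    if key not in MODEL_STORE:
        raise KeyError(f"no model trained for {key}")
    return MODEL_STORE[key]  # a real device would load this into DNN 111a

print(switch_model("daytime", "urban", "car"))
```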
- FIG. 10 is a diagram for explaining recognition of time information by the recognition unit 111.
- Hereinafter, one emission of irradiation light by the light source 2 is also referred to as a shot.
- the recognition unit 111 acquires, for example, the time information obtained by converting the output signal from the pixel array unit 100 by the conversion unit 110 for each shot.
- the recognition unit 111 inputs the time information acquired for each shot into the DNN 111a, and acquires the recognition result of the time information corresponding to the distance D of the object to be measured.
- the recognition unit 111 sequentially acquires time information of the distribution as shown in FIG. 10 corresponding to the first to Nth shots by the light source 2.
- The distance D to the object to be measured corresponding to each shot is calculated based on equation (1) described above; the time information t_1 to t_N output from the recognition unit 111 corresponds to t_re − t_em in equation (1).
- The data generation unit 112 may calculate the distance D to the object to be measured for each of the first to N-th shots, or may calculate the distance D once for every plurality of shots. In the latter case, the data generation unit 112 may calculate the distance D using the average value of the time information of the object to be measured over the plurality of shots.
- Alternatively, the data generation unit 112 may determine whether the value of the time information of the object to be measured has converged, and calculate the distance D using the time information of the shot judged to have converged. When the distance D is calculated after the value of the time information has converged, the data generation unit 112 may determine that the recognition by the recognition unit 111 is completed and end the measurement of the distance to the object to be measured.
- The measuring device 1 may also complete the measurement of the distance D in one shot. That is, when the recognition unit 111 performs the DNN process on the time information of each pixel 10 acquired for one shot and recognizes the time information of the object to be measured, and the data generation unit 112 calculates the distance D, the measurement of the distance D by the measuring device 1 may be completed. In this way, by calculating the distance D using the machine learning model, the measuring device 1 can measure the distance D in as little as one shot, and the measurement time of the distance D can be shortened.
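- The per-shot averaging and convergence test described above might look like the following sketch; the tolerance, minimum shot count, and helper name are assumptions for illustration.

```python
import statistics

C = 299_792_458.0  # speed of light [m/s]

def time_until_converged(shot_times, tol=1e-11, min_shots=3):
    """Accumulate per-shot recognized times and stop once the running
    mean settles; a single shot suffices if only one value is given."""
    times, prev_mean = [], None
    for t in shot_times:                   # t: time recognized for one shot
        times.append(t)
        mean = statistics.fmean(times)
        if prev_mean is not None and len(times) >= min_shots \
                and abs(mean - prev_mean) < tol:
            break                          # judged to have converged
        prev_mean = mean
    return mean

t_obj = time_until_converged([100.2e-9, 99.9e-9, 100.0e-9, 100.1e-9])
print("D =", C / 2.0 * t_obj, "m")         # equation (1)
```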
- FIG. 11 is a block diagram showing a configuration example of the control device 4 according to the embodiment of the present disclosure.
- the control device 4 is realized by, for example, an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor. Further, the control device 4 may include a ROM (Read Only Memory) for storing programs to be used, calculation parameters, and the like, and a RAM (Random Access Memory) for temporarily storing parameters and the like that change as appropriate.
- the control device 4 detects an object to be measured, recognizes a scene in the subject space, and controls switching of the machine learning model of the measuring device 1 based on the captured image of the imaging device 8. Specifically, the control device 4 functions as an acquisition unit 401, an extraction unit 402, a scene recognition unit 403, a distance measurement determination unit 406, a model selection unit 404, and a notification unit 405.
- the acquisition unit 401 acquires an RGB image from the image pickup device 8.
- the acquisition unit 401 may acquire information regarding, for example, the imaging time and the imaging location of the RGB image.
- Such information may be acquired from the image pickup apparatus 8, or may be acquired from a sensor such as a GPS sensor (not shown).
- a sensor may be included in the ranging system 6 or may be an external sensor.
- the extraction unit 402 extracts the object to be measured from the RGB image.
- For example, the extraction unit 402 performs image processing such as template matching to extract an object to be measured such as a tree, a vehicle, or a road from the RGB image. Further, the extraction unit 402 may extract, for example, a sky area, a ground area including a road, or the like from the color information of the RGB image. Alternatively, the extraction unit 402 may extract the object to be measured, the sky area, the ground area, or the like using a machine learning model such as a DNN.
- the scene recognition unit 403 recognizes the scene in the subject space from the RGB image. Alternatively, the scene recognition unit 403 may recognize the scene from information regarding the imaging time, the imaging location, and the like of the RGB image. Further, the scene recognition unit 403 may acquire information necessary for scene recognition from an external device or the like via a network (not shown).
- the scene is information indicating the subject space represented by peripheral information such as season, time, weather, or place.
- The scene recognition unit 403 recognizes, for example, whether the subject space is daytime or nighttime from the brightness (luminance) of the RGB image. Further, the scene recognition unit 403 may recognize the scene based on the extraction result of the extraction unit 402. For example, when the extraction unit 402 detects a building, a pedestrian, a car, or the like as the object to be measured, the scene recognition unit 403 recognizes that the subject space is an urban area.
- the scene recognition by the scene recognition unit 403 is not limited to the one based on the RGB image, and the scene may be recognized from information other than the RGB image.
- the scene recognition unit 403 may recognize the season, time, etc. of the subject space from the imaging date and time of the RGB image.
- the scene recognition unit 403 may recognize the location of the subject space from the imaging location of the RGB image.
- the scene recognition unit 403 may recognize the weather in the subject space based on, for example, the detection result of the rain detection sensor and the RGB image. In this way, the scene recognition unit 403 may recognize the scene in the subject space based on the detection results of the plurality of sensors and the like.
- the scene recognition unit 403 may recognize the scene in the subject space by using a machine learning model such as DNN.
- the scene recognition unit 403 may input an RGB image to the DNN, or may input information such as a shooting date and time and a detection result of the rain detection sensor to the DNN in addition to the RGB image.
- the scene recognition unit 403 recognizes the scene in the subject space based on the detection result of the sensor including the RGB image, for example.
- the scene recognition unit 403 recognizes, for example, that the subject space is "in Japan”, “urban area”, “sunny”, and "evening".
- The distance measurement determination unit 406 determines the distance measurement positions (distance measurement points) at which the distance D is measured, based on the extraction result of the extraction unit 402 and the recognition result of the scene recognition unit 403. Further, the distance measurement determination unit 406 determines the values of various parameters related to the light emitting system, such as the irradiation direction and power of the light source 2 and the pulse shape, and the values of various parameters related to the light receiving system, such as the exposure period and the frame rate of the pixel array unit 100.
- FIGS. 12 and 13 are diagrams for explaining the distance measuring position determined by the distance measuring determination unit 406.
- For example, the extraction unit 402 extracts the tree Tr, the road R, the car C1, the house H, the pedestrian Pe, and the sky area SR from the RGB image M1.
- the distance measurement determination unit 406 determines which distance D to be measured in the subject space is to be measured based on the extraction result of the extraction unit 402. Further, the distance measuring determination unit 406 determines at which position of the object to be measured the distance D is measured.
- the distance measuring determination unit 406 determines that, for example, the distance D to the pedestrian Pe is measured at five points indicated by "+" in FIG.
- The number N1 of measurements of the distance D for each object to be measured is, for example, predetermined for each type of object, and the distance measurement determination unit 406 determines the distance measurement positions in the RGB image M1 according to the number N1; in other words, it determines the directions of the light emitted from the light source 2.
- For example, the distance measurement determination unit 406 increases the number of measurements N1 when the object to be measured is a moving object such as the pedestrian Pe or the car C1, and makes N1 smaller for a stationary object such as the tree Tr or the house H than for a moving object (see FIG. 12). Further, a background such as the road R may have a smaller number of measurements N1 than a stationary object, and the distance D need not be measured in areas where there is no object to be measured, such as the sky area SR.
- That is, the distance measurement determination unit 406 determines to measure the distance D to the objects to be measured excluding the sky area SR.
- the distance measurement determination unit 406 determines the direction of the light emitted from the light source 2 (distance measurement position) based on the type of the object to be measured extracted by the extraction unit 402 and the position in the RGB image.
- For example, the distance measurement determination unit 406 determines one position A1 as the measurement position for measuring the distance D to the car C2. This is because, when the car C2 is oriented sideways, the side surface of the car C2 is formed of a substantially flat surface, so the distance D to the car C2 does not change significantly regardless of the position on the side surface. In this way, when the object to be measured has a flat surface, the distance measurement determination unit 406 determines a representative position of that surface as the distance measurement position. As a result, the distance measuring system 6 can reduce the number of times the measuring device 1 measures the distance D to the object to be measured.
- the distance measuring determination unit 406 determines one representative position, but the present invention is not limited to this.
- the distance measuring determination unit 406 may determine a plurality of representative positions.
- Further, the distance measurement determination unit 406 may determine the distance measurement positions according to the scene by using the recognition result of the scene recognition unit 403. For example, when the time zone of the subject space is night, the distance measurement determination unit 406 may set the number of measurements N1 of the distance D to the object to be measured larger than when the time zone is day. That is, the distance measurement determination unit 406 may determine the measurement positions so that the distance D to the object to be measured is measured at more measurement positions at night than in the daytime.
- The model selection unit 404 selects, based on the extraction result of the extraction unit 402 and the recognition result of the scene recognition unit 403, the machine learning model used by the measuring device 1 for recognizing the distance D to the object to be measured, for each distance measurement position determined by the distance measurement determination unit 406.
- For example, when the object to be measured is a pedestrian, the model selection unit 404 selects a machine learning model constructed by supervised learning with pedestrians as correct answer data. In this way, the model selection unit 404 selects the machine learning model according to, for example, the type of the object to be measured extracted by the extraction unit 402.
- Further, the model selection unit 404 may select a machine learning model according to the scene by using the recognition result of the scene recognition unit 403. For example, when the weather in the subject space is rain, the model selection unit 404 selects a machine learning model that suits the scene based on the scene recognition result, such as a machine learning model constructed from measurement data collected in the rain.
- Alternatively, the model selection unit 404 may select a machine learning model based on both the extraction result of the extraction unit 402 and the recognition result of the scene recognition unit 403. For example, keywords are associated with each machine learning model in advance, and the model selection unit 404 selects a machine learning model associated with the same keywords as the extraction result and the recognition result. Specifically, when the extraction result is "car" and the scene recognition result is "daytime" or "rain", the model selection unit 404 matches the extraction result and the scene recognition result against the keywords associated with the plurality of machine learning models, and selects the machine learning model with the highest degree of matching between the results and the keywords.
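- A minimal sketch of this keyword matching follows; the keyword sets and model names are invented for illustration, and a real implementation could weight or learn the matching as described next.

```python
# Keywords associated with each candidate model in advance (illustrative).
MODEL_KEYWORDS = {
    "model_car_day":   {"car", "daytime", "urban"},
    "model_car_rain":  {"car", "rain"},
    "model_ped_night": {"pedestrian", "night", "urban"},
}

def select_model(extraction_result, scene_result):
    """Pick the model whose keywords best match extraction + scene."""
    query = set(extraction_result) | set(scene_result)
    return max(MODEL_KEYWORDS,
               key=lambda name: len(MODEL_KEYWORDS[name] & query))

print(select_model({"car"}, {"rain"}))  # -> model_car_rain
```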
- the model selection unit 404 may select the model using machine learning.
- For example, the model selection unit 404 selects the machine learning model used for measuring the distance D by inputting the extraction result and the scene recognition result into a DNN, which is an example of a machine learning model, and executing the DNN process.
- the notification unit 405 notifies the measuring device 1 of the selection result of the model selection unit 404.
- the notification unit 405 notifies the measuring device 1 of information that specifies, for example, the machine learning model selected by the model selection unit 404.
- When the model selection unit 404 selects the values of parameters related to the light emitting system and the light receiving system, the selected parameter values are notified to the measuring device 1 or the light source 2.
- FIG. 14 is a flowchart showing an example of the procedure of the model selection process according to the embodiment of the present disclosure.
- control device 4 acquires an RGB image from the image pickup device 8 (step S201). Next, the control device 4 extracts the object to be measured by extracting the feature amount from the RGB image (step S202).
- control device 4 recognizes the scene of the RGB image (step S203).
- the control device 4 determines the distance measurement position of the distance D of the object to be measured based on the object to be measured extracted in step S202 and the scene recognized in step S203 (step S204).
- the control device 4 selects a machine learning model for each distance measuring position (step S205), and notifies the measuring device 1 of the determined distance measuring position and the corresponding machine learning model (step S206).
- FIG. 15 is a flowchart showing an example of the procedure of the measurement process according to the embodiment of the present disclosure.
- the measuring device 1 first switches the machine learning model used for recognition of the recognition unit 111 based on the notification of the control device 4 and the distance measuring position (step S101). Subsequently, the measuring device 1 controls the light source 2 to irradiate the light (step S102).
- the measuring device 1 measures the time from when the light source 2 irradiates the light to when the pixel 10 receives the light in all the pixels 10 (step S103).
- the measuring device 1 recognizes the time information of the object to be measured from the time measurement results of all the pixels 10 by using the machine learning model (step S104).
- the measuring device 1 generates data including the distance D to the measured object based on the recognized time information of the measured object (step S105).
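- Putting steps S101 to S105 together, the measurement procedure can be sketched as below; DeviceStub and its methods are invented stand-ins for the units described in the text, not a real device API.

```python
C = 299_792_458.0  # speed of light [m/s]

class DeviceStub:
    """Illustrative stand-in for the measuring device 1 of FIG. 15."""
    def switch_model(self, model_id):               # step S101
        self.model = model_id
    def read_all_pixels(self):                      # step S103 (fake data)
        return [100e-9] * 1024
    def recognize(self, timings):                   # step S104 (stand-in DNN)
        return sum(timings) / len(timings)
    def make_output(self, t_obj):                   # step S105
        return {"model": self.model, "D_m": C / 2.0 * t_obj}

def measurement_process(device, emit, model_id, positions):
    device.switch_model(model_id)                   # S101: switch the model
    for pos in positions:
        emit(pos)                                   # S102: irradiate light
        timings = device.read_all_pixels()          # S103: all-pixel timing
        yield device.make_output(device.recognize(timings))  # S104-S105

for out in measurement_process(DeviceStub(), emit=lambda p: None,
                               model_id="car_day", positions=[(0, 0)]):
    print(out)
```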
- the measuring device 1 includes a plurality of light receiving elements (pixels 10) and a recognition unit 111.
- the plurality of light receiving elements are arranged on the first substrate P1 and output a signal when the light emitted from the light source 2 and reflected by the object to be measured is received.
- The recognition unit 111 is arranged on a second substrate (corresponding to the third substrate P3 of the above embodiment) different from the first substrate P1, and recognizes the distance D to the object to be measured by using a machine learning model based on the output signals of the plurality of light receiving elements.
- Thereby, the measuring device 1 can recognize the time information of the object to be measured based on the output signals of the plurality of light receiving elements, and can reduce the processing time required for measuring the distance D to the object to be measured. Further, since the measuring device 1 does not generate a histogram and does not require a memory for one, the time and effort for parameter adjustment can be reduced compared with measuring the distance D using a histogram, and the accuracy can be improved.
- control device 4 selects the machine learning model based on the RGB image, but the present invention is not limited to this.
- the measuring device 1 may select a machine learning model based on the time information of all the pixels 10.
- the model switching unit 113 of the measuring device 1 selects a machine learning model according to the distribution, and switches the machine learning model used for recognition by the recognition unit 111.
- the model switching unit 113 may acquire the distribution of the time information of the pixels 10 from the conversion unit 110, or may acquire it from the recognition unit 111, for example.
- FIG. 16 is a diagram showing an example of the distribution of time information of the pixel 10.
- Further, the model switching unit 113 of the measuring device 1 may change the values of various parameters related to the light emitting system, such as the irradiation direction and power of the light source 2 and the pulse shape, based on the distribution. For example, when the time information of the pixels 10 has a random distribution as shown in FIG. 16, the model switching unit 113 determines that the distance D to the object to be measured is far and increases the irradiation power, or determines that there is no object to be measured in the irradiation direction and changes the irradiation direction; the values of the various parameters are changed accordingly. The model switching unit 113 outputs the changed parameters to, for example, the light emission control unit 105. Alternatively, the light emission control unit 105 may change the various parameters based on the distribution.
- FIG. 17 is a diagram showing another example of the distribution of time information of the pixel 10.
- also in this case, the model switching unit 113 of the measuring device 1 may change the values of various parameters of the light emitting system, such as the irradiation direction, the irradiation power, and the pulse shape of the light source 2, based on the distribution. The model switching unit 113 outputs the changed parameters to, for example, the light emission control unit 105. Alternatively, the light emission control unit 105 may change the various parameters based on the distribution.
- although the measuring device 1 changes various parameters of the light emitting system here, it may instead change various parameters of the light receiving system, such as the light receiving sensitivity. In this way, by switching the machine learning model based on the time information of the pixels 10 and by changing the various parameters of the light emitting system or the light receiving system, the measuring device 1 can improve the measurement accuracy of the distance D to the object to be measured.
- in the above embodiment, the recognition unit 111 of the measuring device 1 recognizes the time information of the object to be measured, but the present invention is not limited to this.
- for example, the recognition unit 111 of the measuring device 1 may input the time information of each pixel 10 into the DNN 111a to recognize the distance D to the object to be measured directly. In this way, the recognition unit 111 recognizes information related to the distance D to the object to be measured, such as the time information of the object to be measured or the distance D itself.
- in the above embodiment, the control device 4 selects the machine learning model based on the imaging result of the imaging device 8, but the machine learning model may be selected based on the sensing result of another sensor device.
- for example, the control device 4 may select the machine learning model based on the sensing result of an infrared sensor such as a near-infrared (IR) sensor, a short-wave infrared (SWIR) sensor, a mid-wave infrared (MWIR) sensor, or a long-wave infrared (LWIR) sensor.
- alternatively, the control device 4 may select the machine learning model based on the sensing result of a LIDAR (Laser Imaging Detection and Ranging) sensor that measures the distance D using light in a frequency band different from that of the light emitted from the light source 2.
- the control device 4 may also select the machine learning model based on the sensing result of a RADAR (Radio Detection and Ranging) sensor.
- control device 4 may select the machine learning model by combining the sensing results of various sensors including the image pickup device 8 described above.
- the recognition unit 111 may recognize the time information of the object to be measured by using the sensing results of the various sensors described above.
- the outputs of the various sensors described above are input to the recognition unit 111.
- the recognition unit 111 recognizes the time information of the object to be measured by inputting the time information of each pixel 10 and the outputs of the various sensors described above into the DNN 111a and executing the DNN process. In this way, by using the outputs of various sensors, it is possible to improve the recognition accuracy of the time information of the object to be measured by the recognition unit 111.
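- A minimal sketch of such input fusion, under the assumption of a generic vector-input model: the per-pixel time information and the auxiliary sensor outputs are flattened and concatenated into a single input vector. The function name and shapes are illustrative only.

```python
import numpy as np

def fuse_inputs(pixel_times, sensor_outputs):
    """Sketch: build one model input from pixel times plus auxiliary sensor outputs.

    pixel_times    : per-pixel arrival times, shape [num_pixels]
    sensor_outputs : list of arrays from other sensors (IR, LIDAR, RADAR, camera, ...)
    """
    parts = [np.ravel(pixel_times)] + [np.ravel(s) for s in sensor_outputs]
    return np.concatenate(parts)  # single vector fed to the recognition model

# Hypothetical usage with dummy data:
x = fuse_inputs(np.random.rand(1024), [np.random.rand(64), np.random.rand(16)])
```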
- in the above embodiment, the measuring device 1 switches the machine learning model to measure the distance D to the object to be measured, but the present invention is not limited to this.
- the measuring device 1 may measure the distance D to the object to be measured using one machine learning model.
- when the measuring device 1 measures the distance D to a specific object to be measured, such as a car, the measuring device 1 may measure the distance D using, for example, a single machine learning model specialized for cars.
- in the above embodiment, the data generation unit 112 and the model switching unit 113 are provided inside the measuring device 1, but the data generation unit 112 and the model switching unit 113 may instead be provided in an application processor provided outside the distance measuring device.
- similarly, the distance measurement determination unit 406 and the model selection unit 404 are provided inside the control device 4 in the above embodiment, but the measuring device 1 may instead be provided with the distance measurement determination unit 406 and the model selection unit 404.
- in this case, the model selection unit 404 may be provided in the measuring device 1 by including its function in the model switching unit 113.
- as described above, the measuring device 1 according to the embodiment and the modifications includes a plurality of light receiving elements (pixels 10) arranged on the first substrate P1, each outputting a signal when receiving the light emitted from the light source 2 and reflected by the object to be measured, and a recognition unit 111 arranged on a third substrate P3 different from the first substrate P1, which recognizes information about the distance D to the object to be measured by using a machine learning model (DNN 111a) based on the output signals of the plurality of light receiving elements.
- the measuring device 1 can measure the distance D to the object to be measured without generating a histogram, and the processing time required for measuring the distance D can be reduced.
- the measuring device 1 further includes a time measuring unit (conversion unit 110) that measures, based on the output signals of the plurality of light receiving elements, the time from the light emission timing at which the light source 2 emits light to the light reception timing at which each light receiving element receives light, and acquires the measured values. The recognition unit 111 recognizes the information about the distance D by using a machine learning model that takes as input a plurality of measured values corresponding to the plurality of light receiving elements.
- the measuring device 1 can measure the distance D to the object to be measured without generating a histogram, and the processing time required for measuring the distance D can be reduced.
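- Purely as an illustrative, histogram-free sketch: a tiny fully connected network maps the vector of per-pixel measured values directly to one distance estimate. The layer sizes are arbitrary and the random weights stand in for a trained model; nothing here reproduces the actual DNN 111a.

```python
import numpy as np

rng = np.random.default_rng(0)

class TinyDistanceNet:
    """Sketch of a model taking one measured value per light receiving element."""

    def __init__(self, num_pixels, hidden=32):
        # Random weights stand in for a trained machine learning model.
        self.w1 = rng.normal(scale=0.1, size=(num_pixels, hidden))
        self.b1 = np.zeros(hidden)
        self.w2 = rng.normal(scale=0.1, size=(hidden, 1))
        self.b2 = np.zeros(1)

    def __call__(self, times):
        h = np.maximum(times @ self.w1 + self.b1, 0.0)  # ReLU hidden layer
        return float((h @ self.w2 + self.b2)[0])        # scalar distance estimate

net = TinyDistanceNet(num_pixels=1024)
distance_estimate = net(rng.random(1024))
```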
- the recognition unit 111 recognizes the information regarding the distance D based on the measurement value measured by the time measurement unit (conversion unit 110) when the light source 2 emits light once.
- as a result, since the recognition unit 111 recognizes the information about the distance D with a single light emission by the light source 2, the measuring device 1 can reduce the measurement time of the distance D.
- alternatively, when the light source 2 emits light a plurality of times, the recognition unit 111 recognizes the information about the distance D based on the measured values measured by the time measuring unit (conversion unit 110) for each light emission.
- as a result, since the recognition unit 111 recognizes the information about the distance D using a plurality of light emissions by the light source 2, the measuring device 1 can improve the measurement accuracy of the distance D.
- when the recognition unit 111 determines that the information about the distance D has been recognized, it ends the recognition of the information about the distance D. Since the recognition process ends once the information about the distance D is recognized, the measuring device 1 can improve the measurement accuracy of the distance D without performing unnecessary recognition processing.
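- One way to picture the multi-shot behavior with early termination is the loop below; the confidence test is an assumption made for the sketch, not the patent's termination criterion, and all interfaces are hypothetical.

```python
def recognize_over_shots(light_source, time_measuring_unit, recognition_unit,
                         max_shots=16, confidence_threshold=0.95):
    """Sketch: accumulate per-shot measurements, stop once recognition succeeds."""
    result, shots = None, []
    for _ in range(max_shots):
        light_source.emit()                        # one light emission (a "shot")
        shots.append(time_measuring_unit.read())   # measured values for this shot
        result, confidence = recognition_unit.recognize(shots)
        if confidence >= confidence_threshold:     # judged "recognized": stop early
            break
    return result
```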
- the recognition unit 111 recognizes the information about the distance D by using a neural network as the machine learning model.
- the measuring device 1 can improve the measurement accuracy of the distance D.
- in the measuring device 1 according to the embodiment and the modifications, the connection between the first substrate P1 and the third substrate P3 is a connection between copper electrodes. As a result, the degree of freedom in designing the measuring device 1 is improved, and the measuring device 1 can be miniaturized.
- the distance measuring device according to the embodiment and the modifications includes the light source 2 that irradiates the object to be measured with light, a plurality of light receiving elements (pixels 10) arranged on the first substrate P1, each outputting a signal when receiving the light reflected from the object to be measured, and a recognition unit 111 arranged on a third substrate P3 different from the first substrate P1, which recognizes information about the distance D to the object to be measured by using a machine learning model (DNN 111a) based on the output signals of the plurality of light receiving elements. As a result, the distance measuring device can measure the distance D to the object to be measured without generating a histogram, and the processing time required for measuring the distance D can be reduced.
- FIG. 18 is a diagram showing usage examples of the measuring device 1 to which the above-described embodiment and modifications can be applied.
- the measuring device 1 described above can be used in various situations for sensing light, such as visible light, infrared light, ultraviolet light, and X-rays, as described below.
- Devices that capture images for viewing, such as digital cameras and mobile devices with camera functions.
- Devices used for traffic, such as in-vehicle sensors that photograph the front, rear, surroundings, and interior of a vehicle for safe driving (for example, automatic stopping) and for recognizing the driver's state, surveillance cameras that monitor traveling vehicles and roads, and distance measuring sensors that measure inter-vehicle distances.
- Devices used in home appliances such as TVs, refrigerators, and air conditioners that photograph a user's gesture and operate the appliance according to that gesture.
- Devices used for medical care and healthcare, such as endoscopes and devices that perform angiography by receiving infrared light.
- Devices used for security, such as surveillance cameras for crime prevention and cameras for personal authentication.
- Devices used for beauty care, such as skin measuring instruments that photograph the skin and microscopes that photograph the scalp.
- Devices used for sports, such as action cameras and wearable cameras for sports applications.
- Devices used for agriculture, such as cameras for monitoring the condition of fields and crops.
- the technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any kind of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
- FIG. 19 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile control system to which the technique according to the present disclosure can be applied.
- the vehicle control system 12000 includes a plurality of electronic control units connected via the communication network 12001.
- the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside information detection unit 12030, an in-vehicle information detection unit 12040, and an integrated control unit 12050.
- a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I / F (Interface) 12053 are shown as a functional configuration of the integrated control unit 12050.
- the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
- for example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
- the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
- the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as headlamps, back lamps, brake lamps, blinkers or fog lamps.
- radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020. The body system control unit 12020 receives these radio waves or signals and controls the door lock device, the power window device, the lamps, and the like of the vehicle.
- the vehicle outside information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
- the image pickup unit 12031 is connected to the vehicle exterior information detection unit 12030.
- the vehicle outside information detection unit 12030 causes the image pickup unit 12031 to capture an image of the outside of the vehicle and receives the captured image.
- the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for persons, vehicles, obstacles, signs, characters on the road surface, and the like based on the received image.
- the image pickup unit 12031 is an optical sensor that receives light and outputs an electric signal according to the amount of the light received.
- the imaging unit 12031 can output an electric signal as an image or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
- the in-vehicle information detection unit 12040 detects the in-vehicle information.
- a driver state detection unit 12041 that detects the driver's state is connected to the in-vehicle information detection unit 12040.
- the driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
- the microcomputer 12051 can calculate control target values for the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and can output control commands to the drive system control unit 12010.
- for example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including collision avoidance or impact mitigation of the vehicle, follow-up driving based on the inter-vehicle distance, vehicle speed maintenance driving, vehicle collision warning, vehicle lane departure warning, and the like.
- further, the microcomputer 12051 can perform cooperative control for the purpose of automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040.
- the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
- for example, the microcomputer 12051 can perform cooperative control for the purpose of preventing glare, such as switching from high beam to low beam, by controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.
- the audio image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the passengers of the vehicle or the outside of the vehicle of information.
- an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
- the display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
- FIG. 20 is a diagram showing an example of the installation position of the imaging unit 12031.
- the imaging unit 12031 includes imaging units 12101, 12102, 12103, 12104, and 12105.
- the imaging units 12101, 12102, 12103, 12104, 12105 are provided at positions such as, for example, the front nose, side mirrors, rear bumpers, back doors, and the upper part of the windshield in the vehicle interior of the vehicle 12100.
- the imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
- the imaging units 12102 and 12103 provided in the side mirrors mainly acquire images of the side of the vehicle 12100.
- the imaging unit 12104 provided on the rear bumper or the back door mainly acquires an image of the rear of the vehicle 12100.
- the imaging unit 12105 provided on the upper part of the windshield in the vehicle interior is mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
- FIG. 20 shows an example of the photographing range of the imaging units 12101 to 12104.
- the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, a bird's-eye view image of the vehicle 12100 as viewed from above can be obtained.
- At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
- at least one of the image pickup units 12101 to 12104 may be a stereo camera composed of a plurality of image pickup elements, or may be an image pickup element having pixels for phase difference detection.
- the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100) based on the distance information obtained from the imaging units 12101 to 12104, and can thereby extract, as a preceding vehicle, the closest three-dimensional object on the traveling path of the vehicle 12100 that is traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100.
- further, the microcomputer 12051 can set in advance the inter-vehicle distance to be secured from the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automated driving, in which the vehicle travels autonomously without depending on the driver's operation.
- for example, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects based on the distance information obtained from the imaging units 12101 to 12104, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. The microcomputer 12051 then determines the collision risk, which indicates the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can provide driving support for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
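- The decision described above might be pictured with the following fragment; the risk score is a crude placeholder, not the vehicle control system's actual algorithm, and the obstacle attributes are assumed for the sketch.

```python
def collision_support(obstacles, risk_threshold=0.8):
    """Sketch: warn and decelerate when the collision risk exceeds a set value."""
    actions = []
    for obstacle in obstacles:
        # Placeholder risk: closer, faster-approaching obstacles score higher.
        risk = max(0.0, min(1.0, obstacle.closing_speed / max(obstacle.distance, 0.1)))
        if risk >= risk_threshold:
            actions.append(("warn_driver", obstacle))          # speaker 12061 / display 12062
            actions.append(("forced_deceleration", obstacle))  # via drive system control
    return actions
```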
- At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
- the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured image of the imaging units 12101 to 12104.
- such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether or not it is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so as to superimpose and display a square contour line for emphasis on the recognized pedestrian. Further, the audio image output unit 12052 may control the display unit 12062 so as to display an icon or the like indicating a pedestrian at a desired position.
- the above is an example of a vehicle control system to which the technology according to the present disclosure can be applied.
- the technique according to the present disclosure can be applied to the imaging unit 12031 among the configurations described above.
- specifically, the distance measuring system 6 of FIG. 4 can be applied to the imaging unit 12031.
- the present technology can also have the following configurations.
(1) A measuring device comprising:
a plurality of light receiving elements arranged on a first substrate, each outputting a signal when receiving light emitted from a light source and reflected by an object to be measured; and
a recognition unit arranged on a second substrate different from the first substrate, configured to recognize information about the distance to the object to be measured by using a machine learning model based on output signals of the plurality of light receiving elements.
(2) The measuring device according to (1), further comprising a time measuring unit configured to measure, based on the output signals of the plurality of light receiving elements, the time from a light emission timing at which the light source emits light to a light reception timing at which the light receiving element receives light, and to acquire measured values, wherein
the recognition unit recognizes the information about the distance by using the machine learning model that takes as input a plurality of the measured values corresponding to the plurality of light receiving elements.
(3) The measuring device according to (2), wherein the recognition unit recognizes the information about the distance based on the measured value measured by the time measuring unit when the light source emits light once.
(4) The measuring device according to (2), wherein, when the light source emits light a plurality of times, the recognition unit recognizes the information about the distance based on the measured values measured by the time measuring unit for each light emission.
(5) The measuring device according to (4), wherein the recognition unit ends the recognition of the information about the distance when it determines that the information about the distance has been recognized.
(6) The measuring device according to any one of (1) to (5), wherein the recognition unit recognizes the information about the distance by using a neural network as the machine learning model.
(7) The measuring device according to any one of (1) to (6), wherein the connection between the first substrate and the second substrate is a connection between copper electrodes.
(8) A distance measuring device comprising:
a light source that irradiates an object to be measured with light;
a plurality of light receiving elements arranged on a first substrate, each outputting a signal when receiving the light reflected from the object to be measured; and
a recognition unit arranged on a second substrate different from the first substrate, configured to recognize information about the distance to the object to be measured by using a machine learning model based on output signals of the plurality of light receiving elements.
- 1 Measuring device
- 2 Light source
- 4 Control device
- 6 Distance measuring system
- 8 Imaging device
- 10 Pixel
- 100 Pixel array unit
- 107 Model storage unit
- 110 Conversion unit
- 111 Recognition unit
- 112 Data generation unit
- 113 Model switching unit
- 401 Acquisition unit
- 402 Extraction unit
- 403 Scene recognition unit
- 404 Model selection unit
- 405 Notification unit
- 406 Distance measurement determination unit
Description
Prior to the description of the embodiments of the present disclosure, in order to facilitate understanding, a technique for performing distance measurement using a histogram will first be described as one distance measuring method. As the distance measuring technique in this case, the direct ToF (direct Time of Flight) method is applied. The direct ToF method is a technique in which reflected light, that is, light emitted from a light source and reflected by the object to be measured, is received by a light receiving element, and the distance is measured based on the time difference between the light emission timing and the light reception timing.
D = (c/2) × (t_re − t_em) … (1), where c is the speed of light, t_em is the light emission timing, and t_re is the light reception timing.
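A numerical sketch of equation (1) only, with times in seconds:

```python
C = 299_792_458.0  # speed of light [m/s]

def direct_tof_distance(t_em, t_re):
    """Equation (1): D = (c / 2) * (t_re - t_em)."""
    return (C / 2.0) * (t_re - t_em)

# Example: an echo received 66.7 ns after emission lies roughly 10 m away.
print(direct_tof_distance(0.0, 66.7e-9))  # ~10.0 (meters)
```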
[Outline of the distance measuring method]
FIG. 3 is a diagram for explaining the outline of the distance measuring method according to the embodiment of the present disclosure. The distance measuring method shown in FIG. 3 is executed by the measuring device 1 (not shown in FIG. 3). The measuring device 1 has a light receiving unit 302 that has a plurality of pixels (light receiving elements).
FIG. 4 is a block diagram showing an example of the configuration of the distance measuring system according to the embodiment of the present disclosure. The distance measuring system 6 shown in FIG. 4 includes the measuring device 1, the light source 2, a storage device 7, the control device 4, an optical system 5, and the imaging device 8.
FIG. 5 is a block diagram showing an example of the configuration of the measuring device 1 according to the embodiment of the present disclosure. As shown in FIG. 5, the measuring device 1 includes a pixel array unit 100, a distance measurement processing unit 101, a pixel control unit 102, an overall control unit 103, a clock generation unit 104, a light emission control unit 105, an interface (I/F) unit 106, and a model storage unit 107.
Here, the laminated structure of the measuring device 1 will be described schematically with reference to FIG. 6. FIG. 6 is a schematic diagram showing an example of the laminated structure of the measuring device 1 according to the embodiment of the present disclosure. As shown in FIG. 6, the measuring device 1 has a first substrate P1 on which the pixel array unit 100 is arranged, a second substrate P2 on which the conversion unit 110 is arranged, and a third substrate P3 on which the recognition unit 111 is arranged.
Next, the machine learning model used by the recognition unit 111 to recognize the distance D will be described with reference to FIGS. 7 to 9. As described above, time information corresponding to the output signals of all the pixels 10 is input to the recognition unit 111. The recognition unit 111 recognizes the time information of the object to be measured by inputting the time information of each pixel 10 into the DNN 111a and executing DNN processing.
Next, the recognition of the time information of the object to be measured by the recognition unit 111 will be described with reference to FIG. 10. FIG. 10 is a diagram for explaining the recognition of time information by the recognition unit 111. Hereinafter, each irradiation of light by the light source 2 is also referred to as a shot.
Subsequently, the details of the control device 4 will be described with reference to FIG. 11. The control device 4 controls the distance measuring system and also controls the switching of the machine learning model by the model switching unit 113. Here, mainly the switching control of the machine learning model by the control device 4 will be described. FIG. 11 is a block diagram showing a configuration example of the control device 4 according to the embodiment of the present disclosure.
Next, an example of the distance measurement processing procedure executed by the distance measuring system 6 will be described. First, the model selection processing by the control device 4 will be described, and then the measurement processing procedure by the measuring device 1 will be described.
FIG. 14 is a flowchart showing an example of the procedure of the model selection processing according to the embodiment of the present disclosure.
FIG. 15 is a flowchart showing an example of the procedure of the measurement processing according to the embodiment of the present disclosure.
Next, application examples of the embodiment and the modifications of the present disclosure will be described with reference to FIG. 18, which shows usage examples of the measuring device 1 described above.
Claims (8)
- A measuring device comprising:
a plurality of light receiving elements arranged on a first substrate, each outputting a signal when receiving light emitted from a light source and reflected by an object to be measured; and
a recognition unit arranged on a second substrate different from the first substrate, configured to recognize information about the distance to the object to be measured by using a machine learning model based on output signals of the plurality of light receiving elements.
- The measuring device according to claim 1, further comprising a time measuring unit configured to measure, based on the output signals of the plurality of light receiving elements, the time from a light emission timing at which the light source emits light to a light reception timing at which the light receiving element receives light, and to acquire measured values, wherein
the recognition unit recognizes the information about the distance by using the machine learning model that takes as input a plurality of the measured values corresponding to the plurality of light receiving elements.
- The measuring device according to claim 2, wherein the recognition unit recognizes the information about the distance based on the measured value measured by the time measuring unit when the light source emits light once.
- The measuring device according to claim 2, wherein, when the light source emits light a plurality of times, the recognition unit recognizes the information about the distance based on the measured values measured by the time measuring unit for each light emission.
- The measuring device according to claim 4, wherein the recognition unit ends the recognition of the information about the distance when it determines that the information about the distance has been recognized.
- The measuring device according to claim 5, wherein the recognition unit recognizes the information about the distance by using a neural network as the machine learning model.
- The measuring device according to claim 6, wherein the connection between the first substrate and the second substrate is a connection between copper electrodes.
- A distance measuring device comprising:
a light source that irradiates an object to be measured with light;
a plurality of light receiving elements arranged on a first substrate, each outputting a signal when receiving the light reflected from the object to be measured; and
a recognition unit arranged on a second substrate different from the first substrate, configured to recognize information about the distance to the object to be measured by using a machine learning model based on output signals of the plurality of light receiving elements.
Priority Applications (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE112020003847.5T DE112020003847T5 (de) | 2019-08-13 | 2020-08-04 | Messungsvorrichtung und entfernungsmessungsvorrichtung |
| JP2021539218A JP7614096B2 (ja) | 2019-08-13 | 2020-08-04 | 測定装置および測距装置 |
| US17/624,844 US20220268890A1 (en) | 2019-08-13 | 2020-08-04 | Measuring device and distance measuring device |
| CN202080055841.6A CN114207472A (zh) | 2019-08-13 | 2020-08-04 | 测量装置和测距装置 |
| KR1020227003510A KR20220043125A (ko) | 2019-08-13 | 2020-08-04 | 측정 장치 및 측거 장치 |
| EP20851965.2A EP4015992A4 (en) | 2019-08-13 | 2020-08-04 | MEASURING DEVICE AND DISTANCE MEASURING DEVICE |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019-148675 | 2019-08-13 | ||
| JP2019148675 | 2019-08-13 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2021029270A1 true WO2021029270A1 (ja) | 2021-02-18 |
Family
ID=74569568
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2020/029766 Ceased WO2021029270A1 (ja) | 2019-08-13 | 2020-08-04 | 測定装置および測距装置 |
Country Status (8)
| Country | Link |
|---|---|
| US (1) | US20220268890A1 (ja) |
| EP (1) | EP4015992A4 (ja) |
| JP (1) | JP7614096B2 (ja) |
| KR (1) | KR20220043125A (ja) |
| CN (1) | CN114207472A (ja) |
| DE (1) | DE112020003847T5 (ja) |
| TW (3) | TWI839653B (ja) |
| WO (1) | WO2021029270A1 (ja) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12260610B2 (en) * | 2022-03-24 | 2025-03-25 | Objectvideo Labs, Llc | Dual descriptor data for object recognition in low light conditions |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2016176750A (ja) | 2015-03-19 | 2016-10-06 | 株式会社豊田中央研究所 | 光学的測距装置 |
| WO2017057056A1 (ja) * | 2015-09-30 | 2017-04-06 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
| WO2019044487A1 (ja) * | 2017-08-28 | 2019-03-07 | ソニーセミコンダクタソリューションズ株式会社 | 測距装置、及び、測距方法 |
Family Cites Families (20)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7983817B2 (en) * | 1995-06-07 | 2011-07-19 | Automotive Technologies Internatinoal, Inc. | Method and arrangement for obtaining information about vehicle occupants |
| DE4433772A1 (de) * | 1994-09-22 | 1996-03-28 | Micro Epsilon Messtechnik | Sensoranordnung und Verfahren zur Meßwerterfassung mit der Sensoranordnung |
| JP2002286844A (ja) * | 2001-03-28 | 2002-10-03 | Denso Corp | 距離測定装置 |
| JP4882592B2 (ja) * | 2006-08-15 | 2012-02-22 | 日産自動車株式会社 | 光抽出装置、光抽出方法及び距離計測システム |
| JP5266709B2 (ja) | 2007-10-19 | 2013-08-21 | 日産自動車株式会社 | 距離計測装置、距離計測方法および車両 |
| US8804101B2 (en) * | 2012-03-16 | 2014-08-12 | Advanced Scientific Concepts, Inc. | Personal LADAR sensor |
| TWI599757B (zh) * | 2012-12-19 | 2017-09-21 | 巴斯夫歐洲公司 | 用於偵測至少一物件之偵測器、其用途、用於偵測至少一物件之方法、人機介面、娛樂器件、追蹤系統及攝影機 |
| US10229502B2 (en) * | 2016-02-03 | 2019-03-12 | Microsoft Technology Licensing, Llc | Temporal time-of-flight |
| US9760837B1 (en) * | 2016-03-13 | 2017-09-12 | Microsoft Technology Licensing, Llc | Depth from time-of-flight using machine learning |
| JP2018031607A (ja) * | 2016-08-23 | 2018-03-01 | ソニーセミコンダクタソリューションズ株式会社 | 測距装置、電子装置、および、測距装置の制御方法 |
| CN109804266B (zh) * | 2016-11-30 | 2023-09-19 | 索尼半导体解决方案公司 | 测距装置及测距方法 |
| JP2018129374A (ja) * | 2017-02-07 | 2018-08-16 | ソニーセミコンダクタソリューションズ株式会社 | 半導体装置および半導体装置の製造方法 |
| JP2018174246A (ja) * | 2017-03-31 | 2018-11-08 | ソニーセミコンダクタソリューションズ株式会社 | 半導体装置および電子機器 |
| WO2019139656A1 (en) * | 2017-10-13 | 2019-07-18 | The Regents Of The University Of Michigan | Material-sensing light imaging, detection, and ranging (lidar) systems |
| US20190187253A1 (en) * | 2017-12-14 | 2019-06-20 | Vathys, Inc. | Systems and methods for improving lidar output |
| US20190212445A1 (en) * | 2018-01-10 | 2019-07-11 | Integrated Device Technology, Inc. | Laser distance sensing using prior measurement information |
| US11507087B2 (en) * | 2018-11-07 | 2022-11-22 | Gm Cruise Holdings Llc | Distributed integrated sensing and communication module |
| CN109256042A (zh) * | 2018-11-22 | 2019-01-22 | 京东方科技集团股份有限公司 | 显示面板、电子设备及人眼追踪方法 |
| WO2020118279A1 (en) * | 2018-12-06 | 2020-06-11 | Finisar Corporation | Optoelectronic assembly |
| JP2022020871A (ja) * | 2018-12-06 | 2022-02-02 | パナソニックIpマネジメント株式会社 | 物体認識装置、物体認識方法、およびプログラム |
- 2020
- 2020-08-04 CN CN202080055841.6A patent/CN114207472A/zh active Pending
- 2020-08-04 DE DE112020003847.5T patent/DE112020003847T5/de active Pending
- 2020-08-04 TW TW110140033A patent/TWI839653B/zh active
- 2020-08-04 US US17/624,844 patent/US20220268890A1/en not_active Abandoned
- 2020-08-04 WO PCT/JP2020/029766 patent/WO2021029270A1/ja not_active Ceased
- 2020-08-04 KR KR1020227003510A patent/KR20220043125A/ko not_active Withdrawn
- 2020-08-04 EP EP20851965.2A patent/EP4015992A4/en not_active Withdrawn
- 2020-08-04 TW TW110138848A patent/TWI839646B/zh active
- 2020-08-04 TW TW109126286A patent/TWI748588B/zh not_active IP Right Cessation
- 2020-08-04 JP JP2021539218A patent/JP7614096B2/ja active Active
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2016176750A (ja) | 2015-03-19 | 2016-10-06 | 株式会社豊田中央研究所 | 光学的測距装置 |
| WO2017057056A1 (ja) * | 2015-09-30 | 2017-04-06 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
| WO2019044487A1 (ja) * | 2017-08-28 | 2019-03-07 | ソニーセミコンダクタソリューションズ株式会社 | 測距装置、及び、測距方法 |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP4015992A4 |
Also Published As
| Publication number | Publication date |
|---|---|
| EP4015992A4 (en) | 2022-10-12 |
| JP7614096B2 (ja) | 2025-01-15 |
| US20220268890A1 (en) | 2022-08-25 |
| JPWO2021029270A1 (ja) | 2021-02-18 |
| KR20220043125A (ko) | 2022-04-05 |
| TW202111351A (zh) | 2021-03-16 |
| TW202208877A (zh) | 2022-03-01 |
| TWI839646B (zh) | 2024-04-21 |
| TWI839653B (zh) | 2024-04-21 |
| TW202204931A (zh) | 2022-02-01 |
| TWI748588B (zh) | 2021-12-01 |
| DE112020003847T5 (de) | 2022-05-19 |
| EP4015992A1 (en) | 2022-06-22 |
| CN114207472A (zh) | 2022-03-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2021029262A1 (ja) | 装置、測定装置、測距システムおよび方法 | |
| US20050271254A1 (en) | Adaptive template object classification system with a template generator | |
| JP2020139810A (ja) | 測定装置および測距装置 | |
| TWI732425B (zh) | 受光裝置及測距裝置 | |
| US12259471B2 (en) | Measurement apparatus, distance measurement apparatus, and measurement method | |
| WO2020255855A1 (ja) | 測距装置および測距方法 | |
| WO2020255770A1 (ja) | 測距装置、測距方法、および、測距システム | |
| WO2021010174A1 (ja) | 受光装置、および、受光装置の駆動方法 | |
| WO2021065495A1 (ja) | 測距センサ、信号処理方法、および、測距モジュール | |
| WO2021065494A1 (ja) | 測距センサ、信号処理方法、および、測距モジュール | |
| US20220128690A1 (en) | Light receiving device, histogram generating method, and distance measuring system | |
| JP7614096B2 (ja) | 測定装置および測距装置 | |
| US20220075029A1 (en) | Measuring device, distance measuring device and measuring method | |
| US12366641B2 (en) | Ranging apparatus and measuring apparatus | |
| WO2021065500A1 (ja) | 測距センサ、信号処理方法、および、測距モジュール | |
| WO2020153262A1 (ja) | 計測装置および測距装置 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20851965 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2021539218 Country of ref document: JP Kind code of ref document: A |
|
| ENP | Entry into the national phase |
Ref document number: 2020851965 Country of ref document: EP Effective date: 20220314 |
|
| WWW | Wipo information: withdrawn in national office |
Ref document number: 1020227003510 Country of ref document: KR |
|
| WWW | Wipo information: withdrawn in national office |
Ref document number: 2020851965 Country of ref document: EP |