
CN114814878A - Depth data measuring head, measuring equipment, control system and corresponding method - Google Patents


Info

Publication number
CN114814878A
CN114814878A (application CN202110064415.5A); granted publication CN114814878B
Authority
CN
China
Prior art keywords
depth data
light
light source
source module
sensor
Prior art date
Legal status
Granted
Application number
CN202110064415.5A
Other languages
Chinese (zh)
Other versions
CN114814878B
Inventor
王敏捷
梁雨时
Current Assignee
Shanghai Tuyang Information Technology Co ltd
Original Assignee
Shanghai Tuyang Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Tuyang Information Technology Co ltd filed Critical Shanghai Tuyang Information Technology Co ltd
Priority to CN202110064415.5A
Publication of CN114814878A
Application granted
Publication of CN114814878B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/8943D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4811Constructional features, e.g. arrangements of optical elements common to transmitter and receiver
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/495Counter-measures or counter-counter-measures using electronic or electro-optical means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/04Programme control other than numerical control, i.e. in sequence controllers or logic controllers

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract



Disclosed are a depth data measuring head, a measuring device, a control system, and corresponding methods. The measuring head includes: a light source module for projecting laser pulses onto the measured space region by region; a time-of-flight (ToF) sensor for receiving, region by region, the light returned from the measured space and generating a sensing signal that characterizes the time of flight of the light, from which the distance of the photographed object in the measured space is calculated; and a controller for controlling the light source module and the ToF sensor to each operate using corresponding partitions. The depth data measurement scheme of the present invention uses ToF sensors, in particular rolling-shutter Geiger-mode sensors, to achieve high-precision depth data measurement based on direct time of flight. The scheme further combines a light-emitting module that works in zones, reducing power consumption while ensuring normal exposure of the corresponding sensing region.


Description

Depth data measuring head, measuring equipment, control system and corresponding method
Technical Field
The present invention relates to depth imaging, and more particularly, to a depth data measuring head, a measuring device, a control system, and a corresponding method.
Background
In recent years, three-dimensional imaging technology has developed rapidly. Compared with conventional two-dimensional image acquisition, depth data measurement represents a significant technical advance.
Currently, structured-light-based depth measurement schemes are capable of three-dimensional measurement of an object's surface in real time. Briefly, such a scheme first projects a two-dimensional laser texture pattern carrying encoded information, such as a discretized speckle pattern, onto the surface of the object. Monocular or binocular image acquisition is then performed to calculate depth information, based either on a comparison with a reference image or on the disparity between binocular images.
Active structured-light projection is widely applied in fields where the capture device is relatively fixed, such as security, face recognition in smart devices, and factory quality inspection. However, when the capture device is mounted on a moving platform (such as a vehicle or another motion scenario), existing structured-light schemes suffer from a limited ranging range and are unsuitable for high-speed movement.
For this reason, a depth data measurement scheme suitable for a moving scene is required.
Disclosure of Invention
The technical problem to be solved by the present disclosure is to provide a depth data measurement scheme that is particularly suitable for application scenarios in which the measurement device itself is in motion. Specifically, a depth data measuring head of the present invention includes a light source module emitting laser pulses in divided regions, and a ToF sensor, in particular a dToF (direct time-of-flight) sensor, receiving return light in corresponding divided regions, so as to acquire sensing information region by region, in particular column by column, to match existing computational limitations and reduce power consumption.
According to a first aspect of the present disclosure, there is provided a depth data measurement head comprising: a light source module for projecting laser pulses to a measured space region by region; a time-of-flight (ToF) sensor for receiving the return light of the measured space region by region and generating a sensing signal characterizing the time of flight of the light, used to calculate the distance to a subject within the measured space; and a controller for controlling the light source module and the ToF sensor to each operate using corresponding partitions.
Optionally, the light source module and the ToF sensor each operating using a corresponding partition includes: the measured space region irradiated by the laser pulses projected by the corresponding partition of the light source module covers the measured space region from which the corresponding partition of the ToF sensor receives the return light.
Optionally, the light source module includes a plurality of light-emitting regions and the ToF sensor includes a Geiger-mode-based sensing array, and controlling the light source module and the ToF sensor to each operate using a corresponding partition includes: the sensor controls the plurality of light-emitting regions to be lit in turn, and controls the one or more columns in the sensing array corresponding to the lit measured spatial region to receive the return light.
Optionally, the plurality of light-emitting regions includes N light-emitting regions, each corresponding to M columns in the sensing array, where M×N equals the total number of columns in the sensing array; the sensor controlling the plurality of light-emitting regions to be lit in turn, and controlling the one or more columns corresponding to the lit measured spatial region to receive return light, includes: the sensor controls each light-emitting region to emit M light pulses, and the M columns in the sensing array corresponding to that region receive the return light one by one.
Optionally, the plurality of light-emitting regions each comprise a plurality of VCSEL (vertical cavity surface emitting laser) cells. Optionally, the light source module includes a VCSEL chip composed of multiple groups of VCSEL units that can be lit by zones.
Optionally, the light source module further includes: a diffusion sheet arranged in the exit direction of the VCSEL chip; and/or a power detection element for detecting whether the VCSEL chip is operating normally.
Optionally, each group of VCSEL cells includes at least two sub-groups of VCSEL cells, and the controller can control any one sub-group within the group to emit light pulses simultaneously.
Optionally, the ToF sensor comprises a silicon photomultiplier (SiPM) that receives the return light column by column.
According to a second aspect of the present disclosure, there is provided a depth data measuring apparatus comprising: the depth data measuring head of the first aspect; and a processor configured to: perform distance calculation based on the sensing signals generated by the ToF sensor partitions; and synthesize the calculation results for the plurality of ToF sensor partitions into depth data of the measured space and output the depth data.
Optionally, the processor integrates the functionality of the controller.
According to a third aspect of the present disclosure, there is provided a motion device control system including: the depth data measuring apparatus according to the second aspect, for collecting return light of a measured space during movement of the motion device and generating a depth data output; a control unit for generating a control signal based on the depth data output; and an action unit for changing or maintaining the motion of the motion device based on the control signal.
According to a fourth aspect of the present disclosure, there is provided a depth data measuring method, including:
controlling a light source module to project laser pulses to a measured space region by region; and controlling a time-of-flight (ToF) sensor to receive the return light of the measured space region by region and generate a sensing signal, wherein the light source module and the ToF sensor are each controlled to operate using a corresponding partition, and the sensing signal characterizes the time of flight of the light to calculate the distance to a subject within the measured space.
Optionally, the ToF sensor is a Geiger-mode-based sensing array with M×N columns and the light source module is arranged as N light-emitting regions, wherein controlling the light source module to project laser pulses region by region includes: controlling each of the N light-emitting regions to emit M pulses one by one; and controlling the ToF sensor to receive the return light region by region and generate sensing signals includes: controlling the ToF sensor to receive the M×N pulses column by column and generate sensing signals column by column.
Optionally, the method further includes: performing distance calculation based on the sensing signals generated by the ToF sensor partitions; and synthesizing the calculation results for the plurality of ToF sensor partitions into depth data of the measured space and outputting the depth data.
The ToF sensor and the light source module may belong to a depth data measuring device mounted on a moving apparatus. To this end, the method may further include: generating a control signal based on the depth data output; and changing or maintaining the motion of the moving apparatus based on the control signal.
Thus, the depth data measurement scheme of the present invention utilizes ToF sensors, in particular SiPMs operating in rolling-shutter mode, to achieve high-precision depth data measurement based on direct time of flight. The scheme further combines the light-emitting module working in partitions, ensuring normal exposure of the corresponding sensing area while reducing power consumption.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing in greater detail exemplary embodiments thereof with reference to the attached drawings, in which like reference numerals generally represent like parts throughout.
FIG. 1 shows a schematic composition of a depth data measurement head according to one embodiment of the present disclosure.
FIG. 2 shows a schematic view of a light source module working partition encompassing a ToF sensor working partition.
Fig. 3 shows an example of a VCSEL chip including a plurality of light emitting regions.
Figure 4 shows an example of a VCSEL chip including sub-packets within a partition.
Fig. 5 shows a composition example of the light source module.
FIG. 6 illustrates an exemplary flow diagram of a depth data measurement method according to one embodiment of the invention.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Acquiring the third dimension of three-dimensional data requires a depth sensor. Depth sensors are optical sensors that employ a pixel array to obtain a high-resolution depth profile of an entire scene. The mechanisms commonly used to measure depth fall into three types: structured light, binocular vision, and ToF (time of flight). Structured light and binocular techniques (including their combinations) estimate depth indirectly from geometric principles, whereas ToF measures the time of flight between emitted and reflected light and estimates depth directly via the speed of light.
Further, although the target distance is estimated directly from the time of flight between emitted and reflected light, unlike conventional line-scanning radar, which can obtain only a low-density point cloud, the ToF sensor used in the present invention obtains a high-resolution distance/depth distribution of the measured scene through a high-density ToF sensing pixel array.
ToF can be divided into two categories: iToF (indirect time of flight) and dToF (direct time of flight). The principle of dToF is to directly emit a light pulse and then measure the time interval between the reflected pulse and the emitted pulse, which yields the time of flight of the light. The principle of iToF is more complex: instead of a single light pulse, modulated light is emitted. A phase difference exists between the received reflected modulated light and the emitted modulated light; measuring this phase difference gives the time of flight, from which the distance is estimated.
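As a quick illustration of the dToF principle just described, the following sketch (illustrative only, not part of the patent) converts a measured pulse round-trip interval into a distance:

```python
# Illustrative helper: distance implied by a dToF round-trip interval.
# d = c * t / 2, since the pulse travels to the target and back.
C = 299_792_458.0  # speed of light in m/s

def dtof_distance(round_trip_s: float) -> float:
    """Distance (m) implied by the interval between emitted and reflected pulse."""
    return C * round_trip_s / 2.0

# A 1 ns round trip corresponds to roughly 0.15 m.
print(round(dtof_distance(1e-9), 3))  # 0.15
```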
iToF involves an inherent trade-off between maximum ranging distance and ranging accuracy. For example, if the target is 0.15 m away, the round-trip time of flight of the emitted and reflected light adds up to 1 ns. When the modulation frequency of the modulated light is 100 MHz (a period of 10 ns), a 1 ns time of flight corresponds to a phase difference of 36 degrees; if the modulation frequency is 10 MHz (a period of 100 ns), the same 1 ns corresponds to only 3.6 degrees. Since a 36-degree phase difference is easier to detect than a 3.6-degree one, a higher modulation frequency yields better ranging accuracy. However, a higher modulation frequency also limits the maximum ranging distance: at 100 MHz, a time of flight of 1 ns and one of 11 ns both produce a 36-degree phase difference, so the maximum ranging distance is bounded by the modulation period. For a 10 ns period, the maximum range is the distance corresponding to a time of flight of one period, i.e., about 1.5 m. Therefore, the main application scenario of iToF is short-range applications of a few meters (e.g., mobile phones). In contrast, dToF has no such conflict between ranging distance and ranging accuracy.
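The trade-off above can be checked numerically. A minimal sketch (function names are illustrative, not from the patent) reproduces the 36-degree and 3.6-degree figures and the roughly 1.5 m unambiguous range:

```python
C = 299_792_458.0  # speed of light in m/s

def itof_phase_deg(tof_s: float, f_mod_hz: float) -> float:
    """Phase shift (degrees) produced by a time of flight, wrapped to one period."""
    return (tof_s * f_mod_hz * 360.0) % 360.0

def itof_max_range_m(f_mod_hz: float) -> float:
    """Unambiguous range: the distance whose round trip equals one modulation period."""
    return C / (2.0 * f_mod_hz)

print(round(itof_phase_deg(1e-9, 100e6), 6))   # 36.0 degrees at 100 MHz
print(round(itof_phase_deg(1e-9, 10e6), 6))    # 3.6 degrees at 10 MHz
print(round(itof_phase_deg(11e-9, 100e6), 6))  # also 36.0: 1 ns and 11 ns are indistinguishable
print(round(itof_max_range_m(100e6), 2))       # about 1.5 m at a 10 ns period
```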
In practical implementation, dToF is much more difficult than iToF. The difficulty is that the optical signal to be detected is a pulse, so the detector's sensitivity to light must be very high. ToF sensor implementations can use SPADs (single-photon avalanche diodes). The operating point of a SPAD lies near the breakdown region of the diode, so a single incident photon generates a large number of electron-hole pairs, allowing the SPAD to detect very weak light pulses. However, the integration level of existing SPADs is low, so the 2D resolution of dToF sensors is poor. In addition, dToF must resolve very fine time differences and therefore requires extremely accurate sensing circuitry.
As noted above, existing structured-light and binocular schemes have a limited ranging range and are unsuitable for high-speed movement when the capture device is mounted on a moving platform (such as a vehicle or another motion scenario). The present invention therefore turns to ToF, in particular dToF arrays, for time-of-flight-based depth measurement. This measurement can be combined with a light source module that is lit by zones in order to reduce power consumption while acquiring high-precision images.
FIG. 1 shows a schematic composition of a depth data measurement head according to one embodiment of the present disclosure. As shown in fig. 1, the depth data measuring head 100 may include a light source module 110 and a ToF sensor 120, which may be fixed on a base for subsequent installation within a housing. Although not shown in the figure, the depth data measuring head 100 also includes a controller (which may, for example, be disposed on the back of the base) for controlling the operation of the light source module 110 and the ToF sensor 120.
In the present invention, the light source module 110 may project laser pulses to the measured space region by region, and the ToF sensor 120 may receive the return light from the measured space region by region and generate a sensing signal. Here, "region by region" means the component is put into operation one region at a time. For example, the light source module 110 may light only one region at a time for pulse emission, and the ToF sensor 120 may enable only one region at a time for return-light sensing. The sensing signal characterizes the time of flight of the light, from which the distance of objects in the measured space is calculated. Preferably, the ToF sensor 120 performs distance calculation using the dToF (direct time-of-flight) principle. The controller controls the light source module 110 and the ToF sensor 120 to each operate using corresponding partitions.
Here, operating the light source module 110 in partitions may mean that the module is fixedly or variably divided into N regions, and at each working moment only one region is turned on, i.e., only the light-emitting units in that region emit light pulses. Correspondingly, operating the ToF sensor 120 in partitions may mean that the sensor is fixedly or variably divided into P partitions, and at each working moment only one partition works, i.e., only the sensing units in that partition receive return light.
Since return light is received only where the scene is illuminated, the working partition of the ToF sensor must match the working partition of the light source module. To this end, operating the light source module 110 and the ToF sensor 120 each with corresponding partitions may include: the measured spatial region irradiated by the laser pulses projected by the corresponding partition of the light source module 110 covers the measured spatial region from which the corresponding partition of the ToF sensor 120 receives return light. In other words, the spatial area illuminated by the working partition of the light source module is at least as large as the area viewed by the sensor partition receiving the return light.
FIG. 2 shows a schematic view of a light source module working partition encompassing a ToF sensor working partition. As shown, the overall illumination range of the light source module may be comparable to the overall viewing range of the ToF sensor; for example, the illumination range (thick solid line) is slightly larger than the viewing range (thick dashed line). Because the light source module operates in partitions, at a given moment one region of the module emits a laser pulse that irradiates the range shown by the thin solid line. For the emitted pulse to be imaged, the viewing range corresponding to the active ToF sensor partition (thin dashed line) must lie within the pulse projection range (thin solid line).
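The covering condition in fig. 2 reduces to a simple interval-containment check. A hedged sketch follows; the coordinate convention and names are assumptions for illustration, not from the patent:

```python
def covers(illum: tuple, view: tuple) -> bool:
    """True if the illuminated interval fully contains the sensor viewing interval.

    Intervals are (start, end) along the scan axis, in any consistent unit.
    """
    return illum[0] <= view[0] and view[1] <= illum[1]

# Illumination slightly larger than the viewing range, as in fig. 2:
print(covers((-0.5, 10.5), (0.0, 10.0)))  # True
# Active sensor columns extending outside the projected pulse:
print(covers((0.0, 10.0), (-1.0, 10.0)))  # False
```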
More specifically, the light source module 110 may include a plurality of light-emitting regions, and the ToF sensor includes a sensing array. The controller controlling the light source module and the ToF sensor to each operate using a corresponding partition may then include: the sensor controls the plurality of light-emitting regions to be lit in turn, and controls the one or more columns in the sensing array corresponding to the lit spatial region to receive the return light.
When the viewing range corresponding to a ToF sensor partition is smaller than the illumination range corresponding to a light source module partition, as shown in fig. 2, each light-emitting region must be lit multiple times during one complete sensor imaging pass, so that every column of sensing units in the corresponding viewing area receives return light. For example, if the projection range of one light-emitting region corresponds to the viewing range of 50 columns of sensing units, and fewer than 50 columns can receive return light at a time (for example, 1 column, or 10 columns, at a time), the light-emitting region needs to project multiple times (50 times or 5 times, respectively) so that each corresponding column of sensing units can acquire return light.
In one embodiment, the plurality of light-emitting regions includes N regions, each corresponding to M columns in the sensing array, where M×N equals the total number of columns in the sensing array. The sensor may control the plurality of light-emitting regions to be lit in turn, and controlling the one or more columns of the sensing array corresponding to the lit spatial region to receive return light includes: the sensor controls each light-emitting region to emit M light pulses, and the M columns in the sensing array corresponding to that region receive the return light one by one.
For example, the ToF sensor may be a 400 × 100 sensing array, i.e., 400 columns of 100 sensing units each (P = 400). The light source module may include 8 vertically arranged light-emitting regions (N = 8), each corresponding to 50 columns in the sensing array (M = P/N = 50). In one embodiment, the controller may first operate the first light-emitting region, e.g., emitting 50 pulses at fixed intervals so that columns 1 to 50 of the sensor are exposed one by one; then operate the second region, emitting 50 pulses at fixed intervals so that columns 51 to 100 are exposed one by one; and so on, until the eighth region emits 50 pulses at fixed intervals so that columns 351 to 400 are exposed one by one. The light source module thus completes the exposure of all 400 columns of sensing units after 400 pulse emissions across the 8 regions. Compared with lighting the whole module at once, this saves 7/8 of the illumination power.
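The 8-region, 400-pulse schedule in this example can be sketched as follows (a minimal illustration of the described timing, not the patent's actual control logic):

```python
N_REGIONS = 8         # light-emitting regions, lit in turn
COLS_PER_REGION = 50  # sensor columns covered by each region (M = P / N)

def exposure_schedule():
    """Yield (region, column) pairs: one laser pulse exposes one column."""
    for region in range(N_REGIONS):
        for i in range(COLS_PER_REGION):
            yield region, region * COLS_PER_REGION + i

schedule = list(exposure_schedule())
print(len(schedule))   # 400 pulses expose all 400 columns
print(schedule[0])     # (0, 0): first region, first column
print(schedule[-1])    # (7, 399): eighth region, last column
```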
In some embodiments, the plurality of light-emitting regions in the light-emitting module may be independent single light-emitting regions; in other embodiments, each region may include a plurality of VCSEL (vertical cavity surface emitting laser) cells. Preferably, the light source module may include a VCSEL chip composed of multiple groups of VCSEL units that can be lit by zones. Fig. 3 shows an example of a VCSEL chip including a plurality of light-emitting regions. As shown, the chip includes 5 light-emitting regions (N = 5), each containing 32 light-emitting units, i.e., 160 units divided into 5 groups of 32. In operation the chip is usually placed vertically, so that each independently lit cell group illuminates a vertically elongated area of the measured space; combined with, for example, a ToF sensor working column by column, this enables depth measurement of photographic subjects at ranges up to 20 meters.
In certain embodiments, each group of VCSEL cells may be further subdivided. Figure 4 shows an example of a VCSEL chip with sub-groups within each partition; compared with fig. 3, fig. 4 additionally illustrates the sub-grouping of VCSEL cells in dashed lines. As in fig. 3, the chip includes 5 light-emitting regions (N = 5), each containing 32 light-emitting units, i.e., 160 units divided into 5 groups of 32. Unlike fig. 3, where all the cells within a group operate simultaneously, here the 32 cells of each group are further divided into two sub-groups (open circles and filled circles). The cells of the two sub-groups are preferably distributed in a staggered, interleaved pattern so that when one sub-group is lit alone it still illuminates the corresponding measured space uniformly. The controller may control any sub-group within a group of VCSEL cells to emit light pulses simultaneously. For example, under a 10-meter ranging requirement, at low battery, and/or under weak external interference, only one sub-group is projected at a time, i.e., only 16 light-emitting units are lit. Under a 20-meter ranging requirement, with sufficient battery and/or strong external interference, all sub-groups can be projected at once, i.e., all 32 units in the group are lit.
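The sub-group policy described above might be sketched as follows; the threshold values and decision inputs are illustrative assumptions based on the example, not a definitive implementation:

```python
def cells_to_light(range_req_m: float, battery_low: bool, strong_interference: bool) -> int:
    """Cells lit per 32-cell group: one 16-cell sub-group or both sub-groups.

    Assumed policy: a short ranging requirement combined with low battery or
    weak interference allows lighting a single interleaved sub-group.
    """
    if range_req_m <= 10.0 and (battery_low or not strong_interference):
        return 16  # one of the two interleaved 16-cell sub-groups suffices
    return 32      # light both sub-groups, e.g. for the 20 m requirement

print(cells_to_light(10.0, battery_low=True, strong_interference=False))  # 16
print(cells_to_light(20.0, battery_low=False, strong_interference=True))  # 32
```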
The light source module may further include a diffusion sheet in addition to the VCSEL chip. Fig. 5 shows an example composition of the light source module. As shown, the light source module 510 may include a VCSEL chip 511 and a diffusion sheet 512 positioned in the exit direction of the chip to make the exit light distribution more uniform. The module may further include a package housing for fixing the chip 511 and the diffusion sheet 512. In addition, although not shown, the module may include a power detection element for detecting whether the VCSEL chip is operating normally.
As previously mentioned, the ToF sensor obtains a high-resolution distance/depth distribution of the measured scene based on direct time of flight, through a high-density ToF sensing pixel array. In a preferred embodiment, the ToF sensor may be a Geiger-mode-based sensing array.
For ease of understanding, the "Geiger mode" is explained here. An avalanche photodiode (APD) differs from a conventional p-n junction photodiode in that it can withstand a much higher bias voltage. When a photon is absorbed by an APD, an electron-hole pair is generated; these are the primary carriers. Accelerated by the strong electric field produced by the high bias voltage, the primary electrons gain enough energy to collide with the lattice and generate additional electron-hole pairs, losing some kinetic energy in the process; this is known as impact ionization. The primary and secondary carriers are then accelerated again by the strong field, creating still more electron-hole pairs: the "avalanche" phenomenon, in which the current grows exponentially. Over several generations, the generation rate and the absorption rate of electron-hole pairs reach a balance. If the bias voltage of the APD is below its breakdown voltage, the absorption rate exceeds the generation rate, so the carrier population decays; in this case the average photocurrent generated by the APD is proportional to the incident light, with the proportionality coefficient being the gain factor M. This is called the linear operating mode, and the linear relationship can be used to measure the intensity of the incident light signal.
However, if the bias voltage of the APD is higher than its breakdown voltage, the rate at which impact ionization generates electron-hole pairs greatly increases and exceeds their absorption rate, so the current grows exponentially with time: an avalanche occurs and produces a current pulse. The growing photocurrent in turn weakens the strong electric field created by the high bias voltage, which lowers the avalanche rate, so the photocurrent decreases until an equilibrium is reached, after which it no longer changes. This equilibrium arises mainly because the equivalent resistance in series with the APD provides negative feedback: an increase in photocurrent increases the voltage drop across the equivalent resistance, offsetting part of the bias voltage, so the voltage across the APD falls, the avalanche rate drops, and the photocurrent decreases; the decreased photocurrent then reduces the voltage drop across the resistance, the reverse process occurs, and the photocurrent rises again, so that after some time a steady state is established. If the photocurrent stabilizes above several hundred microamperes, it remains constant thereafter, i.e. the equilibrium state persists and the device no longer responds to incident photons. This mode of operation is the Geiger mode. Its most notable characteristic is that it can respond to a single-photon event; its biggest difference from the linear mode is that it only registers the presence of photons and cannot distinguish their number. This is the defining characteristic of the GAPD (Geiger-mode avalanche photodiode).
To be able to respond to the next photon event, the APD must be quenched by a quenching circuit connected to it and then recharged before it can operate normally again. Ideally, the APD does not respond to any photons from the onset of quenching until recharging is complete; this interval is referred to as the "blind time" (also known as the dead time).
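The negative-feedback balance and the recharge interval described above can be sketched with two back-of-the-envelope formulas (the component values are illustrative assumptions, and `latch_current` / `dead_time` are hypothetical helper names, not part of this disclosure):

```python
def latch_current(v_bias, v_breakdown, r_quench):
    """Steady-state current at which the drop across the quench resistor
    exactly offsets the overvoltage: I = (V_bias - V_bd) / R_q.
    This is the equilibrium the text describes; if it settles above a
    few hundred microamperes the diode latches and stops responding,
    so passive quenching uses a large R_q to keep it small."""
    return (v_bias - v_breakdown) / r_quench

def dead_time(r_quench, c_diode, n_tau=5):
    """Rough blind/dead time: after quenching, the diode capacitance
    recharges through R_q, taking on the order of a few RC constants
    to return near full overvoltage."""
    return n_tau * r_quench * c_diode

# Illustrative numbers: 3 V overvoltage, 100 kOhm quench resistor,
# 100 fF diode capacitance
i_eq = latch_current(28.0, 25.0, 100e3)   # 30 microamperes
t_blind = dead_time(100e3, 100e-15)        # ~50 ns
```

The dead time is what motivates the column-by-column scheduling later in this disclosure: a cell that has just fired cannot see the next return pulse until it has recharged.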
A SPAD (single-photon avalanche diode) is a photodiode operated in the Geiger mode; like a photon-triggered switch, it is either in the "on" state or the "off" state. The ToF sensor preferably used in the present invention is a silicon photomultiplier (SiPM) composed of a plurality of independent SPAD sensors, each with its own quenching resistor, thereby overcoming the drawback that a single SPAD cannot count multiple photons simultaneously. Since the present invention uses a SiPM that receives the return light column by column, efficient and fast depth data calculation can be achieved under the constraints of computing power and blind time.
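Why many SPAD microcells in parallel restore photon-number resolution can be illustrated with the widely used fired-cell estimate for a SiPM (the cell count and photon-detection efficiency below are illustrative assumptions; `expected_fired_cells` is a hypothetical helper name):

```python
import math

def expected_fired_cells(n_photons, n_cells, pde):
    """Expected number of SiPM microcells that fire when n_photons fall
    uniformly over n_cells with photon-detection efficiency pde:
        N_fired = N * (1 - exp(-pde * n / N)).
    With n_cells = 1 (a lone SPAD) the output saturates at one count no
    matter how many photons arrive; with many cells the response is
    nearly linear in the photon number at low flux."""
    return n_cells * (1.0 - math.exp(-pde * n_photons / n_cells))

lone_spad = expected_fired_cells(1000, 1, 0.4)      # saturates near 1
sipm = expected_fired_cells(10, 10_000, 0.4)        # ~4, roughly linear
```

This is the sense in which the SiPM "overcomes the drawback" of the single SPAD: the summed output of many binary cells approximates a photon-number measurement.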
In addition, it should be understood that the laser pulses emitted by the light source module of the present invention lie outside the visible band, for example near-infrared pulses, and can therefore be combined with a band-pass filter to filter out interference from extraneous ambient light.
The present invention can also be implemented as a depth data measuring apparatus comprising a depth data measuring head as described above and a processor. The processor is configured to: perform distance calculation based on the sensing signals generated by the ToF sensor region by region; and synthesize the calculation results for the plurality of ToF sensor partitions into depth data of the measured space and output the depth data.
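The per-partition distance calculation and the synthesis into one depth frame can be sketched as follows (a minimal illustration assuming per-pixel round-trip times are already extracted from the sensing signals; function names are hypothetical):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_distance(round_trip_s):
    """Direct time-of-flight: the pulse travels to the object and back,
    so the distance is d = c * t / 2."""
    return C * round_trip_s / 2.0

def synthesize_depth(column_times):
    """Stitch per-column (per-partition) results into one depth map.
    column_times[i][j] is the round-trip time measured by pixel j of
    sensor column i; although the columns are exposed at different
    moments, their distances are assembled into a single frame."""
    return [[tof_to_distance(t) for t in col] for col in column_times]

# Illustrative frame of two columns, two pixels each (times for 1-4 m)
frame = synthesize_depth([[2 * 1.0 / C, 2 * 2.0 / C],
                          [2 * 3.0 / C, 2 * 4.0 / C]])
```

The same conversion underlies every embodiment here; only the scheduling of which column is measured when changes.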
In some embodiments, the measuring apparatus may have a separate controller and processor, while in other embodiments the processor may integrate all or part of the functionality of the controller.
The depth data measuring apparatus of the present invention is particularly suitable for depth data measurement on a moving device. To this end, the present invention may also be embodied as a moving device control system comprising: the depth data measuring apparatus described above, for collecting return light from the measured space during movement of the moving device and generating a depth data output; a control unit for generating a control signal based on the depth data output; and an action unit for changing or maintaining the motion of the moving device based on the control signal.
In one embodiment, the moving device may be an automated or unmanned device, such as an unmanned vehicle, and the depth data measuring apparatus of the present invention may be implemented as an on-board radar that collects and calculates the distances of objects in the environment in real time, so that the vehicle can take corresponding actions (e.g., evasion, deceleration, etc.). In another embodiment, the moving device may be handling equipment, such as a handling robot in a logistics warehouse, or automated delivery equipment that moves through a wider space. The depth data measuring apparatus can be installed on such equipment to help it complete cargo handling and daily operations.
The present invention can also be implemented as a depth data measuring method. The method may be carried out by the depth data measuring head disclosed above, or by the controller and/or processor of the measuring apparatus. Fig. 6 shows an exemplary flow chart of a depth data measuring method according to one embodiment of the present invention.
In step S610, the light source module is controlled to project laser pulses to the measured space region by region. In step S620, a time-of-flight (ToF) sensor is controlled to receive the return light of the measured space region by region and generate a sensing signal, wherein the light source module and the ToF sensor are each controlled to operate using a corresponding partition, and the sensing signal characterizes the time of flight of the light so as to calculate a distance to a photographed object within the measured space.
The depth data measuring method of the present invention may perform the respective operations described above in connection with the measuring head and the measuring apparatus. In a preferred embodiment, the ToF sensor is a Geiger-mode based sensing array with MxN columns, and the light source module is arranged as N light emitting areas. In this case, controlling the light source module to project laser pulses to the measured space region by region may include: controlling each of the N light emitting areas to emit M pulses one by one. Controlling the ToF sensor to receive the return light of the measured space region by region and generate sensing signals includes: controlling the ToF sensor to receive the MxN pulses column by column and generate sensing signals column by column.
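The pairing of the N light-emitting areas with the MxN sensor columns can be sketched as a simple schedule (a left-to-right order is assumed purely for illustration, and `scan_schedule` is a hypothetical helper name; the scheme itself does not require this order):

```python
def scan_schedule(n_zones, m_pulses_per_zone):
    """Build the (zone, sensor_column) pairing for one frame: each of
    the N light-emitting zones fires M pulses, and for each pulse the
    matching one of its M sensor columns is exposed, so all M*N
    columns are read exactly once per frame."""
    schedule = []
    for zone in range(n_zones):
        for pulse in range(m_pulses_per_zone):
            column = zone * m_pulses_per_zone + pulse
            schedule.append((zone, column))
    return schedule

# Matching the example of Fig. 3: 5 zones, 400 columns => M = 80
frame_schedule = scan_schedule(5, 80)
```

Because each pulse only exposes the column that its zone illuminates, the remaining columns stay idle, which is the power-saving property the description attributes to partitioned operation.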
It should be understood that, for simplicity of design and control, the light-emitting partitions shown in Fig. 3 may be put into operation sequentially from left to right or from right to left (i.e., partitions 1, 2, 3, 4 and 5 each emit M pulses in turn), and the ToF sensor may be exposed column by column in rolling-shutter fashion from left to right or from right to left (i.e., the columns of sensing units from the 1st to the 400th operate one after another); the invention is, however, not limited to this order. In other embodiments, the partitions need not operate in the order 1, 2, 3, 4, 5 or 5, 4, 3, 2, 1, but may follow any order, such as 1, 3, 5, 2, 4 or 4, 3, 5, 1, 2; the columns need not be exposed in adjacent succession; and it is not even necessary for all sensor columns corresponding to one light-emitting partition to be exposed before the projection and corresponding exposure of the next partition begins, as long as every column of the ToF sensor is exposed once during the acquisition of a single sensed image. Furthermore, although the above description is made with vertical light-emitting partitions and sensing columns, in other embodiments horizontal light-emitting partitions with row-by-row exposure may equally be implemented, and the present invention is not limited in this respect.
In addition, the depth data measuring method may further include: performing distance calculation based on the sensing signals generated by the ToF sensor region by region; and synthesizing the calculation results of the plurality of ToF sensor partitions into depth data of the measured space and outputting the depth data.
Preferably, the ToF sensor and the light source module belong to a depth data measuring apparatus mounted on a moving device. To this end, the measuring method may further include: generating a control signal based on the depth data output; and changing or maintaining the motion of the moving device based on the control signal.
The depth data measurement scheme according to the present invention has been described in detail above with reference to the accompanying drawings. The scheme uses a ToF sensor, in particular a SiPM operating in rolling-shutter mode, to achieve high-precision depth data measurement based on direct time of flight. Combined with a light-emitting module that operates partition by partition, the scheme ensures proper exposure of the corresponding sensing region while reducing power consumption.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (16)

1. A depth data measurement head comprising:
a light source module for projecting laser pulses to a measured space region by region;
a time-of-flight (ToF) sensor for receiving the return light of the measured space region by region and generating a sensing signal, the sensing signal characterizing the time of flight of the light so as to calculate a distance to a photographed object within the measured space; and
a controller for controlling the light source module and the ToF sensor to each operate using a corresponding partition.
2. The depth data measurement head of claim 1, wherein the light source module and the ToF sensor each operating using a corresponding partition comprises:
the region of the measured space illuminated by the laser pulses projected from a partition of the light source module covering the region of the measured space from which the corresponding partition of the ToF sensor receives return light.
3. The depth data measurement head of claim 2, wherein the light source module includes a plurality of light emitting areas and the ToF sensor includes a Geiger-mode based sensing array, and
controlling the light source module and the ToF sensor to each operate using a corresponding partition comprises:
the controller controlling the plurality of light emitting areas to be illuminated in turn and controlling one or more columns of the sensing array corresponding to the illuminated spatial area to receive return light.
4. The depth data measurement head of claim 3, wherein the plurality of light emitting areas comprises N light emitting areas, each light emitting area corresponding to M columns of the sense array, wherein MxN is equal to a total number of columns of the sense array,
the controller controlling the plurality of light emitting areas to be illuminated in turn, and controlling one or more columns of the sensing array corresponding to the illuminated spatial area to receive the return light comprises:
the controller controlling each light emitting area to emit M light pulses, with the M columns of the sensing array corresponding to that light emitting area receiving the return light one by one.
5. The depth data measurement head of claim 3, wherein the plurality of light emitting areas each comprise a plurality of VCSEL cells.
6. The depth data measurement head of claim 5, wherein the light source module comprises:
a VCSEL chip composed of a plurality of groups of VCSEL cells that can be lit partition by partition.
7. The depth data measurement head of claim 6, wherein the light source module further comprises:
a diffusion sheet arranged in the exit direction of the VCSEL chip; and/or
a power detection element for detecting whether the VCSEL chip operates normally.
8. The depth data measurement head of claim 6, wherein each group of VCSEL cells comprises:
at least two VCSEL cell sub-groups,
wherein the controller is capable of controlling any of the VCSEL cell sub-groups in the group to emit light pulses simultaneously.
9. The depth data measurement head of claim 3, wherein the ToF sensor comprises:
a silicon photomultiplier (SiPM) that receives the return light column by column.
10. A depth data measuring apparatus comprising:
the depth data measurement head of any one of claims 1-9;
a processor configured to:
perform distance calculation based on the sensing signals generated by the ToF sensor region by region; and
synthesize the calculation results for the plurality of ToF sensor partitions into depth data of the measured space and output the depth data.
11. The depth data measurement device of claim 10, wherein the processor integrates the functions of the controller.
12. A motion device control system comprising:
a depth data measuring apparatus as claimed in claim 10 or 11, for collecting return light from the measured space during movement of the moving device and generating a depth data output;
a control unit for generating a control signal based on the depth data output; and
an action unit for changing or maintaining the motion of the moving device based on the control signal.
13. A depth data measurement method, comprising:
controlling a light source module to project laser pulses to a measured space region by region; and
controlling a time-of-flight (ToF) sensor to receive the return light of the measured space region by region and generate a sensing signal,
wherein the light source module and the ToF sensor are each controlled to operate using a corresponding partition, and the sensing signal characterizes the time of flight of the light so as to calculate a distance to a photographed object in the measured space.
14. The method of claim 13, wherein the ToF sensor is a Geiger-mode based sensing array with MxN columns and the light source module is arranged as N light emitting areas,
wherein controlling the light source module to project laser pulses to the measured space region by region comprises:
controlling each of the N light emitting areas to emit M pulses one by one,
and controlling the ToF sensor to receive the return light of the measured space region by region and generate sensing signals comprises:
controlling the ToF sensor to receive the MxN pulses column by column and generate sensing signals column by column.
15. The method of claim 13, further comprising:
performing distance calculation based on the sensing signals generated by the ToF sensor region by region; and
synthesizing the calculation results of the plurality of ToF sensor partitions into depth data of the measured space and outputting the depth data.
16. The method of claim 15, wherein the ToF sensor and the light source module belong to a depth data measuring apparatus mounted on a moving device,
the method further comprises the following steps:
generating a control signal based on the depth data output; and
changing or maintaining the motion of the moving device based on the control signal.
CN202110064415.5A 2021-01-18 2021-01-18 Depth data measurement head, measurement equipment, control system and corresponding methods Active CN114814878B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110064415.5A CN114814878B (en) 2021-01-18 2021-01-18 Depth data measurement head, measurement equipment, control system and corresponding methods


Publications (2)

Publication Number Publication Date
CN114814878A true CN114814878A (en) 2022-07-29
CN114814878B CN114814878B (en) 2026-01-09

Family

ID=82524507

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110064415.5A Active CN114814878B (en) 2021-01-18 2021-01-18 Depth data measurement head, measurement equipment, control system and corresponding methods

Country Status (1)

Country Link
CN (1) CN114814878B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106405572A (en) * 2016-11-10 2017-02-15 西安交通大学 Long distance high resolution laser active imaging device and method based on spatial coding
US20190018106A1 (en) * 2017-07-11 2019-01-17 4Sense, Inc. Light-Source Array for a Time-of-Flight Sensor and Method of Operation of Same
US20190018119A1 (en) * 2017-07-13 2019-01-17 Apple Inc. Early-late pulse counting for light emitting depth sensors
US20190310370A1 (en) * 2018-04-09 2019-10-10 Sick Ag Optoelectronic sensor and method for detection and distance determination of objects
US20190324143A1 (en) * 2018-04-20 2019-10-24 Sick Ag Optoelectronic sensor and method of distance determination
CN111598072A (en) * 2020-05-12 2020-08-28 深圳阜时科技有限公司 Image sensing device and electronic apparatus
CN111854625A (en) * 2019-04-29 2020-10-30 上海图漾信息科技有限公司 Depth data measuring head, measuring device and measuring method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"激光应用与全息术和光学信息处理", 电子科技文摘, no. 09, 28 September 2006 (2006-09-28) *

Also Published As

Publication number Publication date
CN114814878B (en) 2026-01-09

Similar Documents

Publication Publication Date Title
CN114424086B (en) Processing system for LIDAR measurement
US10921454B2 (en) System and method for determining a distance to an object
KR102494430B1 (en) System and method for determining distance to object
CN110927734B (en) Laser radar system and anti-interference method thereof
US11543501B2 (en) Method for subtracting background light from an exposure value of a pixel in an imaging array, and pixel for use in same
CN101449181B (en) Distance measuring method and distance measuring instrument for detecting the spatial dimension of a target
CN112731425B (en) Histogram processing method, distance measurement system and distance measurement equipment
US10852400B2 (en) System for determining a distance to an object
KR20200085297A (en) Flight time detection using an addressable array of emitters
US20220334253A1 (en) Strobe based configurable 3d field of view lidar system
US20220326358A1 (en) Method and device for determining distances to a scene
CN216285732U (en) Depth data measuring head and motion device control system
CN114814878B (en) Depth data measurement head, measurement equipment, control system and corresponding methods
US20240192375A1 (en) Guided flash lidar
US20230243928A1 (en) Overlapping sub-ranges with power stepping
WO2021144340A1 (en) Apparatus and method for detecting two photon absorption
CN216211121U (en) Depth information measuring device and electronic apparatus
CN113075672A (en) Ranging method and system, and computer readable storage medium
CN220584396U (en) Solid-state laser radar measurement system
Hallman et al. 3-D Range Imaging Using Stripe-Like Illumination and SPAD-Based Pulsed TOF Techniques
US20240393438A1 (en) HYBRID LiDAR SYSTEM
CN118786362A (en) Overlapping subranges with power steps
CN113822875A (en) Depth information measuring device, full-scene obstacle avoidance method and electronic equipment
Cottin et al. Active 3D camera design for target capture on Mars orbit

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Country or region after: China

Address after: 210046 Jiangsu Province Nanjing City Nanjing Economic and Technological Development Zone Hengguang Road No. 2 Intelligent Manufacturing Pilot Plant Base Building No. 2 Room 101

Applicant after: Nanjing Tuyang Technology Co.,Ltd.

Address before: 201203 Shanghai Pudong New Area BiBo Road 635, Legend Square 302

Applicant before: SHANGHAI TUYANG INFORMATION TECHNOLOGY Co.,Ltd.

Country or region before: China

GR01 Patent grant