TWI858694B - Lidar system and resolution improvement method thereof - Google Patents
Lidar system and resolution improvement method thereof
- Publication number
- TWI858694B (application TW112117055A)
- Authority
- TW
- Taiwan
- Prior art keywords
- laser
- lens
- receiver
- sub
- lidar system
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/487—Extracting wanted echo signals, e.g. pulse detection
- G01S7/4876—Extracting wanted echo signals, e.g. pulse detection by removing unwanted signals
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
Description
本發明係關於一種光達系統,特別關於一種解析度提升的光達系統。 The present invention relates to a lidar system, and in particular to a lidar system with improved resolution.
近年,光達(Light Detection and Ranging,LiDAR)技術被廣泛使用於汽車的自動/半自動駕駛及安全警示。光達的主要部件包括:感測器(例如直接飛時測距(direct time of flight,D-ToF)感測器)、雷射光源、掃描器及資料處理器。現行的光達掃描方式可具有多種形式,例如以光學相位陣列(optical phased array,OPA)或繞射光學元件(diffractive optical element,DOE)投射小區塊光點,透過微機電系統(MEMS)微振鏡或多角反射鏡(polygon mirror)進行蛇狀來回掃描或斜向掃描一個大區塊,或透過DOE或多點線性排列光源或多次反射擴束投射線狀光束,以機械轉動方式橫向掃描一個大區塊等。透過上述掃描方式,感測器可接收反射後的光訊號。 In recent years, LiDAR (Light Detection and Ranging) technology has been widely used in the autonomous/semi-autonomous driving and safety warning of automobiles. The main components of LiDAR include: sensors (such as direct time of flight (D-ToF) sensors), laser light sources, scanners and data processors. Current lidar scanning methods can have many forms, such as using an optical phased array (OPA) or a diffractive optical element (DOE) to project a small area of light, using a micro-electromechanical system (MEMS) micro-vibration mirror or a polygon mirror to perform a serpentine back-and-forth scan or oblique scan of a large area, or using a DOE or a multi-point linear arrangement of light sources or multiple reflections to expand the beam and project a linear beam, and use mechanical rotation to scan a large area horizontally. Through the above scanning methods, the sensor can receive the reflected light signal.
然而,以上述掃描方式進行的雷射光源感測,其長寬比(screen ratio)較小,因此必須持續以較高頻率接收反射的光訊號。相較之下,快閃式光達(flash LiDAR)透過一次投射大面積光點,可在系統運算需求及整體耗能較低的情形下達成高頻率、高幀數的感測。而在光點密度固定的情形下,若能進一步提升快閃式光達的成像解析度,則可使成像更加清晰,增進距離測量的準確性,進一步增進行車安全。因此,亟需一種光達系統,其解析度相較於光點密度相同的現有技術更加提升,以正確判斷距離,維護行車安全。此外,亦亟需一種解析度提升方法,使光達系統的解析度相較於光點密度相同的現有技術更加提升,以正確判斷距離,維護行車安全。 However, the laser light source sensing performed in the above scanning method has a small aspect ratio (screen ratio), so it is necessary to continuously receive the reflected light signal at a higher frequency. In contrast, flash LiDAR can achieve high-frequency, high-frame sensing with lower system computing requirements and overall energy consumption by projecting a large area of light spots at one time. In the case of a fixed light spot density, if the imaging resolution of the flash LiDAR can be further improved, the imaging can be made clearer, the accuracy of distance measurement can be improved, and driving safety can be further improved. Therefore, there is an urgent need for a LiDAR system with a higher resolution than existing technologies with the same light spot density, so as to correctly judge the distance and maintain driving safety. In addition, there is an urgent need for a resolution enhancement method to improve the resolution of the lidar system compared to existing technologies with the same light spot density, so as to accurately judge the distance and maintain driving safety.
本發明的主要目的在於提供一種光達系統,其解析度相較於光點密度相同的現有技術更加提升,以正確判斷距離,維護行車安全。 The main purpose of the present invention is to provide a lidar system with a higher resolution than the existing technology with the same light spot density, so as to accurately judge the distance and maintain driving safety.
為了達成前述的目的,本發明提供一種光達系統,包括:一微控制器;一雷射光源,耦接至該微控制器;一鏡頭模組;以及一接收器,耦接至該微控制器,其中,該雷射光源發射多個不同波長的雷射光,並包括一光耦合器及一光纖,該光耦合器將該等雷射光光學耦合為一準直光訊號,經由該光纖傳輸。該鏡頭模組包括一雷射分光鏡模組及一接收器鏡頭模組,該雷射分光鏡模組接收來自該雷射光源發射的雷射光,並將該雷射光繞射為多束繞射光,該等繞射光向一目標發射。該雷射分光鏡模組包括一繞射光學元件及一準直鏡組。該接收器鏡頭模組接收該等繞射光接觸該目標後反射的一反射光訊號,並向該接收器發出該反射光訊號。該雷射光源以一週期時間射出一脈衝訊號。該微控制器控制該接收器在每一週期時間內的一感測器快門時間內開啟和一重置時間內關閉。在一幀的一子幀的一感測器快門時間內,該接收器的複數個像素接收該等不同波長的雷射光的至少一反射光訊號,取得複數個子幀的環境影像,並以該等反射光訊號所代表的距離值為該子幀的該等像素的距離值。該微控制器融合該複數個子幀的環境影像的該等像素的距離值成為該幀的一最終距離值。 In order to achieve the above-mentioned purpose, the present invention provides a lidar system, including: a microcontroller; a laser light source coupled to the microcontroller; a lens module; and a receiver coupled to the microcontroller, wherein the laser light source emits a plurality of laser lights of different wavelengths and includes an optical coupler and an optical fiber; the optical coupler optically couples the laser lights into a collimated optical signal, which is transmitted through the optical fiber. The lens module includes a laser spectroscope module and a receiver lens module; the laser spectroscope module receives the laser light emitted from the laser light source and diffracts the laser light into a plurality of diffracted beams, which are emitted toward a target. The laser spectroscope module includes a diffractive optical element and a collimating lens assembly. The receiver lens module receives a reflected light signal reflected after the diffracted beams contact the target, and sends the reflected light signal to the receiver. The laser light source emits a pulse signal with a cycle time. The microcontroller controls the receiver to turn on during a sensor shutter time and off during a reset time within each cycle time.
In a sensor shutter time of a subframe of a frame, a plurality of pixels of the receiver receive at least one reflected light signal of the laser light of different wavelengths, obtain environmental images of a plurality of subframes, and use the distance values represented by the reflected light signals as the distance values of the pixels of the subframe. The microcontroller fuses the distance values of the pixels of the multiple sub-frames of the environmental image into a final distance value of the frame.
為了達成前述的目的,本發明提供一種用於上述的光達系統的解析度提升方法,該方法包括:將該繞射光學元件設置為可動件,具有旋轉及/或往復移動的功能。在複數個旋轉角度或往復移動位置條件下,取複數子幀環境影像。該等反射光訊號在環境影像中的每一個像素代表一個子距離值,每一子幀環境影像由複數個子距離值組成具深度資訊的三維影像。在汰除異常子幀後,再將其餘複數子幀環境影像融合,若同一像素具複數個子距離值,則進行平均或擇一,若同一像素僅具有一個子距離值,則選擇該子距離值,若該像素不具距離值,則選擇測距範圍中的最大值或最小值(0),成為該幀的三維影像的該最終距離值。 In order to achieve the above-mentioned purpose, the present invention provides a method for improving the resolution of the above-mentioned lidar system, the method comprising: setting the diffraction optical element as a movable part with the function of rotation and/or reciprocating movement. Under the conditions of multiple rotation angles or reciprocating movement positions, multiple sub-frame environmental images are obtained. Each pixel of the reflected light signals in the environmental image represents a sub-range value, and each sub-frame environmental image is composed of multiple sub-range values to form a three-dimensional image with depth information. After eliminating abnormal sub-frames, the remaining multiple sub-frame environmental images are fused. If the same pixel has multiple sub-range values, they are averaged or selected. If the same pixel has only one sub-range value, the sub-range value is selected. If the pixel has no range value, the maximum or minimum value (0) in the range measurement range is selected as the final range value of the three-dimensional image of the frame.
本發明的功效在於,在光點密度固定的情形下,進一步提升快閃式光達的成像解析度,使成像更加清晰,增進距離測量的準確性,進一步增進行車安全。 The effect of the present invention is that, under the condition of fixed light spot density, the imaging resolution of the flash lidar is further improved, the imaging is clearer, the accuracy of distance measurement is improved, and driving safety is further improved.
100:光達系統 100: LiDAR system
101:微控制器 101:Microcontroller
102:雷射光源 102: Laser light source
104:雷射光 104: Laser light
106:鏡頭模組 106: Lens module
108:接收器鏡頭模組 108: Receiver lens module
110:雷射分光鏡模組 110: Laser spectrometer module
112:接收器 112: Receiver
120:目標 120: Target
122:像場 122: Image field
124:視場 124: Field of view
126:反射光 126: Reflected light
202:凹透鏡 202: Concave lens
204:凸透鏡 204: Convex lens
206:繞射光學元件 206:Diffraction optical element
208:凹面鏡 208: Concave mirror
210:準直鏡組 210: Collimator lens set
221,222,223,224:雷射光 221,222,223,224:Laser light
230:光耦合器 230: Optocoupler
240:光纖 240: Optical fiber
a:狹縫寬度 a: Seam width
θ1:繞射角 θ 1 : diffraction angle
L:投射距離 L: Projection distance
y1,y2,...-y1,-y2,...:暗紋位置 y 1 ,y 2 ,...-y 1 ,-y 2 ,...: dark pattern position
λ1,λ2,λ3:波長 λ 1 ,λ 2 ,λ 3 : wavelength
302:雷射光 302: Laser light
304:繞射光學元件 304:Diffraction optical element
306a,306b,306c:點雲 306a,306b,306c: Point cloud
310a,310b:繞射光學元件 310a, 310b: diffraction optical element
502:準直鏡組 502: Collimator lens set
5021:凹透鏡 5021: Concave lens
5022:凸透鏡 5022: Convex lens
504:繞射光學元件 504:Diffraction optical element
506:雷射光 506:Laser light
508:繞射光 508:Diffuse light
512:準直鏡組 512: Collimator lens set
5121:凹透鏡 5121: Concave lens
5122:凸透鏡 5122: Convex lens
514:繞射光學元件 514:Diffraction optical element
516:雷射光 516:Laser light
518:繞射光 518:Diffuse light
520:凹面鏡 520: Concave mirror
PW:脈衝寬度 PW: Pulse Width
T:週期時間 T: cycle time
SS:感測器快門時間 SS:Sensor shutter time
R:重置時間 R: Reset time
Ts:開始時間 Ts: start time
Tl:結束時間 Tl: End time
800:方法 800:Method
802,804,806:步驟 802,804,806: Steps
901,902,903:雷射光 901,902,903:Laser light
911,912,913:雷射光 911,912,913:Laser light
921:雷射光 921: Laser light
A,B,C,D,E:取樣區塊 A,B,C,D,E: Sampling blocks
圖1是本發明的光達系統的示意圖;圖2A及圖2B是圖1所示部分元件的內部結構示意圖;圖2C及圖2D為單狹縫繞射示意圖;圖2E顯示不同波長的雷射光產生的點雲相互疊合的情形;圖3A及圖3B是顯示繞射光學元件的運作情形;圖4顯示本發明在不同距離下的運作情形;圖5A是依據本發明的一準直鏡組配置方式;圖5B是依據本發明的另一準直鏡組配置方式;圖6是依據本發明的一範例時序圖;圖7是依據本發明的另一範例時序圖;圖8是依據本發明的解析度提升方法的流程圖;圖9A、圖9B及圖9C是本發明的範例時序圖;圖10A為真實的環境影像;圖10B、圖10C及圖10D為圖10A的取樣範例;以及圖11A、圖11B、圖11C、圖11D、圖11E及圖11F為同一幀中不同子幀的取樣情形。 FIG1 is a schematic diagram of a lidar system of the present invention; FIG2A and FIG2B are schematic diagrams of the internal structure of some components shown in FIG1; FIG2C and FIG2D are schematic diagrams of single slit diffraction; FIG2E shows the overlapping of point clouds generated by laser lights of different wavelengths; FIG3A and FIG3B show the operation of the diffraction optical element; FIG4 shows the operation of the present invention at different distances; FIG5A is a collimator lens group configuration according to the present invention; FIG5B is another collimator lens group configuration according to the present invention method; FIG. 6 is an example timing diagram according to the present invention; FIG. 7 is another example timing diagram according to the present invention; FIG. 8 is a flow chart of the resolution enhancement method according to the present invention; FIG. 9A, FIG. 9B and FIG. 9C are example timing diagrams of the present invention; FIG. 10A is a real environment image; FIG. 10B, FIG. 10C and FIG. 10D are sampling examples of FIG. 10A; and FIG. 11A, FIG. 11B, FIG. 11C, FIG. 11D, FIG. 11E and FIG. 11F are sampling situations of different subframes in the same frame.
以下配合圖式及元件符號對本發明的實施方式做更詳細的說明,俾使熟習該項技藝者在研讀本說明書後能據以實施。 The following is a more detailed description of the implementation of the present invention with the help of diagrams and component symbols, so that those who are familiar with the technology can implement it accordingly after reading this manual.
本發明提供一種解析度提升的光達系統,以及該光達系統的解析度提升方法。透過投射多種不同波長的雷射光,並對繞射光學元件進行旋轉或振盪,可在光點密度不變的情形下,增加成像的解析度。 The present invention provides a lidar system with improved resolution and a method for improving the resolution of the lidar system. By projecting laser lights of multiple different wavelengths and rotating or oscillating the diffraction optical element, the resolution of the imaging can be increased while the light spot density remains unchanged.
請參閱圖1,本發明提供一種光達系統100,包括微控制器(MCU)101、雷射光源102、鏡頭模組106及接收器112。鏡頭模組106包括接收器鏡頭模組108及雷射分光鏡模組110。雷射光源102及接收器112耦接至微控制器101。
Referring to FIG. 1, the present invention provides a lidar system 100, including a microcontroller (MCU) 101, a laser light source 102, a lens module 106, and a receiver 112. The lens module 106 includes a receiver lens module 108 and a laser spectroscope module 110. The laser light source 102 and the receiver 112 are coupled to the microcontroller 101.
為了測量目標120與光達系統100之間的距離,首先,微控制器101控制雷射光源102發出雷射光104。接著,雷射分光鏡模組110將雷射光104散射為複數個光點,該等光點分布於像場(field of image,FOI)122之內,且該像場122完全涵蓋目標120。隨後,該等光點在接觸目標120後,反射為複數個反射光126,該等反射光分布於視場(field of view,FOV)124之內。接收器鏡頭模組108接收反射光126,並傳送訊號至接收器112。接收器112將接收的訊號傳送至微控制器101,進行後續影像分析。
In order to measure the distance between the target 120 and the lidar system 100, first, the microcontroller 101 controls the laser light source 102 to emit laser light 104. Then, the laser spectroscope module 110 scatters the laser light 104 into a plurality of light spots, which are distributed within a field of image (FOI) 122 that fully covers the target 120. After contacting the target 120, the light spots are reflected as a plurality of reflected lights 126, which are distributed within a field of view (FOV) 124. The receiver lens module 108 receives the reflected lights 126 and transmits a signal to the receiver 112. The receiver 112 transmits the received signal to the microcontroller 101 for subsequent image analysis.
請參閱圖2A,圖1中的接收器鏡頭模組108包括由至少一凹透鏡202及至少一凸透鏡204組成的透鏡模組,該凹透鏡202及該凸透鏡204形成一聚光鏡組,可聚集圖1中的反射光126,以便傳送光訊號至接收器112。圖1中的雷射分光鏡模組110包括繞射光學元件(diffractive optical element,DOE)206、凹面鏡208及準直鏡組(collimation lens)210。其中,繞射光學元件206具有旋轉或振盪的功能。雷射分光鏡模組110之運作方式,將於下文詳述。
Please refer to FIG. 2A. The receiver lens module 108 in FIG. 1 includes a lens module composed of at least one concave lens 202 and at least one convex lens 204. The concave lens 202 and the convex lens 204 form a condensing lens set that can gather the reflected light 126 in FIG. 1, so as to transmit the optical signal to the receiver 112. The laser spectroscope module 110 in FIG. 1 includes a diffractive optical element (DOE) 206, a concave mirror 208, and a collimation lens set 210, wherein the diffractive optical element 206 has the function of rotating or oscillating. The operation of the laser spectroscope module 110 is described in detail below.
請參閱圖2B,圖1中的雷射光源102可發射多種(例如但不限於四種)不同波長的雷射光221、222、223及224,該雷射光源102包括光耦合器230及光纖240。雷射光221、222、223及224可例如為紅外光,例如波長為850nm、905nm、940nm及1064nm的紅外光。雷射光221、222、223及224可同時或依序發射。光耦合器230將雷射光221、222、223及224光學耦合為單一準直光訊號,經由光纖240向該鏡頭模組傳輸。
Please refer to FIG. 2B. The laser light source 102 in FIG. 1 can emit laser lights 221, 222, 223, and 224 of multiple (for example, but not limited to, four) different wavelengths, and includes an optical coupler 230 and an optical fiber 240. The laser lights 221, 222, 223, and 224 may be, for example, infrared light, such as infrared light with wavelengths of 850 nm, 905 nm, 940 nm, and 1064 nm, and may be emitted simultaneously or sequentially. The optical coupler 230 optically couples the laser lights 221, 222, 223, and 224 into a single collimated optical signal, which is transmitted to the lens module via the optical fiber 240.
依據單狹縫繞射(single-slit diffraction)原理,繞射角θn關聯於狹縫寬度a及波長λ:a sin θn = nλ,n = ±1, ±2, ±3...。請參閱圖2C,當投射距離L固定時,暗紋位置y1, y2, ...及-y1, -y2, ...關聯於繞射角θn。當波長λ為紅外光範圍(約為1000nm之數量級)而狹縫寬度a為mm之數量級時,繞射角θn極小,此時sin θn ≈ tan θn = yn/L,故暗紋位置約為yn ≈ nλL/a。 According to the principle of single-slit diffraction, the diffraction angle θn is related to the slit width a and the wavelength λ: a sin θn = nλ, n = ±1, ±2, ±3.... Referring to FIG. 2C, when the projection distance L is fixed, the dark-fringe positions y1, y2, ... and -y1, -y2, ... are related to the diffraction angle θn. When the wavelength λ is in the infrared range (on the order of 1000 nm) and the slit width a is on the order of millimeters, the diffraction angle θn is extremely small, so that sin θn ≈ tan θn = yn/L and the dark-fringe positions are approximately yn ≈ nλL/a.
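In the small-angle limit, the dark-fringe relation a sin θn = nλ reduces to yn ≈ n·λ·L/a. As an illustrative aid only (not part of the disclosure; the wavelength, slit width, and distance values are example numbers), this can be computed as:

```python
def dark_fringe_position(n: int, wavelength: float, slit_width: float, distance: float) -> float:
    """Small-angle dark-fringe position y_n ≈ n * λ * L / a (valid when λ << a)."""
    return n * wavelength * distance / slit_width

# Example: 905 nm infrared light, 1 mm slit, projected 10 m away
y1 = dark_fringe_position(1, 905e-9, 1e-3, 10.0)
print(f"first dark fringe at ±{y1 * 1e3:.2f} mm")  # ±9.05 mm
```

Because λ (nm scale) is six orders of magnitude smaller than a (mm scale), the fringe spacing stays small even at long projection distances, which is consistent with the extremely small diffraction angle noted above.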
請參閱圖3A,當雷射光302射向繞射光學元件304時,繞射光學元件304會將雷射光302繞射為數千至數萬個光點。該等光點在不同距離外形成點雲(point cloud)306a、306b及306c,其中點雲306a距離繞射光學元件304最近,光點最密集,點雲覆蓋面積最小;而點雲306c距離繞射光學元件304最遠,光點最不密集,點雲覆蓋面積最大。繞射光學元件304可為例如廣州印芯半導體公司(Tyrafos)的HCPDOETM,但本發明不限於此。
Please refer to FIG. 3A. When the laser light 302 is directed at the diffractive optical element 304, the diffractive optical element 304 diffracts the laser light 302 into thousands to tens of thousands of light spots. These light spots form point clouds 306a, 306b, and 306c at different distances, wherein the point cloud 306a is closest to the diffractive optical element 304, has the densest light spots, and covers the smallest area, while the point cloud 306c is farthest from the diffractive optical element 304, has the least dense light spots, and covers the largest area. The diffractive optical element 304 may be, for example, the HCPDOE™ of Tyrafos, but the present invention is not limited thereto.
請參閱圖3B,透過旋轉或振盪,繞射光學元件310a及310b可為繞射光學元件304在不同時間的兩種樣態。由圖3B可知,透過繞射光學元件310a及310b的旋轉及/或往復移動,繞射光學元件310b的光點位置可涵蓋繞射光學元件310a的光點間空隙,反之亦然。因此,在一幀的複數個子幀的每一子幀(subframe)中,繞射光學元件304的光點投射位置會改變。透過融合複數個子幀,可在不增加光點密度的情形下,增加光點照射面積,提升成像解析度。此一方法適合使用於靜態偵測之情境。
Please refer to FIG. 3B. Through rotation or oscillation, the diffractive optical elements 310a and 310b can be two states of the diffractive optical element 304 at different times. As shown in FIG. 3B, through the rotation and/or reciprocating movement of the diffractive optical elements 310a and 310b, the light-spot positions of the diffractive optical element 310b can cover the gaps between the light spots of the diffractive optical element 310a, and vice versa. Therefore, in each of a plurality of subframes of a frame, the light-spot projection positions of the diffractive optical element 304 change. By fusing the plurality of subframes, the illuminated light-spot area can be increased and the imaging resolution improved without increasing the light-spot density. This method is suitable for static detection scenarios.
由於圖3A所示的點雲覆蓋面積正比於距離的平方,故當距離較遠時,點雲覆蓋面積會快速擴大,造成單位面積光能下降,進而導致反射光強度不足。而大幅增加雷射光302的強度可能導致設備壽命下降,且對人眼易造成傷害。因此,請參閱圖4,由至少一凹透鏡202及至少一凸透鏡204組成的焦距可調變透鏡模組可依照測距範圍(detection range)調變視場大小,使不同距離下(例如15公尺、40公尺、100公尺、200公尺及300公尺)的單位面積光能大致相等,防止距離較遠時反射光強度不足的情形。或者,亦可使用複數個固定焦距的透鏡模組,每一透鏡模組包括至少一凹透鏡202及至少一凸透鏡204,並依照測距範圍切換透鏡模組,以調變視場大小。
Since the point-cloud coverage area shown in FIG. 3A is proportional to the square of the distance, at longer distances the coverage area expands rapidly, reducing the optical energy per unit area and thus leading to insufficient reflected light intensity. Greatly increasing the intensity of the laser light 302, however, may shorten device life and easily harm human eyes. Therefore, referring to FIG. 4, a variable-focal-length lens module composed of at least one concave lens 202 and at least one convex lens 204 can adjust the field-of-view size according to the detection range, so that the optical energy per unit area at different distances (for example, 15 m, 40 m, 100 m, 200 m, and 300 m) is approximately equal, preventing insufficient reflected light intensity at longer distances. Alternatively, a plurality of fixed-focal-length lens modules may be used, each including at least one concave lens 202 and at least one convex lens 204, and the lens modules may be switched according to the detection range to adjust the field-of-view size.
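The inverse-square falloff described here can be made concrete with a short sketch (an explanatory aid, not part of the disclosure; the distances are the example values from the text):

```python
def unit_area_energy_ratio(r_near: float, r_far: float) -> float:
    """Ratio of unit-area optical energy at r_far relative to r_near.

    With a fixed divergence angle the illuminated area grows as R^2,
    so the energy per unit area falls as 1/R^2.
    """
    return (r_near / r_far) ** 2

# Moving from 15 m to 300 m leaves only 1/400 of the energy density,
# which is why the field of view must shrink as the detection range grows.
print(unit_area_energy_ratio(15.0, 300.0))  # 0.0025
```

This quantifies why simply raising the laser power is a poor trade: recovering the lost density at 300 m would require a 400-fold intensity increase.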
一種達成圖4所示配置的方式為利用準直鏡組將繞射光的覆蓋面積收束在一定範圍內。透過焦距的調變,準直鏡組可調變出射平行光的發散角,依測距範圍調整投射光點的像場範圍,以達成圖4所示之效果。可使用複數個固定焦距的準直鏡組,並依照測距範圍切換準直鏡組,以調變該像場範圍。或者,亦可使用可變焦距的準直鏡組,並依照測距範圍切換準直鏡組,以調變像場範圍。請參閱圖5A,一種準直鏡組配置為將準直鏡組502設置於可旋轉或振盪的繞射光學元件504的正前方,其中,準直鏡組502的鏡面垂直於雷射光506的入射方向。如圖5A所示,準直鏡組502可將繞射光學元件504射出的繞射光508收束為大致相互平行,使繞射光508在不同距離下的單位面積光能大致維持相等。在一實施例中,準直鏡組502包括凹透鏡5021及凸透鏡5022,其中凹透鏡5021及凸透鏡5022的間距可調變,以控制發散角。
One way to achieve the configuration shown in FIG. 4 is to use a collimator set to converge the coverage area of the diffracted light within a certain range. By adjusting the focal length, the collimator set can adjust the divergence angle of the outgoing parallel light and thereby adjust the field-of-image range of the projected light spots according to the detection range, so as to achieve the effect shown in FIG. 4. A plurality of fixed-focal-length collimator sets may be used and switched according to the detection range to adjust the field-of-image range; alternatively, a variable-focal-length collimator set may be used and adjusted according to the detection range. Please refer to FIG. 5A. One collimator configuration places the collimator set 502 directly in front of the rotatable or oscillating diffractive optical element 504, wherein the lens surface of the collimator set 502 is perpendicular to the incident direction of the laser light 506. As shown in FIG. 5A, the collimator set 502 can converge the diffracted light 508 emitted by the diffractive optical element 504 into approximately parallel beams, so that the optical energy per unit area of the diffracted light 508 remains approximately equal at different distances. In one embodiment, the collimator set 502 includes a concave lens 5021 and a convex lens 5022, and the distance between the concave lens 5021 and the convex lens 5022 is adjustable to control the divergence angle.
請參閱圖5B,另一種準直鏡組配置為將準直鏡組512設置於凹面鏡520的前方,以凹面鏡520收集可旋轉或振盪的繞射光學元件514射出的繞射光。如圖5B所示,繞射光學元件514將雷射光516繞射為多束繞射光518,該等繞射光518透過凹面鏡520向準直鏡組512反射。隨後,準直鏡組512將該等繞射光518收束為大致相互平行,使繞射光518在不同距離下的單位面積光能大致維持相等。在一實施例中,準直鏡組512包括凹透鏡5121及凸透鏡5122,其中凹透鏡5121及凸透鏡5122的間距可調變,以控制發散角。此一配置相較於圖5A所示的配置,可收集到更大角度的繞射光,進而在不增加雷射光強度的情況下,增加投射出的單位面積光能。
Referring to FIG. 5B, another collimator configuration places the collimator set 512 in front of the concave mirror 520, and the concave mirror 520 collects the diffracted light emitted by the rotatable or oscillating diffractive optical element 514. As shown in FIG. 5B, the diffractive optical element 514 diffracts the laser light 516 into a plurality of diffracted beams 518, which are reflected by the concave mirror 520 toward the collimator set 512. The collimator set 512 then converges the diffracted beams 518 into approximately parallel beams, so that the optical energy per unit area of the diffracted beams 518 remains approximately equal at different distances. In one embodiment, the collimator set 512 includes a concave lens 5121 and a convex lens 5122, and the distance between the concave lens 5121 and the convex lens 5122 is adjustable to control the divergence angle. Compared with the configuration shown in FIG. 5A, this configuration can collect diffracted light over a larger angle, thereby increasing the projected optical energy per unit area without increasing the laser intensity.
在車輛自動駕駛的使用情境中,當車輛行駛時,光達系統100可能接收到的干擾訊號包括前方對向車道上車輛的掃描雷射、前方對向車道上車輛的前定向脈衝雷射、前方同向車道上車輛的掃描雷射及前方同向車道上車輛的後定向脈衝雷射等。因此,必須以適當的方法排除該等干擾訊號,以正確測量距離,維護行車安全。 In the use scenario of vehicle automatic driving, when the vehicle is driving, the interference signals that the lidar system 100 may receive include scanning lasers from vehicles in the opposite lane ahead, front directional pulse lasers from vehicles in the opposite lane ahead, scanning lasers from vehicles in the same lane ahead, and rear directional pulse lasers from vehicles in the same lane ahead. Therefore, such interference signals must be eliminated by appropriate methods to accurately measure distance and maintain driving safety.
當圖1的雷射光源102射出脈衝訊號時,為了排除干擾訊號,微控制器101可依據測距範圍,將接收器112啟動或關閉,使接收器112僅接收測距範圍內的反射光訊號。舉例而言,若待測物體在300公尺外,則自雷射光源102射出脈衝訊號至接收器112接收反射光訊號的所需時間為2μs(R=ct/2,其中R為距離,c為光速3×10^8 m/s,t為時間(秒))。因此,可在一個週期時間(cycle time)內,將接收器112與雷射光源102同步啟動,感測時間2μs,而其餘時間關閉,以防止接收干擾訊號。請參閱圖6,雷射光源(TX)以週期時間T射出脈衝寬度(pulse width)PW的脈衝訊號。接收器(RX)在週期時間T內的感測器快門(sensor shutter)時間SS內開啟和重置(reset)時間R內關閉,其中T=SS+R。感測器快門時間SS及重置時間R係依據測距範圍而定。在一個實施例中,當測距範圍為300公尺時,感測器快門時間SS為2μs,重置時間R為2μs,週期時間T為4μs,脈衝寬度PW為100ns。此時,接收器(RX)可接收0至300公尺內的反射光訊號,而理論圖框率(掃描次數)可高達1/T=2.5×10^5 f/s。
When the laser light source 102 in FIG. 1 emits a pulse signal, in order to exclude interference signals, the microcontroller 101 can turn the receiver 112 on or off according to the detection range, so that the receiver 112 only receives reflected light signals within the detection range. For example, if the object to be measured is 300 meters away, the time required from the laser light source 102 emitting the pulse signal to the receiver 112 receiving the reflected light signal is 2 μs (R = ct/2, where R is the distance, c is the speed of light 3×10^8 m/s, and t is the time in seconds). Therefore, within one cycle time, the receiver 112 can be activated synchronously with the laser light source 102, sensing for 2 μs and remaining off for the rest of the time, so as to avoid receiving interference signals. Please refer to FIG. 6. The laser light source (TX) emits a pulse signal of pulse width PW with a cycle time T. The receiver (RX) is open during the sensor shutter time SS and closed during the reset time R within the cycle time T, where T = SS + R. The sensor shutter time SS and the reset time R are determined by the detection range. In one embodiment, when the detection range is 300 meters, the sensor shutter time SS is 2 μs, the reset time R is 2 μs, the cycle time T is 4 μs, and the pulse width PW is 100 ns. The receiver (RX) can then receive reflected light signals from 0 to 300 meters, and the theoretical frame rate (number of scans) can be as high as 1/T = 2.5×10^5 f/s.
請參閱圖7,除了測距範圍上限以外,亦可透過調整感測器快門時間SS,對接收器(RX)設定測距範圍下限。在圖7中,感測器快門時間SS的開始時間Ts係依據測距範圍下限而定,而結束時間Tl係依據測距範圍上限而定。在一個實施例中,當測距範圍為90至300公尺時,開始時間Ts為600ns,結束時間Tl為2μs,感測器快門時間SS為1400ns,重置時間R為2μs,週期時間T為4μs,脈衝寬度PW為100ns。此時,接收器(RX)可接收90至300公尺內的反射光訊號,而理論圖框率為1/T=2.5×10^5 f/s。 Please refer to FIG. 7. In addition to the upper limit of the detection range, a lower limit can also be set for the receiver (RX) by adjusting the sensor shutter time SS. In FIG. 7, the start time Ts of the sensor shutter time SS is determined by the lower limit of the detection range, and the end time Tl by the upper limit. In one embodiment, when the detection range is 90 to 300 meters, the start time Ts is 600 ns, the end time Tl is 2 μs, the sensor shutter time SS is 1400 ns, the reset time R is 2 μs, the cycle time T is 4 μs, and the pulse width PW is 100 ns. The receiver (RX) can then receive reflected light signals from 90 to 300 meters, and the theoretical frame rate is 1/T = 2.5×10^5 f/s.
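Extending the same round-trip relation to a lower range limit (again an illustrative sketch, not the claimed implementation) reproduces the Ts, Tl, and SS values quoted in the paragraph above:

```python
C = 3.0e8  # speed of light in m/s

def shutter_window(r_min_m, r_max_m):
    """Return (Ts, Tl, SS): shutter start, shutter end, and shutter duration."""
    ts = 2.0 * r_min_m / C   # echo from the nearest target of interest
    tl = 2.0 * r_max_m / C   # echo from the farthest target of interest
    return ts, tl, tl - ts

ts, tl, ss = shutter_window(90.0, 300.0)
print(f"Ts = {ts * 1e9:.0f} ns, Tl = {tl * 1e6:.1f} us, SS = {ss * 1e9:.0f} ns")
# Ts = 600 ns, Tl = 2.0 us, SS = 1400 ns
```

Any echo arriving before Ts or after Tl falls outside the shutter window and is rejected, which is how the near and far range limits suppress out-of-range interference.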
請參閱圖8,在一幀包括複數個(至少三個,例如六個)子幀的環境影像中,方法800透過取得複數個子幀的環境影像,比較每一子幀的複數個取樣區塊的平均距離值,該微控制器融合該複數個子幀的環境影像的該等像素的距離值成為該幀的一最終距離值。在步驟802中,雷射光源依序發射複數個雷射光訊號,以取得複數個子幀的環境影像,其中,在一幀的一子幀的一感測器快門時間內,該接收器的複數個像素接收該等不同波長的雷射光的至少一反射光訊號。在每一子幀中,複數個不同波長的雷射光可同時出射並被耦合為單一光訊號,亦可分別依序出射。在步驟804中,以該等反射光訊號所代表的距離值(依據反射光時間的飛行時間(time of flight,ToF)可求得距離值)為該子幀的該等像素的距離值。在包括複數個取樣區塊的環境影像中,對該等子幀的該複數個取樣區塊的平均距離值進行批量比較(詳見下文表1、表2及表3)。在步驟806中,依據步驟804的批量比較的結果,該微控制器汰除異常子幀,並融合正常子幀成為該幀的最終距離值。其中,「融合」可以取平均值、疊加、擇一或其他方式進行。
Please refer to FIG. 8. In an environmental image in which one frame includes a plurality of (at least three, for example six) subframes, the method 800 obtains the environmental images of the plurality of subframes and compares the average distance values of a plurality of sampling blocks of each subframe, and the microcontroller fuses the distance values of the pixels of the environmental images of the plurality of subframes into a final distance value of the frame. In step 802, the laser light source sequentially emits a plurality of laser light signals to obtain the environmental images of the plurality of subframes, wherein, within a sensor shutter time of a subframe of a frame, the plurality of pixels of the receiver receive at least one reflected light signal of the laser lights of different wavelengths. In each subframe, the laser lights of different wavelengths may be emitted simultaneously and coupled into a single optical signal, or emitted sequentially. In step 804, the distance values represented by the reflected light signals (obtainable from the time of flight (ToF) of the reflected light) are taken as the distance values of the pixels of the subframe. In an environmental image including a plurality of sampling blocks, the average distance values of the sampling blocks of the subframes are compared in batches (see Tables 1, 2, and 3 below). In step 806, based on the result of the batch comparison in step 804, the microcontroller eliminates abnormal subframes and fuses the normal subframes into the final distance value of the frame, where "fusion" may be performed by averaging, superposition, selecting one value, or other methods.
請參見圖9A及圖9B,如前文所述,在動態偵測的情境下,不同波長的雷射光可同時或依序發射。圖9A顯示不同波長的雷射光同時發射的情形。在一幀包括六個子幀的範例中,在每一子幀中,雷射光源(TX)以週期時間T同時發射不同波長的雷射光901、902及903,而接收器(RX)在感測器快門時間SS內開啟,接收雷射光901、902及903的反射光訊號,並將該等反射光訊號交由微控制器計算融合後的子距離值。在圖9A所示的範例中,第五子幀的反射光訊號在感測器快門時間SS中的位置與其餘子幀不同,故微控制器可汰除第五子幀的子距離值,再將其餘子幀的子距離值融合,成為該幀的最終距離值。圖9B顯示不同波長的雷射光依序發射的情形。在一幀包括六個子幀的範例中,在每一子幀中,雷射光源(TX)以週期時間T依序發射不同波長的雷射光911、912及913,而接收器(RX)在感測器快門時間SS內開啟,接收雷射光911、912或913的反射光訊號,並將該等反射光訊號交由微控制器計算該等反射光訊號所代表的子距離值。在第一子幀中,雷射光911發射。在第二子幀中,雷射光912發射。在第三子幀中,雷射光913發射。在第四子幀中,雷射光911發射。在第五子幀中,雷射光912發射。在第六子幀中,雷射光913發射。如此,微控制器可計算每一子幀的子距離值,其中,第一子幀及第四子幀關聯於雷射光911,第二子幀及第五子幀關聯於雷射光912,第三子幀及第六子幀關聯於雷射光913。在圖9B所示的範例中,第五子幀的反射光訊號在感測器快門時間SS中的位置與其餘子幀不同(亦即飛行時間不同),因此第五子幀的平均距離值會與其餘子幀不同,故微控制器可汰除第五子幀的子距離值,再將其餘子幀的子距離值融合,成為該幀的最終距離值。 Please refer to FIG. 9A and FIG. 9B. As mentioned above, in the case of dynamic detection, laser lights of different wavelengths can be emitted simultaneously or sequentially. FIG. 9A shows the situation where laser lights of different wavelengths are emitted simultaneously. In an example where a frame includes six subframes, in each subframe, the laser light source (TX) emits laser lights 901, 902, and 903 of different wavelengths simultaneously with a cycle time T, and the receiver (RX) is turned on within the sensor shutter time SS to receive the reflected light signals of the laser lights 901, 902, and 903, and the reflected light signals are sent to the microcontroller to calculate the fused sub-range value. In the example shown in FIG9A , the position of the reflected light signal of the fifth subframe in the sensor shutter time SS is different from that of the remaining subframes, so the microcontroller can eliminate the sub-distance value of the fifth subframe and then merge the sub-distance values of the remaining subframes to form the final distance value of the frame. FIG9B shows the situation where laser lights of different wavelengths are emitted in sequence. 
In an example where a frame includes six subframes, in each subframe, a laser light source (TX) sequentially emits laser lights 911, 912, and 913 of different wavelengths with a cycle time T, and a receiver (RX) is turned on within a sensor shutter time SS to receive reflected light signals of laser lights 911, 912, or 913, and the reflected light signals are sent to a microcontroller to calculate the sub-range values represented by the reflected light signals. In the first subframe, laser light 911 is emitted. In the second subframe, laser light 912 is emitted. In the third subframe, laser light 913 is emitted. In the fourth subframe, laser light 911 is emitted. In the fifth subframe, laser light 912 is emitted. In the sixth subframe, laser light 913 is emitted. In this way, the microcontroller can calculate the sub-range value of each subframe, wherein the first and fourth subframes are associated with laser light 911, the second and fifth subframes are associated with laser light 912, and the third and sixth subframes are associated with laser light 913. In the example shown in FIG9B , the position of the reflected light signal of the fifth subframe in the sensor shutter time SS is different from that of the remaining subframes (i.e., the flight time is different), so the average range value of the fifth subframe will be different from that of the remaining subframes, so the microcontroller can eliminate the sub-range value of the fifth subframe, and then merge the sub-range values of the remaining subframes to become the final range value of the frame.
請參閱圖9C,如前文所述,在靜態偵測的情境下,透過將繞射光學元件304設置為可動件,具有旋轉及/或往復移動的功能,在一幀的複數個子幀的每一子幀中,繞射光學元件304的光點投射位置會改變。在圖9C中,在複數個旋轉角度或往復移動位置條件下,取複數子幀環境影像,雷射光921的光點投射位置在每一子幀中會改變。每一反射光訊號在環境影像中的每一個像素代表一個子距離值,每一子幀環境影像由複數個子距離值組成具有深度資訊的三維影像。隨後,微控制器在汰除異常子幀後,再將其餘複數子幀環境影像融合,若同一像素具複數個距離值,則進行平均或擇一;若同一像素僅具有一個距離值,則選擇該子距離值;若該像素不具距離值,則選擇測距範圍中的最大值(例如500公尺)或最小值(例如0),計算該幀的三維影像的最終距離值。舉例而言,在一個實施例中,下列表1A~表1F分別代表同一取樣區塊在同一幀內的第一子幀至第六子幀,其中每一個小方格代表一個像素,有數值者代表該子幀的該像素測得的子距離值,而無數值者代表無子距離值。對於每一子幀,將具有子距離值的像素取平均距離值,並根據每一子幀的平均距離值汰除異常子幀。由表1A~表1F可知,第六子幀的平均距離值明顯不同於其餘子幀,故將第六子幀視為異常值而汰除。隨後,如表1G所示,將正常子幀(第一至第五子幀,即表1A至表1E)的每一像素疊合,再將疊合後具有子距離值的每一像素取平均值,作為此一取樣區塊在此幀的最終距離值。在疊合時,若同一像素具有複數個子距離值,則進行平均或選擇最大值,若該像素不具子距離值,則選擇測距範圍中的最小值(例如0)。或者,若同一像素具有複數個子距離值,則進行平均或選擇最小值,若該像素不具子距離值,則選擇測距範圍中的最大值(例如500或1000)。在圖9C所示的範例中,第五子幀的子距離值與其餘子幀不同,故微控制器可汰除第五子幀的子距離值,再將其餘子幀融合,計算該幀的最終距離值。
Please refer to FIG. 9C. As described above, in the static detection scenario, by setting the diffractive optical element 304 as a movable part with the function of rotation and/or reciprocating movement, the light-spot projection positions of the diffractive optical element 304 change in each of the plurality of subframes of a frame. In FIG. 9C, a plurality of subframe environmental images are obtained under a plurality of rotation angles or reciprocating positions, and the light-spot projection positions of the laser light 921 change in each subframe. Each pixel of each reflected light signal in the environmental image represents a sub-distance value, and each subframe environmental image is a three-dimensional image with depth information composed of a plurality of sub-distance values. Then, after eliminating abnormal subframes, the microcontroller fuses the remaining subframe environmental images: if the same pixel has a plurality of sub-distance values, they are averaged or one is selected; if the same pixel has only one sub-distance value, that value is selected; if the pixel has no distance value, the maximum (for example, 500 meters) or minimum (for example, 0) of the detection range is selected, and the final distance value of the three-dimensional image of the frame is calculated. For example, in one embodiment, Tables 1A to 1F respectively represent the first to sixth subframes of the same sampling block within the same frame, where each small cell represents a pixel, a cell with a value represents the sub-distance value measured by that pixel in that subframe, and a cell without a value represents no sub-distance value. For each subframe, the pixels with sub-distance values are averaged, and abnormal subframes are eliminated according to the average distance value of each subframe. As shown in Tables 1A to 1F, the average distance value of the sixth subframe is obviously different from the other subframes, so the sixth subframe is regarded as an outlier and eliminated. Then, as shown in Table 1G, each pixel of the normal subframes (the first to fifth subframes, i.e., Tables 1A to 1E) is superimposed, and each superimposed pixel with sub-distance values is averaged as the final distance value of this sampling block in this frame. During superposition, if the same pixel has a plurality of sub-distance values, they are averaged or the maximum is selected, and if the pixel has no sub-distance value, the minimum of the detection range (for example, 0) is selected; alternatively, if the same pixel has a plurality of sub-distance values, they are averaged or the minimum is selected, and if the pixel has no sub-distance value, the maximum of the detection range (for example, 500 or 1000) is selected. In the example shown in FIG. 9C, the sub-distance value of the fifth subframe is different from the other subframes, so the microcontroller can eliminate the sub-distance value of the fifth subframe, then fuse the remaining subframes and calculate the final distance value of the frame.
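The per-pixel fusion rule described above can be sketched as follows. This is a hypothetical illustration, not the claimed implementation: `None` marks a pixel with no sub-distance value, and the averaging and range-limit fallback choices are the ones named in the text.

```python
def fuse_pixel(sub_values, range_min=0.0, range_max=500.0, fill_with_max=True):
    """Fuse one pixel's sub-distance values from the surviving subframes.

    - several values -> average them
    - exactly one    -> keep it
    - none           -> fall back to the range maximum (or minimum)
    """
    values = [v for v in sub_values if v is not None]
    if len(values) > 1:
        return sum(values) / len(values)
    if len(values) == 1:
        return values[0]
    return range_max if fill_with_max else range_min

# One pixel observed in five surviving subframes of a frame
print(fuse_pixel([120.0, None, 118.0, None, 122.0]))  # -> 120.0
print(fuse_pixel([None, None, None, None, None]))     # -> 500.0
```

The choice between the maximum and minimum fallback mirrors the two alternatives in the text: filling empty pixels with 0 treats them as "no return", while filling with the range maximum treats them as "no obstacle within range".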
圖10A為真實的環境影像。為了提升運算效率,並不需要將整張影像上的每一個像素皆進行距離測量,而可取樣數個區塊進行距離測量。每一取樣區塊中包括複數個像素,例如10×10像素。被取樣的像素不宜太多,例如不多於總像素數的10%,以提升運算效率。圖10B顯示取樣二個區塊的實施例。圖10C顯示取樣五個區塊的實施例。圖10D顯示取樣九個區塊的實施例。取樣區塊數不宜少於五個,以較佳地掌握環境訊息。在一個正常的子幀中,具有正常距離值的取樣區塊多於一特定比例(例如80%或88.9%,其中80%指取樣五個區塊時,容許一個取樣區塊具有異常的距離值,而88.9%指取樣九個區塊時,容許一個取樣區塊具有異常的距離值),否則視為異常子幀。 FIG. 10A is a real environmental image. To improve computational efficiency, it is not necessary to measure the distance of every pixel in the entire image; instead, several blocks can be sampled for distance measurement. Each sampling block includes a plurality of pixels, for example 10×10 pixels. The number of sampled pixels should not be too large, for example no more than 10% of the total number of pixels, to improve computational efficiency. FIG. 10B shows an embodiment sampling two blocks, FIG. 10C an embodiment sampling five blocks, and FIG. 10D an embodiment sampling nine blocks. The number of sampling blocks should not be fewer than five, so as to better capture environmental information. A subframe is considered normal if the proportion of sampling blocks with normal distance values exceeds a certain ratio (for example 80% or 88.9%, where 80% means that when five blocks are sampled, one block is allowed to have an abnormal distance value, and 88.9% means that when nine blocks are sampled, one block is allowed to have an abnormal distance value); otherwise it is considered an abnormal subframe.
請參閱圖11A至圖11F,在一幀包括六個子幀的實施例中,該六個子幀依序為圖11A、圖11B、圖11C、圖11D、圖11E及圖11F。其中,第二子幀(圖11B)及第六子幀(圖11F)有外光侵入。為了有效濾除受干擾的子幀,可將每一子幀中各取樣區塊內的每一像素所測得的距離值融合,作為該子幀該取樣區塊的子距離值,再將同一取樣區塊在六個子幀內的六個子距離值相互比較,以濾除異常值。在一個實施例中,濾除異常值的方式為計算同一取樣區塊在六個子幀內的的六個子距離值的平均值(μ)、標準差(σ)、上閾值及下閾值,其中上閾值為平均值加上數個標準差(μ+nσ),下閾值為平均值減去數個標準差(μ-nσ),其中n的大小係依據實驗數據及實際需求而定,且可為整數或非整數,例如(但不限於)1或1.5。在下文的表2、3及4所示之實施例中,係以n=1為例進行說明,但本發明不以此為限。隨後,汰除該等子距離值中,大於上閾值或小於下閾值的子幀,並融合其餘子距離值相近的子幀,作為該該幀的最終距離值。 Please refer to FIG. 11A to FIG. 11F. In an embodiment in which a frame includes six subframes, the six subframes are FIG. 11A, FIG. 11B, FIG. 11C, FIG. 11D, FIG. 11E and FIG. 11F in sequence. Among them, the second subframe (FIG. 11B) and the sixth subframe (FIG. 11F) have external light intrusion. In order to effectively filter out the interfered subframes, the distance values measured by each pixel in each sampling block in each subframe can be fused as the sub-distance value of the sampling block in the subframe, and then the six sub-distance values of the same sampling block in the six subframes are compared with each other to filter out abnormal values. In one embodiment, the method of filtering out abnormal values is to calculate the average value (μ), standard deviation (σ), upper threshold and lower threshold of the six sub-range values in the same sampling block within six sub-frames, wherein the upper threshold is the average value plus a number of standard deviations (μ+nσ), and the lower threshold is the average value minus a number of standard deviations (μ-nσ), wherein the size of n is determined according to experimental data and actual needs, and can be an integer or non-integer, such as (but not limited to) 1 or 1.5. In the embodiments shown in Tables 2, 3 and 4 below, n=1 is used as an example for explanation, but the present invention is not limited thereto. 
Then, the subframes with sub-distance values greater than the upper threshold or less than the lower threshold are eliminated, and the remaining subframes with similar sub-distance values are merged as the final distance value of the frame.
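The μ ± nσ culling rule above can be sketched as follows. This is illustrative only; the use of the population standard deviation and n = 1 are assumptions chosen to match the n = 1 examples in the tables below.

```python
import statistics

def cull_outlier_subframes(sub_values, n=1.0):
    """Keep only sub-distance values inside [mu - n*sigma, mu + n*sigma]."""
    mu = statistics.fmean(sub_values)
    sigma = statistics.pstdev(sub_values)  # population std dev (assumption)
    lower, upper = mu - n * sigma, mu + n * sigma
    return [v for v in sub_values if lower <= v <= upper]

# Six sub-distance values of one sampling block; the sixth is corrupted by
# external light, falls below mu - sigma, and is culled.
kept = cull_outlier_subframes([200.0, 201.0, 199.0, 200.0, 202.0, 40.0])
print(kept)  # -> [200.0, 201.0, 199.0, 200.0, 202.0]
```

Because a single corrupted subframe inflates σ around an otherwise tight cluster, it lands outside the ±nσ band while the consistent subframes survive, which is the behavior the tables below illustrate.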
表2、3及4顯示可能的感測結果。在表2所示之範例中,在第一子幀時,取樣區塊A的前方無障礙物。此時,取樣區塊A的距離視為最遠距離(例如500公尺)。在第二子幀時,取樣區塊A有外光侵入,而在第六子幀時,取樣區塊A、B、C、D、E皆有外光侵入。此時,如表2所示,取樣區塊A的第二子幀及第六子幀的距離值低於下閾值,故應視為異常值而濾除。而取樣區塊B、C、D、E的第六子幀的距離值皆低於個別下閾值,故應視為異常值而濾除。 Tables 2, 3, and 4 show possible sensing results. In the example shown in Table 2, in the first subframe, there is no obstacle in front of sampling block A. At this time, the distance of sampling block A is considered to be the farthest distance (e.g., 500 meters). In the second subframe, there is external light intrusion into sampling block A, and in the sixth subframe, there is external light intrusion into sampling blocks A, B, C, D, and E. At this time, as shown in Table 2, the distance values of the second and sixth subframes of sampling block A are lower than the lower threshold value, so they should be considered as abnormal values and filtered out. The distance values of the sixth subframes of sampling blocks B, C, D, and E are all lower than the individual lower threshold values, so they should be considered as abnormal values and filtered out.
在表3所示之範例中,在第四子幀時,取樣區塊A有外光侵入,且測得的距離十分接近正常值,而在第六子幀時,取樣區塊A、B、C、D、E皆有外光侵入。此時,如表3所示,取樣區塊A的第四子幀的距離值高於上閾值,第六子幀的距離值低於下閾值,故應視為異常值而濾除。而取樣區塊B、C、D、E的第六子幀的距離值皆低於個別下閾值,故應視為異常值而濾除。由此,儘管取樣區塊A在第四子幀及第六子幀測得的距離十分接近正常值,該二子幀仍可被正確識別為異常值而濾除。 In the example shown in Table 3, in the fourth subframe, there is external light intrusion in sampling block A, and the measured distance is very close to the normal value, while in the sixth subframe, there is external light intrusion in sampling blocks A, B, C, D, and E. At this time, as shown in Table 3, the distance value of the fourth subframe of sampling block A is higher than the upper threshold value, and the distance value of the sixth subframe is lower than the lower threshold value, so it should be considered as an abnormal value and filtered out. The distance values of the sixth subframes of sampling blocks B, C, D, and E are all lower than the individual lower threshold values, so they should be considered as abnormal values and filtered out. Therefore, even though the distances measured by sampling block A in the fourth and sixth subframes are very close to normal values, the two subframes can still be correctly identified as abnormal values and filtered out.
在表4所示之範例中,在第六子幀時,取樣區塊B、C、D、E有外光侵入。此時,如表4所示,取樣區塊B、C、D、E的第六子幀的距離值皆低於個別下閾值,故應視為異常值而濾除。 In the example shown in Table 4, in the sixth subframe, external light intrudes into sampling blocks B, C, D, and E. At this time, as shown in Table 4, the distance values of the sixth subframe of sampling blocks B, C, D, and E are all lower than the individual lower threshold values, so they should be considered as abnormal values and filtered out.
以上所述者僅為用以解釋本發明的較佳實施例,並非企圖據以對本發明做任何形式上的限制,是以,凡有在相同的發明精神下所作有關本發明的任何修飾或變更,皆仍應包括在本發明意圖保護的範疇。 The above is only used to explain the preferred embodiment of the present invention, and is not intended to limit the present invention in any form. Therefore, any modification or change of the present invention made under the same spirit of the invention should still be included in the scope of protection intended by the present invention.
100:光達系統 100: LiDAR system
101:微控制器 101:Microcontroller
102:雷射光源 102:Laser light source
104:雷射光 104: Laser light
106:鏡頭模組 106: Lens module
108:接收器鏡頭模組 108: Receiver lens module
110:雷射分光鏡模組 110: Laser spectrometer module
112:接收器 112: Receiver
120:目標 120: Target
122:像場 122: Image field
124:視場 124: Field of view
126:反射光 126: Reflected light
Claims (12)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/230,334 US20240045068A1 (en) | 2022-08-05 | 2023-08-04 | LiDAR SYSTEM AND RESOLUSION IMPROVEMENT METHOD THEREOF |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202263395347P | 2022-08-05 | 2022-08-05 | |
| US63/395,347 | 2022-08-05 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| TW202407380A TW202407380A (en) | 2024-02-16 |
| TWI858694B true TWI858694B (en) | 2024-10-11 |
Family
ID=89753737
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| TW112117055A TWI858694B (en) | 2022-08-05 | 2023-05-08 | Lidar system and resolusion improvement method thereof |
| TW112117043A TWI876338B (en) | 2022-08-05 | 2023-05-08 | Lidar system and crosstalk reduction method thereof |
Country Status (2)
| Country | Link |
|---|---|
| CN (2) | CN117518185A (en) |
| TW (2) | TWI858694B (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| TW201643463A (en) * | 2015-04-07 | 2016-12-16 | 光電波有限公司 | Small light system |
| US20200096613A1 (en) * | 2016-06-02 | 2020-03-26 | Academy Of Opto-Electronics ,Chinese Academy Of Sciences | Lidar system based on visible-near infrared-shortwave infrared light bands |
| US20200200874A1 (en) * | 2016-04-22 | 2020-06-25 | OPSYS Tech Ltd. | Multi-Wavelength LIDAR System |
| WO2020182591A1 (en) * | 2019-03-08 | 2020-09-17 | Osram Gmbh | Component for a lidar sensor system, lidar sensor system, lidar sensor device, method for a lidar sensor system and method for a lidar sensor device |
| EP3796046A1 (en) * | 2016-12-30 | 2021-03-24 | XenomatiX NV | System for characterizing surroundings of a vehicle |
| CN108027425B (en) * | 2015-09-18 | 2022-04-08 | Robert Bosch GmbH | Laser radar sensor |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6764863B2 (en) * | 2015-07-03 | 2020-10-07 | Panasonic Semiconductor Solutions Co., Ltd. | Distance measuring device |
| WO2020047248A1 (en) * | 2018-08-29 | 2020-03-05 | Sense Photonics, Inc. | Glare mitigation in lidar applications |
| US11287517B2 (en) * | 2019-04-19 | 2022-03-29 | Sense Photonics, Inc. | Single frame distance disambiguation |
| US20220334253A1 (en) * | 2019-10-01 | 2022-10-20 | Sense Photonics, Inc. | Strobe based configurable 3d field of view lidar system |
| US11977169B2 (en) * | 2020-08-24 | 2024-05-07 | Innoviz Technologies Ltd. | Multi-beam laser emitter with common optical path |
| US11450103B2 (en) * | 2020-10-05 | 2022-09-20 | Crazing Lab, Inc. | Vision based light detection and ranging system using dynamic vision sensor |
- 2023-05-08 CN CN202310513142.7A patent/CN117518185A/en active Pending
- 2023-05-08 TW TW112117055A patent/TWI858694B/en active
- 2023-05-08 CN CN202310510239.2A patent/CN117518184A/en active Pending
- 2023-05-08 TW TW112117043A patent/TWI876338B/en active
Also Published As
| Publication number | Publication date |
|---|---|
| TW202407379A (en) | 2024-02-16 |
| CN117518185A (en) | 2024-02-06 |
| TW202407380A (en) | 2024-02-16 |
| CN117518184A (en) | 2024-02-06 |
| TWI876338B (en) | 2025-03-11 |