GB2374743A - Surface profile measurement - Google Patents
- Publication number
- GB2374743A GB2374743A GB0108497A GB0108497A GB2374743A GB 2374743 A GB2374743 A GB 2374743A GB 0108497 A GB0108497 A GB 0108497A GB 0108497 A GB0108497 A GB 0108497A GB 2374743 A GB2374743 A GB 2374743A
- Authority
- GB
- United Kingdom
- Prior art keywords
- light
- light source
- detector
- array
- imaged
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/487—Extracting wanted echo signals, e.g. pulse detection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/87—Combinations of systems using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/51—Display arrangements
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
- G05D1/024—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0248—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means in combination with a laser
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/027—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising intertial navigation means, e.g. azimuth detector
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0272—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Radar, Positioning & Navigation (AREA)
- Computer Networks & Wireless Communication (AREA)
- Optics & Photonics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
An imaging system scans a pulsed laser light source 40 over an object 12 and receives the reflected pulse signal in a stationary two-dimensional (2D) multi-region light detector (array) 70, 54. The time of flight (TOF) of the signal 56 is calculated to give the distance to the object and hence allows a three-dimensional (3D) image of the object to be constructed 18. The light source scanning mechanism 42, 44 is synchronised 52 so that different regions of the object are scanned and imaged onto different regions of the detector 70, where the different regions of the detector can be actuated separately 52. Alternatively, there may be a plurality of light sources rather than a single scanned source. The object may be scanned a second time and only the regions of the array that were illuminated in the first scan are used to determine the time of flight.
Description
SURFACE PROFILE MEASUREMENT
The invention relates to apparatus for measuring the surface profile of a sample, and in particular relates to non-contact 3D surface profile measurement systems.
The non-contact measurement of three-dimensional (3D) objects to extract data regarding their physical form and location in space is a topic which has excited much research. Many techniques have been developed to suit the distance to the object, the precision with which the object's features need to be measured and so on.
One common technique is to illuminate the remote object with a light source of known pattern; so called "structured illumination". A camera is located some distance away from the structured light source and arranged to collect an image of the projection of the structured light pattern onto the surface of the remote object. Figure 1 shows this known profile measurement using structured lighting.
A structured light source 10 projects light onto the object 12, and the reflected light is captured by a camera 14 having a field of view 15 which covers the object 12. An image processor 16 derives the 3D profile data 18 by analysis of the deformation of the structured light pattern, which is representative of the distance to the remote object.
The 3D nature of the remote object causes the structured light pattern in the image captured by the camera to be deformed. From knowledge of the physical separation of the camera from the light source (the baseline 20) and trigonometry, it is possible to compute the 3D surface profile of the remote object from the deformation of the image of the structured light pattern.
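To make the trigonometric relationship concrete, the sketch below (not part of the patent) shows a minimal depth calculation for a simplified pinhole-camera model, assuming the pattern deformation is measured as a pixel disparity and the focal length is known in pixels; the function name and the example values are illustrative assumptions only.

```python
def depth_from_triangulation(baseline_m, focal_px, disparity_px):
    """Estimate the distance to a surface point from the lateral shift
    (disparity) of the projected pattern, using the standard triangulation
    relation Z = f * B / d for a simplified pinhole model."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: 0.2 m baseline, 800 px focal length, 40 px pattern shift -> 4 m
print(depth_from_triangulation(0.2, 800, 40.0))
```

As the relation shows, for a fixed sensor resolution the depth accuracy falls as the baseline shrinks, which is why the baseline and range limitations discussed next arise.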
A benefit of this system is that if a video camera is used to capture the image, then with sufficient processing, the 3D surface profile can be measured at video frame rates (50 or 60 times per second). As this system works on trigonometric principles, the depth accuracy is related to the baseline of the system and the resolution of the sensor. As a result, practical considerations tend to limit the application of such systems to remote objects no more than a few metres away. In addition, as the baseline is lengthened, the angle between the incident light pattern and the line of sight of the camera becomes more obtuse and shadows cast by features in the remote object can obscure the 3D profile measurement.
Alternative techniques rely on scanning a device which measures the distance to a remote point; e.g. a laser rangefinder (LRF) or interferometer, using a remote controlled pan and tilt unit. Such a system is shown in Figure 2.
A distance measuring device such as a laser rangefinder 30 measures the distance to a single point on the object, and is controlled by a pan-and-tilt unit 32. Control electronics 34 under the control of a computer cause the pan-and-tilt unit 32 to scan the line of sight of the distance measuring device 30 across the object to be measured to build up a 3D matrix of the azimuth, elevation and distance to the remote object. This matrix of numbers represents the 3D surface profile of the remote object measured in polar co-ordinates from the axis of rotation of the pan-and-tilt unit. Some corrections may be applied to the data if the axes of rotation of the pan-and-tilt unit are not co-incident with the line of sight of the measuring device. The resultant 3D surface profile 18 can be transformed to other co-ordinate systems using known mathematical techniques. This technique is analogous to radar systems where the time of flight of a scanned microwave radio signal is used to determine the distance to one or more remote objects.
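As an illustration of the co-ordinate transformation mentioned above, the following sketch converts one (azimuth, elevation, range) sample into Cartesian co-ordinates; the axis convention (y up, z along the boresight) and the function name are assumptions for illustration, not details from the patent.

```python
import math

def polar_to_cartesian(azimuth_rad, elevation_rad, distance_m):
    """Convert an (azimuth, elevation, range) sample measured about the
    pan/tilt axes into Cartesian (x, y, z), assuming y is up and z points
    along the zero-azimuth, zero-elevation line of sight."""
    x = distance_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    y = distance_m * math.sin(elevation_rad)
    z = distance_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    return x, y, z

print(polar_to_cartesian(0.1, 0.05, 25.0))  # one illustrative sample
```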
This scanned LRF approach does not suffer from the range limitations of the structured light approach, but is relatively expensive to implement because high precision and costly components are required for good accuracy. In addition, because the whole mass of the distance measuring device is scanned, achieving scanning at rates sufficient to give "real time" 3D image data is problematic.
To overcome this difficulty, systems using a single scanned mirror in front of the LRF have been built. This enables the LRF to remain static. However, the mirror must be sufficiently large to encompass both the laser beam and lens aperture of the receiving system without vignetting or optical crosstalk between the emitted laser beam and received beam, and so such mirror scanned systems still remain expensive.
Hybrid approaches, where the distance measuring device is scanned along a line in one direction and the remote object is rotated, have been used to create a 3D map of the complete surface of the remote object. However, these techniques are not appropriate for all targets.
According to the invention, there is provided an imaging system comprising: a light source; means for scanning the light from the light source over an object to be imaged; stationary receiving optics for receiving light reflected from the object to be imaged; a photodiode light detector for detecting light received from the receiving optics; and processing means for measuring the time of flight of light signals from the light source to the detector for all scanning directions.
The direction of scanning and the associated measured time of flight can be used to build up a three dimensional image of the object. The arrangement of the invention provides a simple structure with few moving parts and using a low cost photodiode detector (or photodiode array) as the detector. This avoids the need for complicated image processing software. The light source may be pulsed, so that each pulse represents a different scanning direction.
Alternatively, the light source output may provide a modulated series of pulses rather than a single pulse, and cross correlation can then be used to calculate time delays.
The means for scanning may comprise a reflector for directing the light from the light source to an object to be imaged and a drive mechanism for controlling movement of the reflector for scanning the light from the light source over the object to be imaged. This enables the light source to be fixed.
The light detector may comprise a photodiode array, and wherein different regions of the object to be imaged are imaged onto different regions of the array by the receiving optics.
The photodiode array may be operable in two modes; a first mode in which light signals from all photodiodes in the array are read out in sequence, and a second mode in which light signals from a selected photodiode or photodiodes in the array are read out.
According to a second aspect of the invention, there is provided an imaging system comprising: a plurality of light sources, each light source being directed to an object to be imaged; stationary receiving optics for receiving light reflected from the object to be imaged; a light detector for detecting light received from the receiving optics, wherein each light source is imaged by the receiving optics onto a different region of the light detector, wherein the different regions of the light detector can be actuated separately; and processing means for measuring the time of flight of light pulses from the light source to the detector.
The invention also provides a method of obtaining an image of an object, comprising: scanning a light source signal over the object by directing a light source output in a plurality of scanning directions in sequence, and detecting reflected light received from the object using a two dimensional light detector array; determining the regions of the light detector array which are illuminated for each scanning direction; scanning the light source signal over the object again, and detecting reflected light received from the object using only the determined regions of the light detector array; calculating the time of flight of light pulses from the light source to the detector for each scanning direction; and obtaining a 3D profile from the time of flight calculations.
Examples of the invention will now be described in detail with reference to the accompanying drawings, in which: Figure 1 shows a first known 3D imaging system; Figure 2 shows a second known 3D imaging system; Figure 3 shows an imaging system of the invention; Figure 4 shows a simplified example of the photodiode array for use in the system of Figure 3; Figure 5 is used to explain a modification to the system of Figure 3; Figure 6 shows a modification to the photodiode array; and Figure 7 shows how pulsed light sources can be used in the imaging system of the invention.

The invention provides a 3D imaging system with a stationary receiving optical system.
One implementation of the invention is illustrated in Figure 3.
A sequentially pulsed laser beam output from a laser 40 is scanned across the remote object 12. The scanning is achieved either by scanning the laser itself, or preferably using a scanned mirror 42 implemented using known techniques such as galvanometer or piezo-electric drives 44.
A stationary receiving optical system 46 is arranged to collect all the light from the remote object and focus it onto a photodiode 48. The photodiode 48 is connected to a pre-amplifier, pulse discriminator and timing electronics. A narrow band-pass optical filter 50 may be used to reject all wavelengths except those near the laser wavelength and optimise discrimination of the laser pulse against the background illumination.
Control electronics 52 control the scanning of the laser beam in azimuth and elevation (X, Y) and the timing of laser pulsing. Each laser pulse is reflected from the remote object 12, collected by receiving optics 46 and focused onto the photodiode 48 to generate an electrical pulse. This pulse is amplified and detected by a pulse detector 54. Timing electronics 56 measure the time of flight (TOF) of the laser beam to the remote object and back to the detector, thus enabling the distance between the detector and remote object to be determined for each laser pulse.
Because the control electronics 52 is controlling the laser scanning and laser pulse timing, it is able to build up a matrix of numbers comprising the laser scan azimuth and elevation (X, Y) and the distance (Z) to the remote object at that laser line of sight, which represents the 3D surface profile of the remote object.
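A minimal sketch of how such a matrix could be assembled from the measured round-trip times is given below; the variable names and the example TOF value are illustrative assumptions, not details from the patent, but the conversion itself is the standard round-trip relation distance = c × t / 2.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_tof(tof_seconds):
    """Convert a round-trip time of flight into a one-way distance."""
    return SPEED_OF_LIGHT * tof_seconds / 2.0

# Build the (azimuth, elevation) -> distance matrix point by point.
profile = {}
for azimuth, elevation, tof in [(0.01, 0.02, 66.7e-9)]:  # illustrative samples
    profile[(azimuth, elevation)] = distance_from_tof(tof)  # ~10 m for 66.7 ns
```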
It can be seen that with this approach, the only moving part of the system is a scanned mirror which only need be sufficiently large to steer the laser beam. This avoids the high cost of a precision motorised pan and tilt head and enables a high scan rate. Furthermore, because the laser and receiving optical paths can be kept completely separate, there is no risk of optical crosstalk.
To minimise size and cost, the laser scanning system 44 may be implemented in a number of ways, including using electro-magnetically or piezo-electrically scanned mirrors, or by mounting a laser chip on a micromachined silicon or compact piezo-electric structure.
The performance of the system can be substantially improved by replacing the pulsed laser source with a modulated laser source and the pulse discriminator by a cross-correlation system. Such systems are known, for example, from DE19949803 to Denso Corp. In particular, the system may include a signal source such as a laser for supplying a modulation signal and a transmission system connected to the signal source for transmitting a transmitted optical signal modulated by the modulation signal.
The modulation signal may be, for example, a maximal length sequence. A reception system is then provided for receiving a reflected and delayed version of the transmitted signal, and a cross-correlator for obtaining the time delay. The cross-correlator can be arranged to determine, at a coarse resolution, the time delay of the modulation signal needed to maximise the correlation between the time delayed modulation signal and the received signal. The cross-correlator can then determine, at a finer resolution than the coarse resolution, the correlation between the modulation signal and the received signal as a function of the time delay of the modulation signal with respect to the received signal in a smaller time delay range around the determined time delay. A measure of distance is calculated from the time delay of the modulation signal needed to maximise the correlation between the time delayed modulation signal and the received signal.
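The coarse-then-fine correlation search described above could be sketched as follows; this is an illustrative implementation assuming sampled, real-valued signals, and the function name and coarse step size are arbitrary choices rather than details of the referenced Denso system.

```python
import numpy as np

def estimate_delay(modulation, received, coarse_step=8):
    """Two-stage cross-correlation: find the sample delay maximising the
    correlation at a coarse resolution, then refine the estimate in a small
    window around the coarse maximum."""
    n = len(received) - len(modulation)
    # Coarse search over widely spaced delays.
    coarse_lags = list(range(0, n, coarse_step))
    coarse = [np.dot(modulation, received[lag:lag + len(modulation)])
              for lag in coarse_lags]
    best = coarse_lags[int(np.argmax(coarse))]
    # Fine search at every delay in a small window around the coarse estimate.
    lo, hi = max(0, best - coarse_step), min(n, best + coarse_step)
    fine = [np.dot(modulation, received[lag:lag + len(modulation)])
            for lag in range(lo, hi)]
    return lo + int(np.argmax(fine))  # delay in samples; multiply by the
                                      # sample period to get the time delay
```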
The cross-correlator can be implemented digitally. In this way, for a given laser peak power, greater energy can then be delivered to the remote object, which improves the signal to noise ratio and hence maximum range of the system. This oversampling approach enables the distance resolution of the system to be improved; and the efficient signal processing method using coarse and fine cross-correlators minimises the processing power needed.

The design described above requires the field of view of the receiving optical system to encompass the whole of the remote object. As a result, the photodiode will collect illumination from the whole of the field of view. This generates a background offset signal level in the photodetector. Whilst this offset level can be compensated for in the signal detection process, the shot noise inherent in the offset signal level degrades the ratio of detected laser signal level to noise and hence reduces the maximum range of the system.
It is possible to avoid this limitation by replacing the single photodiode by an X-Y addressed photodiode array. A simplified schematic of such a device, in a 2 x 2 format, is illustrated in Figure 4. The device consists of an array of photodiode pixels 60, each of which comprises a photodiode (PD11 to PD22) and associated transistor (TR11 to TR22), which are configured and driven to act as analogue switches. For standard video imaging applications, the device is operated in an integration mode where incident illumination is focussed upon its surface. The incident illumination generates charge within each photodiode by the photoelectric effect. During this integration period, connections X1, X2, Y1 and Y2 are all held low so that all transistors are off and the photodiodes are electrically isolated. The photo-generated charge then accumulates in each photodiode and is stored on the self-capacitance of the photodiode.
Once sufficient photocharge has been collected, the device is read out as follows. Input X1 is taken to a high potential so that TR1 is turned on, thereby allowing charge to flow between the column and a charge sensitive amplifier 62. Then input Y1 is pulsed high, addressing a row of pixels, turning TR11 on and allowing the photo-generated charge stored on photodiode PD11 to flow through TR11 and TR1 to the output amplifier 62, where the charge is converted to a voltage. This creates an output signal whose amplitude is proportional to the level of charge stored on PD11 and hence the level of light incident on PD11.
After the self-capacitance of PD11 has been discharged, input Y1 is taken low and input Y2 is taken high, allowing the stored charge on PD12 to be read out. In this way, a column of pixels is read out in turn.
After all the charge collected by PD12 has been discharged, Y2 is taken low and X2 is taken high to allow PD21 and PD22 (the pixels in the next column) to be read out sequentially by pulsing Y1 and Y2 in the manner described above.
It can be seen that this process allows the 2 x 2 array to be scanned and an electrical signal that is the analogue of the incident illumination to be generated. In normal operation, larger numbers of photodiodes are used, e.g. 512 x 512, to increase resolution. Often the readout sequence and sensor scanning are arranged to generate a standard video signal.
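The addressing order described above can be illustrated by the short sketch below, which simply walks the X (column) and Y (row) select lines in the sequence given; it models only the readout order, not the analogue charge transfer, and all names and values are illustrative.

```python
def read_out(array, columns, rows):
    """Sequentially read every pixel: select one column line (X), then pulse
    each row line (Y) in turn, mimicking the switched-photodiode readout."""
    samples = []
    for x in range(columns):            # X1, X2, ...: select a column
        for y in range(rows):           # Y1, Y2, ...: pulse each row in turn
            samples.append(array[y][x])  # charge converted to a voltage sample
    return samples

# 2 x 2 example: readout order is PD11, PD12, PD21, PD22
print(read_out([[11, 21], [12, 22]], columns=2, rows=2))
```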
In addition, it may be noted that the basic structure described here has been simplified for the purpose of describing the proposed invention. Practical X-Y addressed photodiode arrays are generally fabricated as single complementary metal oxide semiconductor (CMOS) large scale integrated circuits (LSI) which include many refinements, such as on-chip clock circuitry to generate the pulse sequences for electrodes X1 to Xn and Y1 to Yn, and additional pixel and/or column level circuitry to improve amplification and detection of the photo-charge.

For 3D profile measurement, the inventor has realised that such X-Y addressed photo-diode arrays can be utilised not in an integrating mode, but as a multiplexer, whereby only the individual photodiode receiving the reflected image of the laser spot on the remote object is addressed.
This may be understood by considering Figure 5. Where the same components are used as in the embodiment of Figure 3, the same reference numerals are used. The laser beam is again scanned by known means to illuminate a point on the surface of the remote object 12.
An image of the laser spot is formed on the surface of the X-Y addressed photodiode array 70. The control electronics apply logic level signals to the relevant X and Y control lines of the X-Y addressed array so that the photodiode illuminated by the image of the laser spot is connected to the pre-amplifier and time of flight detection electronics 56. The reflected laser pulse is captured by this photodiode and the resultant electrical signal routed to the electrical pulse detector and TOF measurement circuitry. This computes the TOF of the laser pulse to the spot on the remote object and back to the photodiode on the X-Y addressed array, and hence the distance from the remote object to the X-Y addressed array. This process is then repeated for many points over the surface of the remote object to measure the surface profile of the remote object. If the image formed of the laser spot is larger than a single pixel, then the control electronics can cause the detector to address a group of adjacent photodiodes (e.g. a 2 x 2 sub-array of photodiodes) in parallel to optimise collection and detection of the laser energy.
It can be seen that because only one or a small number of the photodiodes is connected to the receiving amplifier and time of flight electronics at any one time, the background offset signal will be limited to that generated by the part of the field of view focussed onto the individual photodiode/photodiode group, rather than from the whole of the field of view of the optics as for the system described above.

For example, if the laser beam is arranged to sequentially scan 100 x 100 equispaced points within the optical field of view, then the background signal level collected by each photodiode/photodiode group will be nominally reduced by 10,000 times in comparison to the simple system described above, which confers substantial benefits on system performance.

In a preferred embodiment, the X-Y sensor, time of flight measurement system and control electronics are fabricated on a single integrated circuit to minimise manufacturing cost. The photodiodes can be manufactured and operated as avalanche photodiodes to provide signal amplification by the avalanche effect, prior to signal detection. The time of flight measurement approach used is as described above in connection with Figure 3.
The laser scanning pattern will often be a repeating pattern arranged to cover the optical field of view whilst providing adequate time resolution to measure the position of moving objects in the field of view. The pattern is typically arranged as a conventional raster scan for ease of display on conventional monitors. However, it can be seen that other patterns may be used. One useful pattern is a spiral scan pattern where, by controlling the velocity of the laser scan, increased spatial resolution may be achieved in the centre of the scan whilst still maintaining a low spatial resolution to detect objects appearing at the periphery of the scan.

For those applications where it is necessary to monitor the 3D motion of a few specific objects within the field of view, the scan can be controlled adaptively to track the objects and ignore the rest of the field of view. This approach can increase the temporal resolution for the tracked objects.
An important benefit of the approach described here is that the X-Y addressed array can still be operated in an imaging mode, rather than a multiplexed time of flight detector mode. This can be achieved simply by returning the sequence of pulses applied to the X-Y addressed array to a conventional video scanning sequence.
This has several significant benefits. First, for optimal performance, it is important that only the photodiode or local group of photodiodes receiving the image of the laser spot at any point in time are addressed; i.e. that the laser scanning and the photodiode array scanning are synchronized. This would normally require extremely precise calibration of the scanner and optical system. However, if the laser is scanned whilst the sensor is in an imaging mode, an image of the laser path can be collected by the control electronics. This image can be used to determine the precise path of the laser beam image on the surface of the photodiode array and hence set up the correct addressing sequence for the X-Y addressed array and/or laser pulsing sequence, to ensure synchronization in the multiplexed time of flight detector mode. Thus, the normal addressing mode is used as a calibration stage for the higher performance multiplexing mode. In effect, the system can be self-calibrating, which is a major benefit for systems which have to operate over large temperature ranges.
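One plausible way to derive the addressing sequence from an image of the laser path, along the lines described above, is sketched below; the per-scan-step frame structure, the thresholding and the function name are assumptions for illustration, not details from the patent.

```python
import numpy as np

def calibrate_addressing(laser_path_frames, threshold):
    """From frames captured in imaging mode while the laser is scanned (one
    frame per scan step), recover which pixel (row, column) should be
    addressed at each step so that laser pulsing and pixel addressing stay
    synchronized in the multiplexed time of flight mode."""
    addresses = []
    for frame in laser_path_frames:          # frame: 2D numpy array
        row, col = np.unravel_index(np.argmax(frame), frame.shape)
        if frame[row, col] > threshold:      # accept only a clear laser spot
            addresses.append((row, col))
        else:
            addresses.append(None)           # spot not detected at this step
    return addresses
```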
Secondly, if the laser is turned off and the X-Y addressed photodiode array is operated in an imaging mode, then the system can be used as a conventional imaging device, i.e. a video or still camera, to provide additional information regarding the remote object.
Third, because the same detector is used to capture a 3D surface profile and standard image, the registration between the standard image and 3D data is near perfect.
This enables beneficial combinations of conventional imaging and 3D profile measuring modes to be used. For example, by toggling the system between capturing a 3D scan and a conventional image scan, both a video image sequence and a 3D sequence can be captured and overlaid on one another. This approach is particularly beneficial for collision avoidance or intruder detection. If the 3D image shows that an object within the field of view is too close or is on a path likely to lead to a collision, the corresponding part of the conventional image can be coloured or caused to flash to draw attention to the problem.
Another benefit of capturing perfectly registered 3D and conventional image data is that image processing methods can be applied simultaneously to both sets of data, using the optimal method to extract important features from each image and combining the feature data to yield more information about objects in the field of view. For example, the 3D data can be used to determine the orientation and size of an object in the field of view. This data can then be used to select, or generate from a 3D solid model of a known object, an appropriate pattern for pattern matching and object/target recognition.
It can be seen that the benefits of acquiring perfectly registered 3D and conventional image data are many.
Where higher scanning rates are required, the system described above can be improved by modifying the optics of the laser to illuminate a line, rather than a point, of the remote object and modifying the structure of the X-Y addressed array to permit a row of photodiodes to be connected in parallel to multiple time of flight measurement processors; i.e. a Y scanned, X parallel output array.
One implementation of the revised photodiode array structure is shown for a 2 x 2 array in Figure 6. Each column switch transistor has been replaced by an amplifier 80 and time of flight measurement (TOF) circuitry comprising a pulse detector 82, counter 84 and latch 86. All counters 84 are clocked at the same frequency by a common clock input 88.

The laser scanning system is set to illuminate a strip of the remote object. The image of the illuminated strip is focussed onto row N of the photodiode array. The laser is pulsed on, the TOF counters 84 are reset to zero and the control electronics takes the relevant Y electrode (Y1, Y2) high to connect all the photodiodes along the row to the corresponding amplifier and TOF circuitry. As each photodiode receives the reflected pulse of laser light, the pulse detector causes the corresponding counter 84 to stop counting. Once all reflected pulses have been received, the number held on each counter 84 is latched into the latches 86 and is read out in a normal manner, whilst the next row is illuminated and the process repeated.
The numbers stored in each latch are proportional to the time of flight allowing the distance to each of the points illuminated by the laser line and imaged by the photodiodes to be measured. The time of flight measurement circuitry can use the same principles described above.
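A minimal sketch of converting the latched counter values into distances is shown below, assuming a known common clock frequency; the example clock rate and counts are illustrative assumptions, while the conversion itself is simply count / clock frequency giving the round-trip time, halved and scaled by the speed of light.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distances_from_counters(counts, clock_hz):
    """Convert latched counter values (one per column) into distances.
    Each counter runs from the laser pulse until its pulse detector fires,
    so count / clock_hz is the round-trip time of flight."""
    return [SPEED_OF_LIGHT * (count / clock_hz) / 2.0 for count in counts]

# e.g. with a 1 GHz clock, a count of 67 corresponds to roughly 10 m
print(distances_from_counters([67, 134], clock_hz=1e9))
```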
With this approach, parallax effects must be taken into account for objects close to the optical system. These effects may be minimised either by configuring the laser source and receiving optics to be as close as possible, or preferably co-incident through the use of a beam splitter. Alternatively, the laser line can be defocused to illuminate a broad strip of the remote object and ensure that a single row of photodiodes will all receive reflected light. Another approach is for the control electronics to cause not one, but several photodiodes in each column to be connected to the column amplifier at once. These techniques can be used singly or in combination.
To remove the need for any moving parts, an illumination system comprising multiple pulsed or modulated light sources can be used. This approach is illustrated in Figure 7, which shows an illumination system comprising three laser diodes 90A, 90B, 90C configured so that each laser diode illuminates a region of the field of view of the receiving optics. Each light source would be operated in sequence to cause the region of illumination to be synchronized with the readout of the Y scanned, X parallel output array. Preferably, the laser diodes are fabricated on a common substrate with integral drive circuitry.
It can be seen that many alternative laser/multiplexor combinations can be used, depending upon the requirements of the end application.
For example, if it is only required to measure the distance profile along a line, then a one dimensional array of photodiodes may be used. Alternatively, a very simple system with no moving parts can be realised using a single bright pulsed or modulated light source to illuminate the whole of the field of view simultaneously and an X-Y addressed array, or a Y scanned, X parallel output array, used to capture the 3D image data in the manner described above. For the very highest speed applications, the pulsed light source can be arranged to illuminate the whole of the field of view and TOF measurement circuitry provided for each photodiode in the array. This enables the TOF and hence distance to the remote object to be measured in parallel for all points in the field of view.
Claims (9)
- 1. An imaging system comprising: a light source; means for scanning the light from the light source over an object to be imaged; stationary receiving optics for receiving light reflected from the object to be imaged; a photodiode light detector for detecting light received from the receiving optics; and processing means for measuring the time of flight of light signals from the light source to the detector for all scanning directions.
- 2. An imaging system as claimed in claim 1, wherein the light source is a pulsed laser.
- 3. An imaging system as claimed in claim 1, wherein the light source is a laser having a modulated output.
- 4. An imaging system as claimed in any preceding claim, wherein the means for scanning comprises: a reflector for directing the light from the light source to an object to be imaged; and a drive mechanism for controlling movement of the reflector for scanning the light from the light source over the object to be imaged.
- 5. An imaging system as claimed in any preceding claim, wherein the light detector comprises a photodiode array, and wherein different regions of the object to be imaged are imaged onto different regions of the array by the receiving optics.
- 6. An imaging system as claimed in claim 5, wherein the photodiode array is operable in two modes; a first mode in which light signals from all photodiodes in the array are read out in sequence, and a second mode in which light signals from a selected photodiode or photodiodes in the array are read out.
- 7. An imaging system comprising: a plurality of light sources, each light source being directed to an object to be imaged; stationary receiving optics for receiving light reflected from the object to be imaged; a light detector for detecting light received from the receiving optics, wherein each light source is imaged by the receiving optics onto a different region of the light detector, wherein the different regions of the light detector can be actuated separately; and processing means for measuring the time of flight of light pulses from the light source to the detector.
- 8. An imaging system as claimed in claim 7, wherein each light source comprises a pulsed laser.
- 9. A method of obtaining an image of an object, comprising: scanning a light source signal over the object by directing a light source output in a plurality of scanning directions in sequence, and detecting reflected light received from the object using a two dimensional light detector array; determining the regions of the light detector array which are illuminated for each scanning direction; scanning the light source signal over the object again, and detecting reflected light received from the object using only the determined regions of the light detector array; calculating the time of flight of light pulses from the light source to the detector for each scanning direction; and obtaining a 3D profile from the time of flight calculations.
Priority Applications (17)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB0108497A GB2374743A (en) | 2001-04-04 | 2001-04-04 | Surface profile measurement |
| GB0110577A GB2374228A (en) | 2001-04-04 | 2001-04-30 | A collision warning device |
| PCT/GB2002/001612 WO2002082016A1 (en) | 2001-04-04 | 2002-04-04 | Surface profile measurement |
| ES02718331T ES2264476T3 (en) | 2001-04-04 | 2002-04-04 | SURFACE PROFILE MEASUREMENT. |
| PCT/GB2002/001614 WO2002082201A1 (en) | 2001-04-04 | 2002-04-04 | Image analysis apparatus |
| AT02720176T ATE310268T1 (en) | 2001-04-04 | 2002-04-04 | SYSTEM FOR IMAGE ANALYSIS |
| AT02718331T ATE326685T1 (en) | 2001-04-04 | 2002-04-04 | MEASURING A SURFACE PROFILE |
| EP02718331A EP1373830B1 (en) | 2001-04-04 | 2002-04-04 | Surface profile measurement |
| DE60207395T DE60207395T2 (en) | 2001-04-04 | 2002-04-04 | SYSTEM FOR IMAGE ANALYSIS |
| DE60211497T DE60211497T2 (en) | 2001-04-04 | 2002-04-04 | MEASUREMENT OF A SURFACE PROFILE |
| ES02720176T ES2250636T3 (en) | 2001-04-04 | 2002-04-04 | IMAGE ANALYSIS EQUIPMENT. |
| US10/474,236 US7319777B2 (en) | 2001-04-04 | 2002-04-04 | Image analysis apparatus |
| US10/474,293 US7248344B2 (en) | 2001-04-04 | 2002-04-04 | Surface profile measurement |
| PT02718331T PT1373830E (en) | 2001-04-04 | 2002-04-04 | MEASUREMENT OF THE PROFILE OF A SURFACE |
| JP2002579742A JP4405154B2 (en) | 2001-04-04 | 2002-04-04 | Imaging system and method for acquiring an image of an object |
| JP2002580103A JP4405155B2 (en) | 2001-04-04 | 2002-04-04 | Image analysis system |
| EP02720176A EP1374002B1 (en) | 2001-04-04 | 2002-04-04 | Image analysis apparatus |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB0108497A GB2374743A (en) | 2001-04-04 | 2001-04-04 | Surface profile measurement |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| GB0108497D0 GB0108497D0 (en) | 2001-05-23 |
| GB2374743A true GB2374743A (en) | 2002-10-23 |
Family
ID=9912263
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| GB0108497A Withdrawn GB2374743A (en) | 2001-04-04 | 2001-04-04 | Surface profile measurement |
| GB0110577A Withdrawn GB2374228A (en) | 2001-04-04 | 2001-04-30 | A collision warning device |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| GB0110577A Withdrawn GB2374228A (en) | 2001-04-04 | 2001-04-30 | A collision warning device |
Country Status (1)
| Country | Link |
|---|---|
| GB (2) | GB2374743A (en) |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB2421383A (en) * | 2004-12-07 | 2006-06-21 | Instro Prec Ltd | Surface profile measurement |
| WO2008061307A1 (en) * | 2006-11-23 | 2008-05-29 | Newsouth Innovations Pty Ltd | A method of determining characteristics of a remote surface with application to the landing of an aerial vehicle |
| DE102008039838A1 (en) | 2008-08-27 | 2010-03-04 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Measuring object's three dimensional surface scanning method, involves executing scanning movement of laser light beam by beam deflecting unit, and utilizing laser light beam for measuring and displaying data of scanning points |
| CN103064087A (en) * | 2012-12-25 | 2013-04-24 | 符建 | Three-dimensional imaging radar system and method based on multiple integral |
| CN105807285A (en) * | 2016-04-21 | 2016-07-27 | 深圳市金立通信设备有限公司 | Multi-zone distance measuring method and device and terminal |
| CN109814128A (en) * | 2019-01-23 | 2019-05-28 | 北京理工大学 | The high-resolution fast imaging system and method that time flight is combined with relevance imaging |
| CN111727602A (en) * | 2017-12-26 | 2020-09-29 | 罗伯特·博世有限公司 | Single-chip RGB-D camera |
Families Citing this family (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE10303044A1 (en) * | 2003-01-24 | 2004-08-12 | Daimlerchrysler Ag | Device and method for improving the visibility in motor vehicles |
| DE10329054A1 (en) * | 2003-06-27 | 2005-01-13 | Volkswagen Ag | Collision object recognition system |
| ES2323637T3 (en) | 2004-07-22 | 2009-07-22 | Bea S.A. | THERMOSENSIBLE PRESENCE DETECTION DEVICE AROUND AUTOMATIC DOORS. |
| ATE387620T1 (en) * | 2004-07-22 | 2008-03-15 | Bea Sa | LIGHT SCANNING DEVICE FOR DETECTION AROUND AUTOMATIC DOORS |
| DE102004037870B4 (en) * | 2004-08-04 | 2007-02-15 | Siemens Ag | Optical module for an outer vestibule in the direction of travel of a motor vehicle detecting assistance system |
| DE102007048848A1 (en) * | 2007-10-11 | 2009-04-16 | Robert Bosch Gmbh | Spatial driver assistance system |
| DE102009045558B4 (en) * | 2009-10-12 | 2021-07-15 | pmdtechnologies ag | Camera system |
| GB2494414A (en) * | 2011-09-06 | 2013-03-13 | Land Rover Uk Ltd | Terrain visualisation for vehicle using combined colour camera and time of flight (ToF) camera images for augmented display |
| US9002112B2 (en) | 2013-08-27 | 2015-04-07 | Trimble Navigation Limited | Video alignment system |
| WO2015075926A1 (en) * | 2013-11-20 | 2015-05-28 | パナソニックIpマネジメント株式会社 | Distance measurement and imaging system |
| FR3052561B1 (en) | 2016-06-14 | 2018-06-29 | Inria Institut National De Recherche En Informatique Et En Automatique | AUTONOMOUS GUIDING MACHINE |
| WO2021249831A1 (en) * | 2020-06-09 | 2021-12-16 | Osram Gmbh | Lidar system with coarse angle control |
Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH04274707A (en) * | 1991-03-01 | 1992-09-30 | Nippon Telegr & Teleph Corp <Ntt> | Shape measuring device |
| GB2286495A (en) * | 1994-02-10 | 1995-08-16 | Mitsubishi Electric Corp | Optical radar apparatus for vehicle |
| US5682229A (en) * | 1995-04-14 | 1997-10-28 | Schwartz Electro-Optics, Inc. | Laser range camera |
| WO1997040342A2 (en) * | 1996-04-24 | 1997-10-30 | Cyra Technologies, Inc. | Integrated system for imaging and modeling three-dimensional objects |
| WO2000005564A1 (en) * | 1998-07-23 | 2000-02-03 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Imbibition measuring apparatus for determining powder wettability |
| US6094270A (en) * | 1996-08-07 | 2000-07-25 | Matsushita Electric Industrial Co., Ltd. | Range finder |
| US6323942B1 (en) * | 1999-04-30 | 2001-11-27 | Canesta, Inc. | CMOS-compatible three-dimensional image sensor IC |
| US20010048519A1 (en) * | 2000-06-06 | 2001-12-06 | Canesta, Inc, | CMOS-Compatible three-dimensional image sensing using reduced peak energy |
| US20020003617A1 (en) * | 1999-03-18 | 2002-01-10 | Guenter Doemens | Spatially resolving range-finding system |
| US20020036765A1 (en) * | 2000-08-09 | 2002-03-28 | Mccaffrey Nathaniel Joseph | High resolution 3-D imaging range finder |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3123303B2 (en) * | 1992-07-21 | 2001-01-09 | 日産自動車株式会社 | Vehicle image processing device |
| JP3192875B2 (en) * | 1994-06-30 | 2001-07-30 | キヤノン株式会社 | Image synthesis method and image synthesis device |
| US6384859B1 (en) * | 1995-03-29 | 2002-05-07 | Sanyo Electric Co., Ltd. | Methods for creating an image for a three-dimensional display, for calculating depth information and for image processing using the depth information |
| JPH10142331A (en) * | 1996-11-14 | 1998-05-29 | Komatsu Ltd | Vehicles with millimeter wave radar |
-
2001
- 2001-04-04 GB GB0108497A patent/GB2374743A/en not_active Withdrawn
- 2001-04-30 GB GB0110577A patent/GB2374228A/en not_active Withdrawn
Patent Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH04274707A (en) * | 1991-03-01 | 1992-09-30 | Nippon Telegr & Teleph Corp <Ntt> | Shape measuring device |
| GB2286495A (en) * | 1994-02-10 | 1995-08-16 | Mitsubishi Electric Corp | Optical radar apparatus for vehicle |
| US5682229A (en) * | 1995-04-14 | 1997-10-28 | Schwartz Electro-Optics, Inc. | Laser range camera |
| WO1997040342A2 (en) * | 1996-04-24 | 1997-10-30 | Cyra Technologies, Inc. | Integrated system for imaging and modeling three-dimensional objects |
| US6094270A (en) * | 1996-08-07 | 2000-07-25 | Matsushita Electric Industrial Co., Ltd. | Range finder |
| WO2000005564A1 (en) * | 1998-07-23 | 2000-02-03 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Imbibition measuring apparatus for determining powder wettability |
| US20020003617A1 (en) * | 1999-03-18 | 2002-01-10 | Guenter Doemens | Spatially resolving range-finding system |
| US6323942B1 (en) * | 1999-04-30 | 2001-11-27 | Canesta, Inc. | CMOS-compatible three-dimensional image sensor IC |
| US20010048519A1 (en) * | 2000-06-06 | 2001-12-06 | Canesta, Inc, | CMOS-Compatible three-dimensional image sensing using reduced peak energy |
| US20020036765A1 (en) * | 2000-08-09 | 2002-03-28 | Mccaffrey Nathaniel Joseph | High resolution 3-D imaging range finder |
Cited By (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB2421383A (en) * | 2004-12-07 | 2006-06-21 | Instro Prec Ltd | Surface profile measurement |
| US7834985B2 (en) | 2004-12-07 | 2010-11-16 | Instro Precision Limited | Surface profile measurement |
| WO2008061307A1 (en) * | 2006-11-23 | 2008-05-29 | Newsouth Innovations Pty Ltd | A method of determining characteristics of a remote surface with application to the landing of an aerial vehicle |
| DE102008039838A1 (en) | 2008-08-27 | 2010-03-04 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Measuring object's three dimensional surface scanning method, involves executing scanning movement of laser light beam by beam deflecting unit, and utilizing laser light beam for measuring and displaying data of scanning points |
| CN103064087A (en) * | 2012-12-25 | 2013-04-24 | 符建 | Three-dimensional imaging radar system and method based on multiple integral |
| CN103064087B (en) * | 2012-12-25 | 2015-02-25 | 符建 | Three-dimensional imaging radar system and method based on multiple integral |
| CN105807285A (en) * | 2016-04-21 | 2016-07-27 | 深圳市金立通信设备有限公司 | Multi-zone distance measuring method and device and terminal |
| CN105807285B (en) * | 2016-04-21 | 2019-07-12 | 深圳市金立通信设备有限公司 | Multizone distance measuring method, range unit and terminal |
| CN111727602A (en) * | 2017-12-26 | 2020-09-29 | 罗伯特·博世有限公司 | Single-chip RGB-D camera |
| CN109814128A (en) * | 2019-01-23 | 2019-05-28 | 北京理工大学 | The high-resolution fast imaging system and method that time flight is combined with relevance imaging |
| CN109814128B (en) * | 2019-01-23 | 2020-08-11 | 北京理工大学 | High-resolution rapid imaging system and method combining time flight and associated imaging |
Also Published As
| Publication number | Publication date |
|---|---|
| GB0110577D0 (en) | 2001-06-20 |
| GB0108497D0 (en) | 2001-05-23 |
| GB2374228A (en) | 2002-10-09 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US7248344B2 (en) | Surface profile measurement | |
| US7834985B2 (en) | Surface profile measurement | |
| US10739460B2 (en) | Time-of-flight detector with single-axis scan | |
| GB2374743A (en) | Surface profile measurement | |
| US11408983B2 (en) | Lidar 2D receiver array architecture | |
| US8606496B2 (en) | Laser ranging, tracking and designation using 3-D focal planes | |
| US10739444B2 (en) | LIDAR signal acquisition | |
| US7586077B2 (en) | Reference pixel array with varying sensitivities for time of flight (TOF) sensor | |
| US11269065B2 (en) | Muilti-detector with interleaved photodetector arrays and analog readout circuits for lidar receiver | |
| IL258130A (en) | Time of flight distance sensor | |
| EP3391076A1 (en) | Light detection and ranging sensor | |
| CN108885260B (en) | Time-of-flight detector with single axis scanning | |
| US20210055419A1 (en) | Depth sensor with interlaced sampling structure | |
| US12399278B1 (en) | Hybrid LIDAR with optically enhanced scanned laser | |
| US20260029536A1 (en) | Hybrid LIDAR with Optically Enhanced Scanned Laser | |
| JP3029005B2 (en) | Stereo vision camera | |
| EP3665503A1 (en) | Lidar signal acquisition |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) | |