
WO1992011508A1 - Object location system - Google Patents


Info

Publication number
WO1992011508A1
WO1992011508A1 · PCT/AU1991/000589 · AU9100589W
Authority
WO
WIPO (PCT)
Prior art keywords
lookup table
range
angle
output signal
view
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/AU1991/000589
Other languages
French (fr)
Inventor
Kemal Ajay
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Monash University
Original Assignee
Monash University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Monash University filed Critical Monash University
Priority to AU91115/91A priority Critical patent/AU657110B2/en
Publication of WO1992011508A1 publication Critical patent/WO1992011508A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06: Systems determining position data of a target
    • G01S17/46: Indirect determination of position data
    • G01S17/48: Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00: Measuring distances in line of sight; Optical rangefinders
    • G01C3/10: Measuring distances in line of sight; Optical rangefinders using a parallactic triangle with variable angles and a base of fixed length in the observation station, e.g. in the instrument
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/89: Lidar systems specially adapted for specific applications for mapping or imaging

Definitions

  • This invention relates to an object location system and, in particular, to a rangefinder for use in robotics where it is desired that sensory information regarding the shape and pose of objects and a distance to them be provided. This information is intended to allow for obstacle avoidance by mobile robots or remotely controlled vehicles.
  • Robot vision systems which provide rangefinding capability to enable a mobile robot to move and avoid objects have been proposed.
  • However, many such systems are still very much in the experimental stage and are complicated and extremely expensive.
  • In particular, most systems utilize triangulation to locate an object in the field of view of the vision system and require complicated computations in order to determine range values, so that the robot or remote-controlled vehicle is provided with an indication of where obstacles are located and can move around them without hitting them.
  • The object of this invention is to provide a relatively inexpensive object location system which quickly provides range information in respect of objects so that a robot or remote-controlled vehicle or the like can automatically control itself to avoid obstacles.
  • The invention may be said to reside in an object location system comprising signal output means for outputting a signal into a field of view containing an object at an angle to a predetermined base line, a detector for detecting the output signal in the field of view, processing means for producing a first signal indicative of the angle of the output signal and for providing a second signal indicative of the angle, with respect to the base line, at which the output signal is detected, and a lookup table in said processing means for storing precalculated range values relating to the field of view of the system, such that said first and second signals can be fed to said lookup table to provide a range output value which will provide the range of the object.
  • The present invention may also be said to reside in an object location method comprising the steps of: outputting an output signal at predetermined known angles into a field of view containing an object and detecting the output signal in the field of view; providing a lookup table having range values relating to the field of view; and using information relating to the angle of the output signal with respect to a base line, and information relating to the angle of detection of the output signal with respect to the base line, to select an appropriate range value from the lookup table to provide the range of the object in the field of view.
  • Since the present invention utilizes a lookup table, range values for objects which can be positioned in the field of view of the system can be stored in the lookup table.
  • Thus, when an object is detected, information relating to the angle of the output signal and the angle of detection of the output signal need only be utilized to select the appropriate range value from the lookup table to provide the range to that object.
  • The system therefore does not require complicated computations in order to calculate a position, and the range value can be selected very quickly, providing considerable speed in the determination of a range value for an object in the field of view of the system.
  • In a preferred embodiment of the present invention, a lookup table can be provided which compensates for motion in the direction of observation. If the system is being moved at a known constant speed, a lookup table precalculated for that speed may be provided so that range values which compensate for the movement of the system are obtained.
  • In this embodiment, a plurality of lookup tables could be stored, each having range values applicable to a particular constant speed of the system.
  • Preferably, the processing means provides for calibration to precalculate the range values for storage in the lookup table.
  • Preferably, the lookup table is an element array and the detector is a camera having a plurality of pixels, said precalculated range values being stored in the element array such that one index of the element array is a pixel X coordinate of the camera, which thereby provides an indication of the angle of detection of the output signal, and the other index of the element array is the corresponding angular position value of the output means, thereby providing an indication of the angle of the output signal, the value written into the appropriate array element identified by those indexes containing a precalculated range value corresponding to the said pixel X coordinate and the corresponding output means angular position value, so that in use, when an object is detected by the pixel at that X coordinate and with the output means at that angular position value, the range value can merely be read from the lookup table to provide the range to that object.
  • Preferably, the signal output means is a laser and the output signal is a beam of infra-red electromagnetic radiation.
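The two-index element array described above can be sketched in software (a hypothetical analogue for illustration only; in the patent the table is wired hardware, and the range function below is a placeholder for the calibrated values):

```python
# Hypothetical software analogue of the 256 x 256 element array:
# one index is the camera pixel X coordinate, the other the angular
# position value of the output means; each element holds a
# precalculated range value.
SIZE = 256

def build_table(range_fn):
    """Precompute a range value for every (pixel_x, position) pair."""
    return [[range_fn(x, p) for p in range(SIZE)] for x in range(SIZE)]

def look_up(table, pixel_x, position):
    """Readout is a single indexed fetch; no trigonometry at run time."""
    return table[pixel_x][position]

# Placeholder range function; real values come from calibration.
table = build_table(lambda x, p: float(x + p))
print(look_up(table, 10, 20))  # 30.0
```

The point of the structure is that all trigonometry happens once, at table-build time; detection reduces range determination to a single memory access.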
  • A preferred embodiment of the invention will be described by way of example with reference to the accompanying drawings, in which: Figure 1 is a schematic view of the system embodying the invention;
  • Figure 2 shows rangefinding geometry relating to the system of Figure 1;
  • Figure 3 is a block diagram of the processing circuitry according to the preferred embodiment of the invention;
  • Figures 4 and 4A to 4F are detailed views showing the processing circuitry; and
  • Figure 5 is a diagram relating to calibration of the system.
  • With reference to Figures 1 and 2, the system according to the preferred embodiment comprises a chassis 10 which supports a camera 12 and a laser 14.
  • The chassis also includes a card 16 upon which motor scanning control circuitry is located, together with a motor and gearbox 18 for controlling movement of the laser 14.
  • The motor and gearbox 18 are connected to the laser 14 by a connecting rod 20 and are operable to smoothly move the laser in the direction of double-headed arrow A so that a beam 22 from the laser scans a field of view 24 to locate objects 26 in the field of view.
  • The camera 12 is connected to a host computer 30 and, if desired, to a monitor 32 so that an image produced by the camera 12 can be viewed. In actual use of the system the monitor 32 would probably not be provided; it is generally only used for the calibration purposes to be described hereinafter.
  • The laser 14 may have a lens system for forming the laser beam into the stripe 22, a fan beam, etc.
  • With reference to Figure 2, the distance between the camera 12 and laser 14 is accurately known and is represented by the base line K in Figure 2.
  • The laser 14 is moved by the motor and gearbox 18 and connecting rod 20 so that the stripe 22 scans the field of view 24 and the stripe 22 on objects in the field of view can be detected by the camera 12.
  • The movement of the laser 14 is controlled so that the angle θ of the stripe 22 with respect to the base line between the camera and the laser can be accurately determined; in the preferred embodiment it is provided by a count number indicative of the angle through which the laser has moved.
  • The angle φ of detection of the light beam on an object, such as point P0 in Figure 2, is determined by the pixel upon which maximum intensity of the detected light beam occurs.
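The geometry of Figure 2 is standard active triangulation, so the distance to a point follows from θ, φ and the base line by the law of sines. A minimal sketch (standard trigonometry, not the patent's own expression):

```python
import math

def range_from_angles(theta, phi, baseline_k):
    """Distance from the camera to point P0, where theta is the laser
    (stripe) angle and phi the detection angle, both measured from the
    base line of length baseline_k.  The third angle of the triangle is
    pi - theta - phi, so by the law of sines:
        range = baseline_k * sin(theta) / sin(theta + phi)."""
    return baseline_k * math.sin(theta) / math.sin(theta + phi)

# Equilateral case: theta = phi = 60 degrees, base line 0.3 m -> 0.3 m.
print(round(range_from_angles(math.pi / 3, math.pi / 3, 0.3), 3))  # 0.3
```

In the system itself this computation is only ever done offline, to fill the lookup table; at run time the angles index the table directly.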
  • With reference to Figure 3, which shows a block diagram of the processing circuitry (described in further detail with reference to Figure 4), a video signal from camera 12 is passed through a peak detector 40 so that the stripe 22 on the objects 26 in Figure 1 can be readily identified.
  • The video signal is also provided to a sync processor 42, which in turn is coupled to vertical and horizontal pixel counters 44 and 46.
  • The peak detector and the vertical and horizontal pixel counters are connected to a memory store 46.
  • The vertical and horizontal pixel counters are also connected to control circuitry 48 and a DMA access controller 50.
  • The horizontal pixel counter is also connected to a lookup table 52, as is the memory store 46.
  • The control circuitry is also connected to the memory store 46, and the lookup table 52 and the DMA access controller are both connected to the host computer 30 (Figure 1).
  • The lookup table is provided with range values for, in practical terms, all points in the field of view of the system, so that it can be addressed to find a particular range value merely from the pixel coordinate at which the camera 12 detects the stripe 22 (which effectively gives the angle φ in Figure 2) and the current angle θ of the stripe 22.
  • The memory 46 contains the stripe angle for each point (X, Y) at which the stripe can be detected, so the contents of the memory 46 and its X index can be used to drive the lookup table 52 to output range values.
  • In the preferred embodiment of the invention the table is wired onto a plug-in card which carries the circuitry shown in Figure 4 and which resides in the computer 30, although in other embodiments the lookup table could be a software lookup table in the form of a two-dimensional array.
  • With reference to Figure 4, the processing circuitry will be described in more detail.
  • The video signal from the camera 12 is input at 58 and is received by the peak detector 40, which includes a comparator 62, a buffer 64 and a buffer 66.
  • The video signal is buffered by the buffer 64 and passes through a differentiating circuit comprising capacitor 68 and resistors 69 and 70.
  • The differentiated signal is supplied to the inverting input of the comparator 62.
  • The threshold for the comparator appears at its non-inverting input; it is set by trimmer circuit 71 and buffered by buffer 66.
  • The resistors 72 set the hysteresis of the comparator 62.
  • The comparator output passes through a shaping buffer 73 to improve its rise time, and flip-flops 74 capture the positive-going transitions of the comparator output and hold them until a pixel clock 90 accepts that information.
  • Thus, the comparator triggers on the zero-crossing point of the signal provided to its inverting input and thereby provides an indication of the peak of the signal output from the camera.
  • The peak, in turn, is indicative of the stripe 22 on the objects 26.
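The differentiate-and-compare scheme above has a straightforward digital counterpart; a minimal sketch (illustrative only, with a simple amplitude threshold standing in for the trimmer-set comparator threshold):

```python
def peak_index(samples, threshold=0):
    """Return the index of the scanline peak: the point where the first
    difference (the 'derivative') crosses zero going negative, i.e.
    where the analogue comparator would trigger."""
    best = None
    for i in range(1, len(samples) - 1):
        rising = samples[i] - samples[i - 1]
        falling = samples[i + 1] - samples[i]
        if rising > 0 and falling <= 0 and samples[i] >= threshold:
            if best is None or samples[i] > samples[best]:
                best = i
    return best

# The bright laser stripe shows up as a sharp intensity peak in the line.
line = [0, 1, 2, 8, 40, 90, 41, 7, 2, 0]
print(peak_index(line))  # 5
```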
  • a state counter is provided by the circuits 75 and the circuits 76 are provided to monitor the state of the processing circuitry.
  • the circuit 75 records one of four states, and they are a reset state, a clear frame store state, an acquire pre-range data state and a done state.
  • the circuit 77 is enabled and maintains a count of the number of video frames up to a pre-set terminal count. This terminal count is set by link jumper 78.
  • the circuit 76 controls the operation of the state counter section of the processing circuitry.
  • The sync processor 42 receives the video signal from the input 58 and generates the synchronization signals needed to enable and reset memory address counters 80, 81, 82 and 83, together with blanking counters 84 and 85.
  • The video signal is decoded by decoder 86, which detects the onset of horizontal and vertical synchronization pulses and determines odd and even fields; transitions at the outputs are sharpened by the circuits 87.
  • The circuit 76 also assists in the control of the sync processor 42, and part of the circuit 76 thus forms part of the synchronization circuit 42.
  • Pixel clock generator 90 generates the clock signals needed to advance the address counters 80 to 83 and to sample the output from the peak detector circuit 40.
  • The pixel clock 90 controls the write signal to a frame store (to be described hereinafter) and also clocks the horizontal blanking counter 84.
  • The address counters 80 to 83, together with part of circuit 93, make up an address counter system and are enabled in synchronism with the raster scan of the camera 12 so that each address corresponds to a particular pixel in the camera's image.
  • The counters 80 to 83 generate the addresses used to index into the frame store (to be described hereinafter) during acquisition and during DMA transfer.
  • DMA read-only circuits 82 and 83 are used for addressing lookup table 52, which comprises circuits 100 and 101.
  • DMA write circuits 80 and 81 are also used, as are blanking counters 84 and 85.
  • The blanking counter 84 disables the clock to the address generators for 32 pixel clock times after the end of the horizontal sync pulse, and the blanking counter 85 does the same for 16 lines after the start of a frame.
  • Part of the circuit 93 is also used to control circuits 84 and 85.
  • The memory controller 49 is a PAL device programmed to issue chip enable and output enable signals to the memory 46 (which includes circuits 103 and 104) and to the lookup table circuits 100 and 101, based on several control signals.
  • The memory circuits 103 and 104 comprise a frame store which is indexed by the address counters 80 to 83. Since these counters are synchronised to the camera's raster, each memory location in the store maps to a unique pixel location in the camera's image frame. When a pixel in the camera is illuminated by the peak reflection of the laser stripe from an object 26, the peak detector circuit fires, causing a write to the frame store at the location corresponding to that pixel. The value stored in that location is the current value of the laser position counter produced by the position counter circuits 106.
  • The circuits 106 are connected to a buffer 107 so that output signals are presented to the frame store circuits 103 and 104 at the write time.
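The frame-store write path just described can be mimicked in a few lines (a hypothetical model: per-frame lists of peak positions stand in for the comparator firing during the raster scan):

```python
WIDTH, HEIGHT = 256, 256

def acquire(frames):
    """frames: iterable of (laser_count, peak_pixels) pairs, one per
    video frame, where peak_pixels lists the (x, y) raster positions at
    which the stripe's peak was detected during that frame."""
    store = [[0] * WIDTH for _ in range(HEIGHT)]
    for laser_count, peak_pixels in frames:
        for x, y in peak_pixels:
            # A detected peak writes the current laser position count
            # at the frame-store location that maps to that pixel.
            store[y][x] = laser_count
    return store

store = acquire([(1, [(10, 5)]), (2, [(12, 5)])])
print(store[5][10], store[5][12])  # 1 2
```

After a full sweep, each illuminated pixel location holds the laser angle count at which it saw the stripe, which is exactly the pre-range data the lookup stage needs.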
  • Circuits 100 and 101 form the lookup table, which is loaded from the host computer 30 (Figure 1) using a DMA write transfer.
  • The lookup table is preferably a 256 x 256 element array in which one index is a pixel X coordinate and the other is the corresponding laser position value.
  • These are derived from the low-order byte and the data output, respectively, of the frame store circuits 103 and 104.
  • The high- and low-order address bytes are sourced from the address counters 80 to 83.
  • Multiplexer circuits 109 are coupled to the lookup table circuits 100 and 101 because the lookup table address lines are sourced differently depending on the current operating state.
  • When the lookup table is written to from the host computer (during calibration of the system, which will be described hereinafter), the upper and lower address bytes are sourced from the address counters 80 to 83.
  • During a DMA read, the low-order address byte is sourced from the data output byte of the frame store, while the high-order address byte is sourced from the circuits 80 to 83.
  • The multiplexer 109 selects between these two sources depending on whether the DMA transfer operation is a read or a write.
  • The circuit 110 constitutes a status register: when the range acquisition process is complete, the done output of the circuit 76 is set, and this may be read by the host computer 30 as bit zero of the circuit 110.
  • The circuits 111 perform the task of address decoding and DMA transfer control.
  • One of the circuits is a comparator which compares the DIP switch settings with the host computer address lines, and the other is a PAL device.
  • The circuit 112 is a tri-state bi-directional buffer which transfers data during DMA read and write cycles.
  • The counters made up by the circuits 106 keep track of the position of the laser stripe 22 as it sweeps across the field of view 24, and the buffer 107 asserts the counter data onto the frame store memory bus whenever a peak is detected by the peak detector 40.
  • In order to initially set the range values in the lookup table, the system must be calibrated. Since in the preferred embodiment of the invention the lookup table is a 256 x 256 element array, 65,536 range values can be stored.
  • The field of view of the vision system is preferably in the order of 3 meters, and the number of range values stored for that field of view therefore provides sufficient information to enable the range of any object falling within the field of view to be obtained with a degree of accuracy sufficient to enable the robot to take evasive action during movement.
  • With reference to Figure 5, a calibration object 200 is arranged within the field of view of the camera 12.
  • The laser 14 is moved to scan the calibration object.
  • The host computer 30 preferably contains a program to enable the range values to be calculated for all of the 256 x 256 elements in the lookup table array, based on triangulation and on the angle of the laser 14 and the angle of detection of the reflected stripe 22 from the calibration object 200.
  • Some information is keyed into the computer 30: the dimensions of the calibration object and its distance from the camera 12.
  • Information relating to the limits of the field of view can also be entered, in terms of a position nearest the system and a position furthest from the system.
  • The nearest position can be in the order of 20 cm and the furthest in the order of 3 meters.
  • The range lookup table can be calculated in the following way. If we know θ and φ (which are provided by the counter 106 and the X pixel coordinate of the camera 12, as in Figure 2) and the length of the base line K, the distance to object P0 can be calculated. Since we actually have a counter value and a pixel location, we require a relationship between those values, the base line K and the distance F-P0. The following expression can be used, where:
  • IW is the camera's image plane width in pixels, which is preferably obtained by calibration and may differ from the camera manufacturer's specification;
  • B is the camera angle in radians;
  • V is the counter value of counter 106; and
  • U is the pixel location.
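The patent's exact expression does not survive in this extract, so the sketch below only illustrates the shape of the precalculation: it assumes the camera's optical axis is perpendicular to the base line, that pixel U maps linearly across the view angle B, and that counter value V maps linearly to the laser angle θ. All constants are illustrative, not the patent's.

```python
import math

IW = 256                        # image-plane width in pixels (calibrated)
B = math.radians(40)            # camera view angle (illustrative)
K = 0.3                         # base line length in metres (illustrative)
THETA0 = math.radians(30)       # laser angle at counter value 0 (illustrative)
THETA_STEP = math.radians(0.4)  # laser angle per count (illustrative)

def range_value(v, u):
    """Precalculated range for counter value v and pixel location u."""
    theta = THETA0 + v * THETA_STEP               # laser angle from V
    phi = math.pi / 2 - (u - IW / 2) * (B / IW)   # camera ray angle from U
    denom = math.sin(theta + phi)
    if denom <= 0:
        return float("inf")     # geometrically infeasible combination
    return K * math.sin(theta) / denom            # law of sines

# Fill all 256 x 256 elements, indexed [pixel U][counter V].
table = [[range_value(v, u) for v in range(256)] for u in range(256)]
```

The infeasible (V, U) combinations noted later in the document appear here naturally as entries where the triangle cannot close.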
  • A range acquisition cycle occurs as follows. When a start signal is issued from the host microcomputer 30, the state counters 75 are cleared; this is the reset state, or 00. The next vertical sync signal from the camera advances the state to 01. This state is decoded to enable the clear buffer (CLRBF) signal, which clears the laser position counter 106 to 0 and enables this value to be stored in the frame store 46.
  • The address counters 80 to 83 are made to advance during this time so that every location in the frame store is cleared in this way.
  • The next vertical sync signal advances the state counter to binary 10.
  • In this state, the frame store 46 is made to accept data from the laser position counter 106 whenever the peak detector circuit 40 detects the laser stripe's image in the video signal.
  • This state is maintained for a number of frames, this number being selectable as 64, 128 or 256.
  • This number is determined by the frame counter 77.
  • Counter 77 is cleared in state 2 and, in state 3, counts the number of vertical sync pulses.
  • Link 78 sets the terminal count by connecting the appropriate output from the counter to the state controller 76.
  • For each frame, the laser position counter 106 advances, and the address counters 80 to 83 are reset and then made to count through every address in the frame store 46, in synchronism with the camera's raster scan. Each address value therefore has a unique and corresponding pixel in the image plane of the camera.
  • When the peak detector 40 fires, the current value of the laser position counter is stored at the current address in the frame store.
  • The inputs to the lookup table are the X position of the stripe in the camera's image plane and the corresponding laser position value. Both are 8 bits wide, so there are 256 x 256 possible combinations, representing 65,536 possible range values (some of which are infeasible).
  • The lookup table has been precalculated, as described above, to include all of these values, and they are loaded into the lookup table store.
  • The frame store data values are presented, in succession, to the low-order byte of the lookup table 52, and the corresponding "X" values are presented to the high-order byte of the lookup table 52.
  • The lookup table value (range value) thus indexed is placed on the data bus to be read by the host computer 30.
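The readout addressing just described (X as high-order byte, frame-store value as low-order byte) can be sketched as follows (names are hypothetical, and a placeholder table stands in for the calibrated one):

```python
def read_ranges(frame_store_row, lut):
    """For each pixel X along a line, form the lookup-table address from
    X (high-order byte) and the stored laser position value (low-order
    byte), and fetch the corresponding range value."""
    ranges = []
    for x, laser_pos in enumerate(frame_store_row):
        address = (x << 8) | (laser_pos & 0xFF)
        ranges.append(lut[address])
    return ranges

lut = list(range(65536))        # placeholder 65,536-entry range table
row = [3, 7, 250]               # laser positions recorded at x = 0, 1, 2
print(read_ranges(row, lut))    # [3, 263, 762]
```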

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An object location system is disclosed which has particular application as a rangefinder for use in robotics. The system utilizes a laser (14) for outputting a beam and a camera (12) for monitoring the beam when reflected from objects. The laser (14) is moved by a motor and gearbox (18) so that the beam produced by the laser scans the desired area. The camera (12) is connected to a computer (30) which processes the image of the reflected beam and produces a first signal indicative of the angle of the beam from the laser and a second signal indicative of the angle at which the beam is detected. A lookup table is provided in the processing means for storing calculated range values relating to the field of view of the system so that the first and second signals can be fed into the lookup table to provide a range output value from the lookup table which will provide the range of an object.

Description

OBJECT LOCATION SYSTEM This invention relates to an object location system and, in particular, to a rangefinder for use in robotics where it is desired that sensory information regarding the shape and pose of objects and a distance to them be provided. This information is intended to allow for obstacle avoidance by mobile robots or remotely controlled vehicles.
Robot vision systems which provide rangefinding capability to enable a mobile robot to move and avoid objects have been proposed. However, many such systems are still very much in the experimental stage and are complicated and extremely expensive. In particular, most systems utilize triangulation to locate an object in the field of view of the vision system and require complicated computations in order to enable range values to be determined so that the robot is 'provided with an indication of where obstacles are located so that the robot or remote controlled vehicle can move around the obstacles without hitting them.
The object of this invention is to provide a relatively inexpensive object location system which quickly provides range information in respect of objects so that a robot or remote controlled vehicle or the like can automatically control itself to avoid the obstacle.
The invention may be said to reside in an object location system comprising signal output means for outputting a signal into a field of view containing an object at an angle to a predetermined base line, a detector for detecting the output signal in the field of view, processing means for producing a first signal indicative of the angle of the output signal and for providing a second signal indicative of the angle, with respect to the base line, at which the output signal is detected, a lookup table in said processing means for storing precalculated range values relating to the field of view of the system, such that said first and second signals can be fed to said lookup table to provide a range output value from the lookup table which will provide the range of the object.
The present invention may also be said to reside in an object location method comprising the steps of; outputting an output signal at predetermined known angles into a field of view containing an object and detecting the output signal in the field of view, providing a lookup table having range values relating to the field of view, using information relating to the angle of the output signal with respect to a base line and information relating to the angle of detection of the output signal with respect to the base line to select an appropriate range value from the lookup table to provide the range of the object in the field of view.
Since the present invention utilizes a lookup table, range values for objects which can be positioned in the field of view of the system can be stored in the lookup table. Thus, when an object is detected, information relating to the angle of the output signal and the angle of detection of the output signal need only be utilized to select an appropriate range value from the lookup table to provide the range to that object. Thus, the system does not require complicated computations to be made in order to calculate a position and the range value can be selected very quickly thereby providing considerable speed in the determination of a range value for an object in the field of view of the system.
In a preferred embodiment of the present invention, the lookup table can be provided which compensates for motion in the direction of observation. If the system is being moved at a constant speed, and this speed is known, a lookup table precalculated to that speed may be provided so that the range values which compensate for the movement of the system are provided.
In the above mentioned embodiment of the invention, a plurality of lookup tables could be stored, each having range values applicable to a particular constant speed of the system.
Preferably, the processing means provides for calibration to precalculate the range values for storage in the lookup table. Preferably, the lookup table is an element array and the detector is a camera having a plurality of pixels, said precalculated range values being stored in the element array such that one index of the element array is a pixel X coordinate of the camera which thereby provides an indication of the angle of detection of the output signal, and the other index of the element array is the corresponding angular position value of the output means thereby providing an indication of the angle of the output signal, a value written into the appropriate array element identified by those indexes containing a precalculated range value corresponding to the' said pixel X coordinate, and the corresponding output means angular position value, so that in use when an object is detected by the pixel at that X coordinate and with the output means at that angular position value, the range value can be merely read from the lookup table to provide the range to that object.
Preferably, the signal output means is a laser and the output signal is a beam of infra-red electromagnetic radiation.
A preferred embodiment of the invention will be described by way of example with reference to the accompanying drawings in which; Figure 1 is a schematic view of the system embodying the invention;
Figure 2 shows range find geometry relating to the system of Figure 1;
Figure 3 is a block diagram of the processing circuitry according to the preferred embodiment of the invention;
Figures 4 and 4A to 4F are a detailed view showing the processing circuitry; and
Figure 5 is a diagram relating to calibration of the system.
With reference to Figures 1 and 2, the system according to the preferred embodiment comprises a chassis 10 which supports a camera 12 and a laser 14. The chassis also includes a card 16 upon which motor scanning control circuitry is located, together with a motor and gearbox 18 for controlling movement of the laser 14. The motor and gearbox 18 are connected to the laser 14 by a connection rod 20 and are operable to smoothly move the laser in the direction of double headed arrow A so that a beam 22 from the laser scans a field of view 24 to locate objects 26 in the field of view.
The camera 12 is connected to a host computer 30 and, if desired, to a monitor 32 so that an image produced by the camera 12 can be viewed. In actual use of the system the monitor 32 would probably not be provided, and is generally only used for calibration purposes to be described hereinafter. The laser 14 may have a lens system for forming the laser beam into the stripe 22 or a fan beam etc.
With reference to Figure 2, the distance between the camera 12 and laser 14 is accurately known and is represented by the distance in Figure 2. The laser 14 is moved by the motor and gearbox 80 and connecting rod 20 so that the stripe 22 scans the field of view 24 so that stripe 22 on the objects in the field of view can be detected by the camera 12. The movement of the laser 14 is controlled so that the angle θ of the stripe 22 with respect to the base line between the camera and projector can be accurately determined and, in the preferred embodiment, is provided by a count number indicative of the angle through which the laser has moved. The angle φ of detection of the light beam on an object, such as point PO in Figure 2, is determined by the pixel upon which maximum intensity of the detected light beam occurs.
With reference to Figure 3 which shows a block diagram of the processing circuitry which will be described in further detail with reference to Figure 4, a video signal from camera 12 is passed through a peak detector 40 so that the stripe 22 on the objects 26 in Figure 1 can be readily identified. The video signal is also provided to a sync processor 42 which in turn is coupled to vertical and horizontal pixel counters 44 and 46. The peak detector and the vertical and horizontal pixel counters are connected to a memory store 46. The vertical pixel counter and the horizontal pixel counter are also connected to control circuitry 48 and DMA access controller 50. The horizontal pixel counter is also connected to a lookup table 52 as is the memory store 46. The control circuitry is also connected to the memory store 46 and the lookup table 52 and the DMA access controller are both connected to the host computer 30 (Figure 1). The lookup table is provided with range values for, in practical terms, all points in the field of view of the system. All the range values for all the points in the field of view are stored in the lookup table so that the lookup table can be addressed to find a particular range value merely from the fixed direction pixel coordinate from the camera 12 which detects the stripe 22 (which effectively gives the angle φ in Figure 2) and the current angle θ of the stripe 22. The memory 46 contains the stripe angle for each point XY where the stripe can be detected so the contents of the memory 46 and its X index can be used to drive the lookup table 52 to output range values.
In the preferred embodiment of the invention the table is wired into a plug-in card which carries the circuitry shown in Figure 4 and which is to reside in the computer 30, although in other embodiments the lookup table could be a software lookup table in the form of a two-dimensional array. With reference to Figure 4, the processing circuitry will be described in more detail. The video signal from the camera 12 is inputted at 58 and is received by the peak detector 40 which includes a comparator 62, a buffer 64 and a buffer 66. The video signal is buffered by the buffer 64 and passes through a differentiating circuit comprising capacitor 68 and resistors 69 and 70. The differentiated signal is supplied to the inverting input of the comparator 62. A threshold for the comparator appears at its non-inverting input and is set by trimmer circuit 71 and is buffered by buffer 66. The resistors 72 set the hysteresis of the comparator 62. The comparator output passes through a shaping buffer 73 to improve its rise time, and flip-flops 74 capture the positive going transitions of the comparator output and hold them until a pixel clock 90 accepts that information.
Thus, the comparator triggers on the zero crossing point of the signal provided to its inverting input and thereby provides an indication of the peak of the signal outputted from the camera. The peak, in turn, is indicative of the stripe 22 on the objects 26.
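The same peak-finding behaviour can be sketched in software: differentiate the scan line and trigger on the derivative's zero crossing, which coincides with the intensity maximum. This is a simplified model of the analogue circuit; the threshold parameter plays the role of the trimmer-set comparator level.

```python
def detect_peak(video_line, threshold=0.1):
    """Software analogue of the peak detector 40: differentiate the
    line, then trigger where the derivative falls through zero after
    having exceeded a positive threshold. The zero crossing of the
    derivative marks the intensity maximum (the stripe)."""
    d = [b - a for a, b in zip(video_line, video_line[1:])]
    for i in range(len(d) - 1):
        if d[i] > threshold and d[i + 1] <= 0:
            return i + 1          # pixel index of the peak
    return None                   # no stripe in this line
```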
A state counter is provided by the circuits 75, and the circuits 76 are provided to monitor the state of the processing circuitry. The circuit 75 records one of four states: a reset state, a clear frame store state, an acquire pre-range data state and a done state. In the acquire pre-range data state the circuit 77 is enabled and maintains a count of the number of video frames up to a pre-set terminal count. This terminal count is set by link jumper 78. The circuit 76 controls the operation of the state counter section of the processing circuitry.
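The four states and the selectable terminal counts might be summarised as follows (an illustrative model only; the hardware encodes the state in the two-bit counter 75):

```python
from enum import Enum

class State(Enum):
    RESET = 0b00              # state counters cleared
    CLEAR_FRAME_STORE = 0b01  # every frame store location zeroed
    ACQUIRE = 0b10            # laser position counts written on peaks
    DONE = 0b11               # results ready for transfer to the host

# Terminal frame counts selectable by the link jumper 78.
TERMINAL_COUNTS = (64, 128, 256)
```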
The sync processor 42 receives the video signal from the input 58 and generates the necessary synchronization signals to enable and reset memory address counters 80, 81, 82 and 83, together with blanking counters 84 and 85. The video signal is decoded by decoder 86, which detects the onset of horizontal and vertical synchronization pulses as well as determining odd and even fields, and transitions at the outputs are sharpened by the circuits 87. The circuit 76 also assists in the control of the sync processor 42, and part of the circuit 76 thus forms part of the synchronization circuit 42. Pixel clock generator 90 generates the necessary clock signals to advance the address counters 80 to 83 and to sample the output from the peak detector circuit 40. The pixel clock 90 controls the write signal to a frame store (to be described hereinafter) and also clocks the horizontal blanking counter 84.
The address counters 80 to 83, together with part of circuit 93, make up an address counter system and are enabled in synchronism with the raster scan of the camera 12 so that each address corresponds to a particular pixel in the camera's image. The counters 80 to 83 generate the addresses to index into the frame store (to be described hereinafter) during acquisition and during DMA transfer. During DMA read, only circuits 82 and 83 are used for addressing lookup table 52, comprised of circuits 100 and 101. During DMA write, circuits 80 and 81 are also used.
To ensure that the peak detector output from the circuit 40 is sampled where the video image is away from its boundaries, blanking counters 84 and 85 are used. The blanking counter 84 disables the clock to the address generators for 32 pixel clock times after the end of the horizontal sync pulse, and the blanking counter 85 does the same for 16 lines after the start of a frame. Part of the circuit 93 is also used to control circuits 84 and 85. The memory controller 49 is a PAL device programmed to issue chip enable and output enable signals to memory 46, which includes circuits 103 and 104, and also to the lookup table circuits 100 and 101, based on several control signals.
The memory circuits 103 and 104 comprise a frame store which is indexed by the address counters 80 to 83. Since these counters are synchronised to the camera's raster, each memory location in the store maps to a unique pixel location in the camera's image frame. When a pixel in the camera is illuminated by the peak reflection of the laser stripe from an object 26, the peak detector circuit fires, causing a write to the frame store at the location corresponding to that pixel. The value stored in that location is the current value of the laser position counter produced by the position counter circuits 106. The circuits 106 are connected to a buffer 107 so that output signals are presented to the frame store circuits 103 and 104 at the write time. Circuits 100 and 101 form the lookup table, which is loaded from the host computer 30 (Figure 1) using a DMA write transfer. The lookup table is preferably a 256 x 256 element array where one index is a pixel X coordinate and the other is the corresponding laser position value. During a DMA read transfer, these indices are derived from the low order byte and data output respectively of the frame store circuits 103 and 104. During a DMA write transfer, the high and low order address bytes are sourced from the address counters 80 to 83.
Multiplexer circuits 109 are coupled to the lookup table circuits 100 and 101 because the lookup table address lines are sourced differently depending on the current operating state. During a DMA write transfer, when the lookup table is written to from the host computer (during calibration of the system which will be described hereinafter) the upper and lower address bytes are sourced from the address counters 80 to 83. However, during a DMA read when the host computer is reading via the lookup table (i.e. when the system is actually in use on a robot or vehicle or the like) the low order address byte is sourced from the data output byte of the frame store, while the high order address byte is sourced from the circuits 80 to 83. The multiplexer 109 selects between these two sources depending on whether the DMA transfer operation is a read or a write.
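The address source selection performed by multiplexer 109 can be sketched as a small function (hypothetical names; byte widths as in the text):

```python
def lut_address(dma_write, addr_counters, pixel_x, frame_store_out):
    """Model of multiplexer 109's source selection.

    During a DMA write (calibration), both address bytes come from the
    address counters. During a DMA read (normal use), the frame store
    output supplies the low-order byte and the pixel X coordinate the
    high-order byte.
    """
    if dma_write:
        high, low = addr_counters
        return (high << 8) | low
    return (pixel_x << 8) | frame_store_out
```

During calibration the host thus writes range values at sequential addresses, while in use the frame store output itself forms part of the lookup address.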
The circuit 110 constitutes a status register so that when the range acquisition process is complete, the done output of the circuit 76 is set, and this may be read by the host computer 30 as bit zero of the circuit 110. The circuits 111 perform the tasks of address decoding and DMA transfer control. One of the circuits is a comparator which compares the dip switch settings with the host computer address lines, and the other is a PAL device. The circuit 112 is a tri-state bi-directional buffer which transfers data during DMA read and write cycles.
The counters made up by the circuits 106 keep track of the position of the laser stripe 22 as it sweeps across the field of view 24, and the buffer 107 asserts the counter data onto the frame store memory bus whenever a peak is detected by the peak detector 40.
In order to initially set the range values in the lookup table, the system must be calibrated. Since in the preferred embodiment of the invention the lookup table is a 256 x 256 element array, 65,536 range values can be stored. The field of view of the vision system is preferably in the order of 3 meters, and therefore the number of range values to be stored for that field of view provides sufficient information to enable the range of any object which may fall within the field of view to be obtained with a degree of accuracy sufficient to enable the robot to take evasive action during movement.
With reference to Figure 5, in order to calibrate the system to provide the range values, a calibration object 200 is arranged within the field of view of the camera 12. The laser 14 is moved to scan the calibration object.
The host computer 30 preferably contains a program to enable the range values to be calculated for all of the 256 x 256 elements in the lookup table array based on triangulation and the angle of the laser 14 and the angle of detection of the reflected stripe 22 from the calibration object 200.
In order to provide correct calibration, some information is keyed into the computer 30, and that information comprises the dimensions of the calibration object and its distance from the camera 12. Information relating to the limits of the field of view can also be inserted in terms of a position nearest the system, and a position furthest from the system. For example, the nearest position can be in the order of 20 cm and the furthest in the order of 3 meters.
The right and left hand ends of the calibration object 200 can then be detected, and the count from the circuits 106 (Figure 4), which gives an indication of the angular position of the laser 14 when detecting the right and left hand edges of the calibration object 200, can be obtained. This sets the limits of the count for a sweep of the laser 14 across the entire field of view. With reference to Figure 2, the range lookup table can be calculated in the following way. If θ and φ are known (they are provided by the counter 106 in Figure 4 and the X pixel coordinate of the camera 12 respectively), together with the length of the base line K, the distance to the object Po can be calculated. Since the system actually provides a counter value and a pixel location, a relationship is required between those values, the base line K and the distance FPo. The following expression can be used:
φ = π − B − arctan(g(U, IW))

where g(U, IW) is the argument of the arctan, a function of the pixel location U and the image plane width IW; IW is the camera's image plane width in pixels, which is preferably obtained by calibration and may be different from the camera manufacturer's specification, B is the camera angle in radians, V is the counter value of counter 106, and U is the pixel location.
Also, θ = δV + Y

and |FPo| = K sin(θ) / sin(φ − θ)

wherein δ, B, Y and K are constants which can also be determined by calibration. Thus, the values of U and V will directly give a value for the distance FPo.
For all possible U and V values, all possible range values based on the above equations can be calculated. Unconstrained, this would lead to an enormous lookup table, but since we only deal with 256 projection positions (in view of the 256 x 256 element array referred to above) and 256 pixel locations, the maximum number of entries is 65,536, which is quite manageable.
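A sketch of this precalculation follows. The form of the arctan argument is an assumption here (a simple pinhole model), since the exact expression appears in the original as an equation figure, and the constants B, Y, K, IW and the per-count step DELTA carry illustrative values only:

```python
import math

# Sketch of precalculating the 256 x 256 range lookup table from the
# triangulation relations above. The arctan argument uses an ASSUMED
# pinhole model; all calibration constants are illustrative.
IW = 256                            # image plane width in pixels
B = 0.5                             # camera angle in radians
Y = 0.3                             # laser angle offset in radians
DELTA = 0.002                       # assumed laser step per count, radians
K = 0.15                            # base line length in metres
FOCAL = IW / (2 * math.tan(B / 2))  # assumed pinhole focal length, pixels

def precalc_lookup_table():
    """Build table[V][U] = |FPo| for every counter value V and pixel U."""
    table = [[0.0] * 256 for _ in range(256)]
    for v in range(256):
        theta = DELTA * v + Y                 # theta = delta*V + Y
        for u in range(256):
            phi = math.pi - B - math.atan((u - IW / 2) / FOCAL)
            s = math.sin(phi - theta)
            if s > 1e-6 and 0.0 < theta < phi:
                table[v][u] = K * math.sin(theta) / s
            else:
                table[v][u] = 0.0             # infeasible combination
    return table
```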
According to the preferred embodiment of the invention, a range acquisition cycle occurs as follows. When a start signal is issued from the host microcomputer 30, the state counters 75 are cleared. This is the reset state, binary 00. The next vertical sync signal from the camera advances the state to binary 01. This state is decoded to enable the clear buffer or CLRBF signal. This clears the laser position counter 106 to 0 and enables this value to be stored in the frame store 46. The address counters 80 to 83 are made to advance during this time so that every location in the frame store is cleared in this way.
The next vertical sync signal advances the state counter to binary 10. In this state the frame store 46 is made to accept data from the laser position counter 106 whenever the peak detector circuit 40 detects the laser stripe's image in the video signal. This state is maintained for a number of frames, this number being selectable as 64, 128 or 256 and determined by the frame counter 77. Counter 77 is cleared in state 2 and, in state 3, counts the number of vertical sync pulses. Link 78 sets the terminal count by connecting the appropriate output from the counter to the state controller 76.
For each frame in this stage, the laser position counter 106 advances. Also, for each frame, the address counters 80 to 83 are reset and then made to count through every address in the frame store 46, in synchronism with the camera's raster scan. This means that each address value has a unique corresponding pixel in the image plane of the camera, so each address of the frame store corresponds to a pixel in the image plane. When the peak detector 40 fires, the current value of the laser position counter is stored at the current address in the frame store.
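The per-frame behaviour of the acquire state can be modelled as a short sketch (find_peak stands in for the hardware peak detector; all names are hypothetical):

```python
def acquire_frame(frame_store, laser_count, camera_frame, find_peak):
    """One frame of the acquire state: walk every line of the camera
    frame in raster order and, where the peak detector fires, store
    the current laser position count at the matching address."""
    for y, line in enumerate(camera_frame):
        x = find_peak(line)
        if x is not None:
            frame_store[y][x] = laser_count
```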
So, when a stripe is detected in the camera's video signal, the two pieces of information necessary for determining the range to an object are effectively stored: firstly, the angular position of the laser stripe, from counter 106, and secondly, the X position of the stripe in the camera's image plane. This last value is implicit in the address used to store the laser position in the frame store. By the end of this stage, the preliminary information required to determine the range, namely the laser position and the corresponding image pixel locations, is embodied in the frame store. The last state, binary 11, is the done state. This signals that the scan is complete and that the results may be transferred to the host computer via the lookup table. The inputs to the lookup table are the X position of the stripe in the camera's image plane and the corresponding laser position value. Both of these are 8 bits wide, so there are 256 x 256 possible combinations, representing 65,536 possible range values (some of which are infeasible). The lookup table has been pre-calculated, as described above, to include all of these values, and they are loaded into the lookup table store.
During DMA read by the host computer, the frame store data values are presented, in succession, to the low order byte of the lookup table 52 and the corresponding X values are presented to the high order byte of the lookup table 52. The lookup table value (range value) thus indexed is placed on the data bus to be read by the host computer 30.
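The read-out path might be modelled as follows, with the two address bytes represented as the two indices of a table indexed as table[count][x] (matching V and U above):

```python
def dma_read(frame_store, lookup_table):
    """Model of the DMA read-out: for each pixel address, the stored
    laser position count (low-order byte) and the pixel X coordinate
    (high-order byte) together index the lookup table, yielding a
    precalculated range for that pixel."""
    return [[lookup_table[row[x]][x] for x in range(len(row))]
            for row in frame_store]
```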
Since modifications within the spirit and scope of the invention may readily be effected by persons skilled within the art, it is to be understood that this invention is not limited to the particular embodiment described by way of example hereinabove.

Claims

THE CLAIMS DEFINING THE INVENTION ARE AS FOLLOWS:
1. An object location system comprising: signal output means for outputting a signal into a field of view containing an object, at an angle to a predetermined base line; a detector for detecting the output signal in the field of view; processing means for producing a first signal indicative of the angle of the output signal and for providing a second signal indicative of the angle, with respect to the base line, at which the output signal is detected; a lookup table in said processing means for storing precalculated range values relating to the field of view of the system; such that said first and second signals can be fed to said lookup table to provide a range output value from the lookup table which will provide the range of the object.
2. An object location method comprising the steps of: outputting an output signal at predetermined known angles into a field of view containing an object and detecting the output signal in the field of view, providing a lookup table having range values relating to the field of view, using information relating to the angle of the output signal with respect to a base line and information relating to the angle of detection of the output signal with respect to the base line to select an appropriate range value from the lookup table to provide the range of the object in the field of view.
3. The apparatus of claim 1 or the method of claim 2 wherein the lookup table compensates for motion in the direction of observation.
4. The apparatus of claim 1 or the method of claim 2 wherein a plurality of lookup tables are stored, each having range values applicable to a particular constant speed of the system.
5. The apparatus of claim 1 wherein the processing means provides for calibration to precalculate the range values for storage in the lookup table.
6. The apparatus of claim 1 or the method of claim 2 wherein the lookup table is an element array and the detector is a camera having a plurality of pixels, said precalculated range values being stored in the element array such that one index of the element array is a pixel X coordinate of the camera which thereby provides an indication of the angle of detection of the output signal, and the other index of the element array is the corresponding angular position value of the output means thereby providing an indication of the angle of the output signal, a value written into the appropriate array element identified by those indexes containing a precalculated range value corresponding to the said pixel X coordinate, and the corresponding output means angular position value, so that in use when an object is detected by the pixel at that X coordinate and with the output means at that angular position value, the range value can be merely read from the lookup table to provide the range to that object.
7. The apparatus of claim 6 or the method of claim 6 wherein the signal output means is a laser and the output signal is a beam of infra-red electromagnetic radiation.
PCT/AU1991/000589 1990-12-20 1991-12-20 Object location system Ceased WO1992011508A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU91115/91A AU657110B2 (en) 1990-12-20 1991-12-20 Object location system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AUPK398990 1990-12-20
AUPK3989 1990-12-20

Publications (1)

Publication Number Publication Date
WO1992011508A1 true WO1992011508A1 (en) 1992-07-09

Family

ID=3775149

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU1991/000589 Ceased WO1992011508A1 (en) 1990-12-20 1991-12-20 Object location system

Country Status (1)

Country Link
WO (1) WO1992011508A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3554646A (en) * 1969-01-28 1971-01-12 Gen Electric Optical distance gage
US4522492A (en) * 1981-07-10 1985-06-11 Canon Kabushiki Kaisha Distance measuring device
US4568182A (en) * 1981-12-22 1986-02-04 Summagraphics Corporation Optical system for determining the position of a cursor
AU6202086A (en) * 1985-09-26 1987-04-02 Unisearch Limited Robot vision & optical location systems
AU3893189A (en) * 1988-07-25 1990-01-25 Unisearch Limited Improvements in optical location systems

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0921091A3 (en) * 1997-12-05 1999-06-16 Grove U.S. LLC Smart tele-cylinder
EP0921093A3 (en) * 1997-12-05 1999-06-16 Grove U.S. LLC Luffing angle measurement system
US6473715B1 (en) 1997-12-05 2002-10-29 Grove U.S. L.L.C. Luffing angle measurement system
ES2152171A1 (en) * 1998-11-30 2001-01-16 Univ Madrid Carlos Iii 3D vision system with hardware processing of the video signal
FR2838514A1 (en) * 2002-04-10 2003-10-17 Bosch Gmbh Robert Motor vehicle parallactic telemetry system error correction method in which a distance measured using the system is compared with one independently determined using a relative velocity method so that corrections can be applied
EP1653251A3 (en) * 2004-10-29 2006-07-12 Deere & Company Method and system for obstacle detection
US7164118B2 (en) 2004-10-29 2007-01-16 Deere & Company Method and system for obstacle detection
US20200193624A1 (en) * 2018-12-13 2020-06-18 Zebra Technologies Corporation Method and apparatus for dimensioning objects

Similar Documents

Publication Publication Date Title
US7277187B2 (en) Overhead dimensioning system and method
US5513276A (en) Apparatus and method for three-dimensional perspective imaging of objects
US4272756A (en) Method of pattern recognition and apparatus thereof
EP1043642A2 (en) Robot system having image processing function
AU2002315499A1 (en) Overhead dimensioning system and method
JPH09187038A (en) 3D shape extraction device
US4486842A (en) Apparatus and procedure for locating three-dimensional objects packed in bulk for purposes of controlling a gripping terminal
EP0908846A2 (en) Moving object detection apparatus and method
WO1992011508A1 (en) Object location system
JP3991501B2 (en) 3D input device
AU657110B2 (en) Object location system
JP3237975B2 (en) Image processing device
US7158665B2 (en) Image processing device for stereo image processing
JP2002022425A (en) 3D image input device
JPH0617795B2 (en) Three-dimensional position measuring device
JP2002027502A (en) 3D image input device
JPS61241612A (en) Three-dimensional form measuring system
JPH0797020B2 (en) Coordinate measuring device
JPS5818110A (en) Three-dimensional measurement method
JPH09325019A (en) Three-dimensional measuring device
JP3800841B2 (en) Method and apparatus for measuring 3D shape and storage medium storing 3D shape measurement program
JP2000171222A (en) Method and device for three-dimensional input
JPH0749936B2 (en) Three-dimensional coordinate measuring device
JPS6214014A (en) Range measurement
JPH09325010A (en) Three-dimensional measuring device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU JP US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IT LU MC NL SE

122 Ep: pct application non-entry in european phase