
GB2564232A - A system for use in a vehicle - Google Patents


Info

Publication number
GB2564232A
Authority
GB
United Kingdom
Prior art keywords
sensors
target object
sensor
range data
vehicle
Prior art date
Legal status
Withdrawn
Application number
GB1807667.9A
Other versions
GB201807667D0 (en)
Inventor
George Hoare Edward
Bystrov Alex
Gashinova Marina
Shishanov Sergei
Cherniakov Mikhail
Current Assignee
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Publication of GB201807667D0
Publication of GB2564232A

Classifications

    • G01S13/931: Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S13/93: Radar or analogous systems specially adapted for anti-collision purposes
    • G01S13/88: Radar or analogous systems specially adapted for specific applications
    • G01S13/878: Combination of several spaced transmitters or receivers of known location for determining the position of a transponder or a reflector
    • G01S15/93: Sonar systems specially adapted for anti-collision purposes
    • G01S17/93: Lidar systems specially adapted for anti-collision purposes
    • G01S2013/466: Indirect determination of position data by trilateration, i.e. two antennas or two sensors determine separately the distance to a target, whereby with the knowledge of the baseline length, i.e. the distance between the antennas or sensors, the position data of the target is determined
    • G01S2013/93271: Sensor installation details in the front of the vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Acoustics & Sound (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

A system and method for use in a vehicle 10 for estimating a vertical height of a target object 16 in the vicinity of the vehicle. The system includes a receiver configured to receive range data from two or more on-board vehicle sensors 12a,b,c. Each sensor has a non-zero displacement in a vertical direction from each of the other sensors, and the range data for each sensor is indicative of the distance between a target reflection point on the target object and said sensor. The system includes a processor 18 configured to compare the range data from each of the sensors and to calculate a vertical distance between the target reflection point on the target object and a surface on which the target object is located based on the comparison of the range data so as to estimate the vertical height of the target object. The comparison of the range data comprises calculating two or more 2D geometrical shapes and determining the intersection points of the shapes.

Description

A SYSTEM FOR USE IN A VEHICLE
TECHNICAL FIELD
The present disclosure relates to a system for use in a vehicle and particularly, but not exclusively, to a system for estimating a vertical height of a target object in the vicinity of the vehicle. Aspects of the invention relate to a system, to a method, and to a vehicle.
BACKGROUND
In the drive towards modern vehicles becoming increasingly autonomous, it is important for a vehicle to have an accurate representation of its surroundings. For example, it is desirable for the vehicle to determine whether there are obstructions or obstacles in the likely path of the vehicle which, if they are encountered, may cause damage to the vehicle or undue discomfort to the passengers. One type of obstruction that is commonplace on highways and other roadways is potholes. If a vehicle encounters a pothole that is not anticipated, damage can be done to the vehicle, as well as to the vehicle contents, and the passengers may experience an uncomfortable jolt. Clearly, it is also important that obstacles such as pedestrians in the road are detected accurately and early so that appropriate action may be taken by the vehicle or its driver.
Typically, a vehicle includes sensor systems capable of measuring data relating to objects in the vicinity of the vehicle. Such sensor systems may include one or more of radar, camera, sonar, LIDAR and ultrasonic sensors. The basic parameters determined from measured sensor data may be the distance between the vehicle and the object, the angle between the object and a direction ahead of the vehicle, and the velocity of the object. Distance or range may be provided by direct or indirect measurement of the time of flight of energy that is transmitted from a sensor to the object and then received back. Angle may be determined by a mechanical scanning, multiple beam, phased array or signal processing method. Velocity may be determined via a Doppler shift method or range rate measurement.
These parameters may be determined from distinct parts of the sensor system and it can be costly and bulky to include all of the sensor equipment on the vehicle that is necessary to build up a complete image of the surroundings.
It is an aim of the present invention to address disadvantages associated with the prior art.
SUMMARY OF THE INVENTION
According to an aspect of the present invention there is provided a system for use in a vehicle for estimating a vertical height of a target object in the vicinity of the vehicle, the system comprising a receiver configured to receive range data from two or more on-board vehicle sensors. Each sensor has a non-zero displacement in a vertical direction from each of the other sensors, and the range data for each sensor is indicative of the distance between a target reflection point on the target object and said sensor. The system also comprises a processor configured to compare the range data from each of the sensors and to calculate a vertical distance between the target reflection point on the target object and a surface on which the target object is located based on the comparison of the range data so as to estimate the vertical height of the target object.
Current vehicle sensors provide measurements of the range, azimuthal angle and speed of a target object in the vicinity of a vehicle. By additionally making a measurement of the height (or vertical angle) of the target object, the capability of systems of the vehicle to build a three-dimensional composite description of a target is enhanced. In particular, the additional information provides the opportunity for detection, identification, classification, tracking and potential intent prediction of the target object. For example, target detection provides only information that something is there; target identification determines the probability that a valid target is present; target classification determines the type of object that is present (e.g. a low-height moving target with fast two-leg Doppler components); target tracking determines a historical track and predicted future track of the target object so as to indicate interception with a path of the vehicle; and target intent prediction determines a likelihood of collision (e.g. the target object is a small human running towards the path of the vehicle, therefore there is a high probability of collision).
The present invention is advantageous in that the height of a target object may be estimated using only range data from sensors, i.e. data to estimate the target object height may be obtained by using additional ranging sensors in preference to relatively expensive narrow beam, monopulse, mechanically scanned or phased array antennas.
The processor may comprise an electronic processor having an electrical input for receiving the range data from the two or more vehicle sensors, and an electronic memory device electrically coupled to the electronic processor and having instructions stored therein. The processor may be configured to access the memory device and execute the instructions stored therein such that it is operable to compare the range data from each of the sensors and to calculate a vertical distance between the target reflection point on the target object and a surface on which the target object is located based on the comparison of the range data so as to estimate the vertical height of the target object.
The processor may be configured to compare the range data by calculating two or more two-dimensional geometric shapes based on the range data.
The processor may be configured to compare the range data by calculating an intersection point between the calculated geometric shapes, the intersection point coinciding with the target reflection point on the target object.
The two or more calculated geometric shapes may be one or more of a circle, an ellipse, and a hyperbola.
At least one of the calculated geometric shapes may be a circle centred at the location of one of the sensors with a radius equal to the distance between said sensor location and the target reflection point on the target object according to the received range data.
At least one of the calculated geometric shapes may be an ellipse in which the locations of two of the two or more sensors correspond to foci of the ellipse.
At least one of the calculated geometric shapes may be a hyperbola in which the locations of two of the two or more sensors correspond to foci of the hyperbola.
The receiver may be configured to receive range data from at least three on-board vehicle sensors.
The processor may be configured to calculate an ellipse in which the locations of two of the three or more sensors correspond to foci of the ellipse and to calculate a circle in which another one of the three or more sensors corresponds to the centre of the circle.
The sensor whose location corresponds to the centre of the circle may be located between the two sensors whose locations correspond to the foci of the ellipse.
The processor may be configured to calculate a hyperbola in which the locations of two of the three or more sensors correspond to foci of the hyperbola and to calculate a circle in which another one of the three or more sensors corresponds to the centre of the circle.
The sensor whose location corresponds to the centre of the circle may be located between the two sensors whose locations correspond to the foci of the hyperbola.
The processor may be configured to calculate first and second hyperbolae in which the locations of two of the three or more sensors correspond to foci of the first and second hyperbolae, and wherein at least one of the foci of the first hyperbola may be different from at least one of the foci of the second hyperbola.
The processor may be configured to calculate a measured phase difference between measured phases associated with the range data received at two of the sensors.
The processor may be configured to calculate a reference vector, the reference vector being a matrix of reference phase difference values corresponding to various target object heights and various target object ranges.
The processor may be configured to calculate a difference between the measured phase difference and each of the reference phase difference values, and to select a minimum such phase difference.
The processor may be configured to estimate the target object height to be equal to the reference vector height corresponding to the selected minimum phase difference.
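The phase-comparison lookup described in the preceding paragraphs can be sketched as follows. This is a minimal illustration assuming NumPy, a two-way propagation model, sensors on a vertical line, and illustrative sensor heights, carrier frequency and candidate-grid values; none of these numbers come from the patent itself:

```python
import numpy as np

def build_reference_table(heights, ranges, h1, h2, wavelength):
    # Expected inter-sensor phase difference for each candidate target
    # height and ground range (the "reference vector" of the text),
    # assuming two-way propagation and sensors on a vertical line.
    H, R = np.meshgrid(heights, ranges, indexing="ij")
    r1 = np.hypot(R, H - h1)            # range from sensor 1
    r2 = np.hypot(R, H - h2)            # range from sensor 2
    dphi = 4.0 * np.pi * (r2 - r1) / wavelength
    return np.angle(np.exp(1j * dphi))  # wrap into (-pi, pi]

def estimate_height(measured_dphi, heights, ranges, h1, h2, wavelength):
    # Select the grid entry whose reference phase difference is closest
    # to the measured one, and return the corresponding height.
    table = build_reference_table(heights, ranges, h1, h2, wavelength)
    diff = np.abs(np.angle(np.exp(1j * (table - measured_dphi))))
    i, _ = np.unravel_index(np.argmin(diff), diff.shape)
    return heights[i]

heights = np.linspace(0.0, 1.0, 21)     # candidate target heights (m)
ranges = np.linspace(2.0, 20.0, 19)     # candidate ground ranges (m)
wl = 3.0e8 / 120e9                      # wavelength at 120 GHz (2.5 mm)
h1, h2 = 0.45, 0.45 + wl / 2.0          # sensors half a wavelength apart
# Simulate a measurement for a target 0.5 m high at 10 m ground range
measured = build_reference_table(heights, ranges, h1, h2, wl)[10, 8]
est = estimate_height(measured, heights, ranges, h1, h2, wl)
```

Note that because phase wraps modulo 2π, a single phase comparison is ambiguous over height in general; in practice the candidate grid would be constrained by the unambiguous range data, as the claims above suggest.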
The processor may be configured to determine the target reflection point on the target object based on the received range data.
At least one of the sensors may be at least one of a radar sensor, a sonar sensor, a LIDAR sensor and an ultrasonic sensor.
At least one of the sensors may be a radar sensor arranged to transmit signals (pulses) having a frequency of at least 120 GHz.
The distance between each pair of the sensors may be proportional to the wavelength of the waves transmitted by the sensors.
The distance between one or more pairs of the sensors may be substantially equal to half of the wavelength of the transmitted waves.
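As a numerical illustration of the half-wavelength spacing above (a sketch; the 120 GHz carrier is the example frequency used later in the description, not a claimed value):

```python
def half_wavelength_spacing(freq_hz):
    # lambda / 2 = c / (2 f): candidate sensor-pair spacing
    c = 3.0e8  # speed of light, m/s
    return c / (2.0 * freq_hz)

spacing = half_wavelength_spacing(120e9)  # 0.00125 m, i.e. 1.25 mm
```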
The received range data may comprise data relating to reflection points other than the target reflection point, and the processor may be configured to select the target reflection point from the other reflection points.
Selection of the target reflection point may be based at least partially on the angular position of each reflection point relative to one of the sensors.
According to another aspect of the invention there is provided a method for use in a vehicle for estimating the height of a target object in the vicinity of the vehicle. The method comprises receiving range data from two or more on-board vehicle sensors, each sensor having a non-zero displacement in a vertical direction from each of the other sensors, and the range data for each sensor being indicative of the distance between a target reflection point on the target object and said sensor. The method also comprises comparing the range data from each of the sensors, and calculating a vertical distance between the target reflection point on the target object and a surface on which the target object is located based on the comparison of the range data so as to estimate the vertical height of the target object.
According to another aspect of the invention there is provided a vehicle comprising a system as described above.
The vehicle may comprise two or more on-board vehicle sensors, each sensor having a non-zero displacement in a vertical direction from each of the other sensors, wherein the sensors are for transmitting a signal to the target object in the vicinity of the vehicle and for receiving a reflected signal of the transmitted signal from the target object.
According to yet another aspect of the invention there is provided a non-transitory, computer-readable storage medium storing instructions thereon that when executed by one or more processors causes the one or more processors to carry out the method described above.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1a is a side view of a vehicle including a system according to an embodiment of the invention, the system including a processor and a plurality of vehicle-mounted sensors, and Figure 1a also showing a target object in the vicinity of the vehicle;
Figure 1b is a front view of the vehicle of Figure 1a;
Figure 2 shows a schematic view of one of the sensors of Figure 1a measuring the range to a reflection point located on a circle centred at the sensor and having a radius equal to the measured range;
Figure 3a shows a schematic view of two of the sensors of Figure 1a each measuring the respective range to a reflection point in the vicinity of the vehicle, the reflection point being located at the intersection between two circles centred respectively at each of the two sensors, each circle having a radius equal to the respective measured ranges;
Figure 3b shows the schematic view of Figure 3a, showing the circumference of each of the circles having a width that is dependent on the bandwidth of the transmitted sensor signal;
Figure 3c shows the geometry of the arrangement depicted schematically in Figure 3a;
Figure 4a shows a schematic view of three of the sensors of Figure 1a each measuring the respective range to a reflection point in the vicinity of the vehicle, the reflection point being located at the intersection between a circle centred at one of the sensors and having a radius equal to the measured range by that sensor, and a hyperbola indicating a contour of constant range difference between the other two sensors;
Figure 4b shows the geometry of the arrangement depicted schematically in Figure 4a;
Figure 5a shows a schematic view of three of the sensors of Figure 1a each measuring the respective range to a reflection point in the vicinity of the vehicle, the reflection point being located at the intersection between a circle centred at one of the sensors and having a radius equal to the measured range by that sensor and an ellipse having foci at the other two sensors;
Figure 5b shows the geometry of the arrangement depicted schematically in Figure 5a;
Figure 6a shows a schematic view of three of the sensors of Figure 1a each measuring the respective range to a reflection point in the vicinity of the vehicle, the reflection point being located at the intersection between two hyperbolas, in which a first one of the hyperbolas indicates a contour of constant range difference between a first pair of the three sensors and a second one of the hyperbolas indicates a contour of constant range difference between a second pair of the three sensors;
Figure 6b shows the geometry of the arrangement depicted schematically in Figure 6a;
Figures 7a and 7b show a schematic view of two and three of the sensors, respectively, of Figure 1a each measuring the respective range to a reflection point in the vicinity of the vehicle, and indicating the additional parameter measurements used to reduce the number of false reflection points that are detected;
Figure 8 shows a plot of the necessary signal frequency for separating two reflection points with different height and equal range against the height of the reflection points from the sensors of Figure 1a for different values of the range between the sensors and reflection points;
Figure 9 shows the steps of a method carried out by the system of Figure 1a for determining the height of a target object in the vicinity of the vehicle;
Figure 10 shows a schematic view of two of the sensors of Figure 1a each measuring the respective range to a reflection point in the vicinity of the vehicle, and indicates additional parameters used to calculate the height of the target object;
Figure 11 shows the steps of another method carried out by the system of Figure 1a for determining the height of a target object in the vicinity of the vehicle; and
Figure 12 shows a schematic view of two of the sensors of Figure 1a each measuring the respective range to a reflection point in the vicinity of the vehicle, the reflection point being located at the intersection between two circles, an ellipse and a hyperbola, in which the circles are centred at each of the sensors and have a radius equal to the range measured by that sensor, the ellipse has foci at both sensors, and the hyperbola indicates a contour of constant range difference between the two sensors.
DETAILED DESCRIPTION
Modern vehicles contain many sensors designed to interrogate the area around the vehicle so as to assist the driver. These include radar and vision sensors designed to provide object detection to enable vehicle control systems or the driver to react to such objects. Traditional automotive radar sensors provide measurements of the range, azimuthal angle and speed of an object in the vicinity of a vehicle. This data may be used to provide high resolution, all-weather imaging for vehicle control and driver assistance. The height of a target object is not currently measured by automotive radars, but it is an additional parameter that would provide the information needed for 3D imaging. Target object height estimation would also provide additional ability to discriminate between different types of objects such as child or adult pedestrians, speed bumps, speed humps, potholes, rocks, fallen trees etc. The present invention relates to the estimation of the height of an object in the vicinity of a vehicle. In particular, two methods are described in which range data received by radar sensors is used to estimate the height of a target object, namely, range triangulation and phase comparison methods.
Range Triangulation Method
Range triangulation relies on using several sensors at different positions, each determining the range to an object, and using the individual range measurements in conjunction with the knowledge of the positions of the sensors relative to each other to calculate the position.
Figures 1a and 1b show one embodiment of a vehicle 10 having three on-board radar sensors 12a, 12b, 12c located at the front of the vehicle 10. Commonly, radar sensors are used to send and receive radar signals to collect sensor output data to be input to, for example, adaptive cruise control (ACC) systems. In an ACC system, the time between a radar signal being sent and then received back is measured, and then the time interval to a vehicle in front is calculated. This information may be sent to other systems of the vehicle (throttle control, brake control etc.) and the necessary action is taken to maintain a constant time interval to the vehicle in front. The radar sensors in an ACC system are typically able to detect an obstacle up to about 150 metres in front of the vehicle; other ACC systems may use shorter range, wider angle radars, or a combination of both.
The three sensors 12a, 12b, 12c are positioned in a vertical plane relative to a surface 14 over which the vehicle 10 travels. The distance between the lower two sensors 12a, 12b equals the distance between the upper two sensors 12b, 12c. The vertical distance between the surface 14 and each of the sensors 12a, 12b, 12c will be known.
The sensors 12a, 12b, 12c are configured to transmit waves from transmitting antennas, which propagate through the environment surrounding the vehicle 10 and which are scattered from objects and surfaces in the vicinity of the vehicle 10. The transmitting antennas are isotropic antennas, i.e. the waves propagate out in all directions. In the present embodiment, the propagated waves are low terahertz waves, for example at least 120 GHz. However, any appropriate values may be used. In particular, in the automotive industry the currently licensed bands for short-range radar are restricted to 21.65 - 26.65 GHz and 76 - 81 GHz, and so carrier waves in these frequency ranges may be used. Scattered waves from, in this case, the target object 16 propagate back to receiving antennas of the sensors 12a, 12b, 12c. The received waves provide range data relating to the target object 16, i.e. data relating to the distance between the vehicle 10 and the object 16.
The range data received by the sensors 12a, 12b, 12c is input into a processor 18 of the vehicle 10. The processor 18 then uses the range data to estimate the height of the object 16, as will be discussed in detail below.
The waves that are transmitted from the sensors 12a, 12b, 12c will be scattered from not only the target object 16, but also any other objects in the vicinity of the vehicle. In addition, the target object 16 will have several points from which propagated waves will reflect back to the sensors 12a, 12b, 12c. The processor 18 needs to both identify which received range data corresponds to the target object 16 and which received range data corresponds to a reflection point on the target object 16 from which an estimation of the height of the target object 16 may be made.
A distributed target such as the target object 16 usually has certain areas with a stronger reflection than others. These areas are called reflection points. A reflection point is a surface area that reflects the signal back in the direction of the sensor; often it resembles a corner reflector or a flat surface perpendicular to the direction of the electromagnetic wave. Each distributed target has several reflection points. If the resolution of the system is larger than or equal to the size of the target object, the time domain signal reflected from the object can be considered as one reflection which is the superposition of reflections from a number of single reflection points. If the resolution of the system is smaller than the size of the target, the reflected signal consists of reflections from a number of distinguishable reflection points, which can be used to estimate the size, and in particular the height, of the target. In this case the target has a range profile and can be identified by this profile. In the following, we initially consider detection of a single reflection point. We then move on to a consideration of one or more distributed targets (having several reflection points) and, in particular, consider how to select a particular reflection point that will enable an estimation of the height of the target object 16 to be made.
With reference to Figure 2, consider one of the range sensors 12b receiving a scattered wave from a reflection point 20 on an object in the vicinity of the vehicle 10 (such as the target object 16). A measurement of the distance between the sensor 12b and the reflection point 20 will produce the position of the reflection point 20 located anywhere on a circle 22 whose radius 23 is the measured range. In particular, the measurement is a time delay between a transmitted signal from the sensor 12b and a received signal reflected from the reflection point 20. So the position of the reflection point 20 in range is known but the position in angle is unknown.
Now with reference to Figure 3a, a measurement of the distance between another sensor 12a and the reflection point 20 will produce the position of the reflection point 20 located anywhere on a circle 24 whose radius 25 is the measured range. As mentioned above, the sensors 12a, 12b are positioned in a vertical plane (as shown in Figures 1a and 1b), and so it is noted that the vertical plane is in fact shown horizontally in Figure 3a. Combining the two circles 22, 24 with the distance between the sensors 12a, 12b will give the position of the reflection point 20 not only in range, but also in angle (described below), at the intersection of the circles 22, 24. It is noted that, as shown in Figure 3a, this will actually give two possible position points for the reflection point 20 with a separation of 180 degrees; however, it is likely that one of these points can easily be disregarded as not being physically sensible.
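In the vertical plane, the intersection of the two range circles can be found in closed form. The sketch below assumes the sensors lie on a vertical line at known heights above the surface; it keeps the physically sensible solution in front of the sensors, and all numerical values are illustrative rather than taken from the patent:

```python
import math

def reflection_point_position(r_a, r_b, h_a, h_b):
    # Intersect the circles x^2 + (z - h)^2 = r^2 written for each sensor;
    # subtracting the two equations is linear in the height z.
    # r_a, r_b: measured ranges (m); h_a, h_b: sensor heights (m), h_b > h_a.
    z = (r_a**2 - r_b**2 + h_b**2 - h_a**2) / (2.0 * (h_b - h_a))
    x_sq = r_a**2 - (z - h_a) ** 2
    if x_sq < 0.0:
        return None  # inconsistent ranges: the circles do not intersect
    # The mirror solution at -x (behind the sensors) is discarded here,
    # matching the point noted above as not physically sensible.
    return math.sqrt(x_sq), z

# Reflection point 0.5 m above the surface, 10 m ahead of the vehicle,
# observed by sensors mounted at 0.4 m and 0.8 m:
r_a = math.hypot(10.0, 0.5 - 0.4)
r_b = math.hypot(10.0, 0.5 - 0.8)
x, z = reflection_point_position(r_a, r_b, 0.4, 0.8)  # recovers (10.0, 0.5)
```

The returned `z` is the height of the reflection point above the surface, which is the quantity the system uses to estimate the target object height.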
The range triangulation method relies on using several sensors at different positions, each determining the range to an object, and using the individual range measurements in conjunction with the knowledge of the positions of the sensors relative to each other to calculate the position in range and angle. The relatively short duration of the transmitted electromagnetic signal means that it will be spread over a range of frequencies, and the range triangulation technique relies on the bandwidth of the transmitted signal, i.e. the spread of individual frequencies forming the content of the transmitted signal, to provide adequate range resolution. Figure 3b shows the diagram of Figure 3a, except that the circumference of each of the circles 22, 24 is shown as having a finite, non-zero thickness. The thickness of the circle circumferences varies with the bandwidth of the transmitted waves from the sensors 12a, 12b, which in turn sets an inherent limit on the accuracy of range measurement, and hence on the accuracy of the target position calculation. This is because it is not the exact position of the reflected point 20 that is calculated; rather, the determination that is made is that the reflection point 20 is located somewhere in the overlap between the circles 22, 24. Therefore, the thinner the circle circumferences, the more precise the location of the reflected point 20 from the range measurements. In particular, the circle circumference thickness (range resolution) is inversely proportional to the transmitted signal bandwidth: the higher the bandwidth, the thinner the circle circumferences, and so the more accurate the positional determination.
The bandwidth of the transmitted signal is predominantly proportional to its carrier frequency, and may be in the region of 5% of it. Therefore the near-THz frequencies used in some examples of the present invention, of approximately 120 GHz, would have a bandwidth of approximately 6 GHz. These higher frequencies, and therefore larger bandwidths, allow for greater range resolution and therefore more accurate triangulation. This can also remove the need for the more complex phase measurement method sometimes required for increased precision.
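The relation between bandwidth and range resolution can be checked numerically. The sketch below (Python, for illustration only) uses the standard monostatic range-resolution expression ΔR = c/2B, which is consistent with the inverse proportionality described above; the 5% fractional bandwidth is the figure quoted in the text.

```python
# Radar range resolution: delta_R = c / (2 * B) for a monostatic radar.
# Illustrates why a ~6 GHz bandwidth at a ~120 GHz carrier gives
# centimetre-level resolution.
C = 3.0e8  # speed of light, m/s


def range_resolution(bandwidth_hz: float) -> float:
    """Smallest separable range difference for a given signal bandwidth."""
    return C / (2.0 * bandwidth_hz)


carrier = 120e9
bandwidth = 0.05 * carrier           # ~5% fractional bandwidth -> 6 GHz
print(range_resolution(bandwidth))   # 0.025 m, i.e. 2.5 cm
```

With a 6 GHz bandwidth the resolvable range cell is 2.5 cm, which is why reflection points separated by a few centimetres in height can be distinguished.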
The geometry of the arrangement in Figure 3a is depicted schematically in Figure 3c. In particular, the positions of the two transceivers or sensors 12a, 12b are denoted 1 and 2, x1 and x2 are the abscissae of the transceiver positions, d is the distance between the first and the second transceiver positions (the baseline), Tg is the reflected point 20, xtg and ytg are the coordinates of the reflected point 20 (abscissa and ordinate), and R1 and R2 are the ranges between the first and the second transceiver positions and the reflected point, i.e. the radii 23, 25.
If the sensor 12a is assumed to be at the origin of the coordinate system then the coordinates of the reflected point 20 may be expressed as
xtg = (R1² + d² − R2²) / (2d), ytg = √(R1² − xtg²)
The angular position of the reflected point 20 may then be calculated using the values of xtg and ytg. For example, the angle θ between the (lowest) sensor 12a and the reflected point 20 relative to the horizontal axis y is given by

θ = arctan(xtg / ytg)

That is, the reflected point 20 is inclined at the angle θ from the sensor 12a.
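The 'circle and circle' triangulation above can be sketched in a few lines. This is an illustrative Python implementation of the two equations just given; the sensor spacing and target coordinates below are arbitrary example values, not figures from the description.

```python
import math


def circle_circle(r1: float, r2: float, d: float):
    """Triangulate a reflection point from two range measurements.

    Sensor 12a is at the origin and sensor 12b at distance d along the
    (vertical) x axis.  Returns (x_tg, y_tg) with y_tg >= 0; the mirror
    solution with y_tg < 0 is the one usually disregarded as not
    physically sensible.
    """
    x_tg = (r1 ** 2 + d ** 2 - r2 ** 2) / (2.0 * d)
    y_tg = math.sqrt(max(r1 ** 2 - x_tg ** 2, 0.0))
    return x_tg, y_tg


def inclination(x_tg: float, y_tg: float) -> float:
    """Angle theta from sensor 12a to the point, measured from the y axis."""
    return math.atan2(x_tg, y_tg)


# Synthetic check: a point at (0.4, 3.0) relative to sensor 12a.
d = 0.15
r1 = math.hypot(0.4, 3.0)        # range measured by sensor 12a
r2 = math.hypot(0.4 - d, 3.0)    # range measured by sensor 12b
print(circle_circle(r1, r2, d))  # ~(0.4, 3.0)
```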
Figures 3a and 3b describe how the range data from two sensors 12a, 12b is treated as the intersection of two circles 22, 24, with each measured range providing the radius of the corresponding circle (accurate to a finite circle-circumference width in dependence on signal bandwidth). This is referred to as the ‘circle and circle’ method. As shown in Figures 1a and 1b, three sensors 12a, 12b, 12c may be used to estimate the coordinates of the reflected point 20. In particular, methods using different geometric shapes such as ellipses and hyperbolae may be used in addition to, or instead of, circles to estimate the coordinates of the reflected point 20.
An ellipse may be described by using two sensor positions as the foci of the ellipse, with the received range data for each sensor being used to describe the ellipse on which the reflected point lies.
Similarly, a hyperbola may be described by using two sensor positions as the foci of the hyperbola, with the received range data for each sensor being used to describe the hyperbola on which the reflected point lies. In particular, the hyperbola is a line of constant range difference between the received range data for each sensor.
Note that, while the described embodiment uses three sensors 12a, 12b, 12c, it is possible to use methods using ellipses and hyperbolae with only two sensors.
The described embodiment uses combinations of circles, ellipses and hyperbolae, superimposed on one another, to estimate the coordinates of the reflected point 20 in range and angle. Examples of methods using range data from the three sensors 12a, 12b, 12c to estimate the coordinates, i.e. range and angular position, of the reflected point 20 are described with reference to Figures 4 to 6.
Figures 4a and 4b show the use of a circle 22 and a hyperbola 26 defined using the received range data from the sensors 12a, 12b, 12c to estimate the coordinates of the reflected point 20. This is referred to as the ‘circle and hyperbola’ method. In particular, measurement of the distance between the (centre) sensor 12b and reflection point 20 defines the position of the reflection point 20 as being located anywhere on the circle 22 with radius R0. Also, the difference between the range measurement from the sensor 12a to the reflected point 20, R1, and the range measurement from the sensor
12c to the reflected point 20, R2, defines the position of the reflection point 20 as being located anywhere on the hyperbola 26 with foci at the sensors 12a, 12c. That is, the value ΔR = R1 − R2 is defined.
Using the general equations for a circle and a hyperbola, the coordinates of the reflection point 20 in Figures 4a and 4b may be expressed as follows:
xtg = ±(ΔR / 4d)·√(4(R0² + d²) − ΔR²), ytg = √(R0² − xtg²)

(taking the sensor 12b at the origin and the sensors 12a, 12c at ±d along the baseline, with the sign of xtg resolved by the half-plane in which the target lies, as described below). Figures 5a and 5b show the use of a circle 22 and an ellipse 28 defined using the received range data from the sensors 12a, 12b, 12c to estimate the coordinates of the reflected point 20. This is referred to as the ‘circle and ellipse’ method. In particular, measurement of the distance between the (centre) sensor 12b and reflection point 20 defines the position of the reflection point 20 as being located anywhere on the circle 22 with radius R0. Also, the sum of the range measurement from the sensor 12a to the reflected point 20, R1, and the range measurement from the sensor 12c to the reflected point 20, R2, defines the position of the reflection point 20 as being located anywhere on the ellipse 28 with foci at the sensors 12a, 12c. That is, the value ΣR = R1 + R2 is defined.
Using the general equations for a circle and an ellipse, the coordinates of the reflection point 20 in Figures 5a and 5b may be expressed as follows:
xtg = ±(ΣR / 4d)·√(4(R0² + d²) − ΣR²), ytg = √(R0² − xtg²)

(again taking the sensor 12b at the origin and the sensors 12a, 12c at ±d along the baseline). Figures 6a and 6b show the use of two hyperbolas 26a, 26b defined using the received range data from the sensors 12a, 12b, 12c to estimate the coordinates of the reflected point 20. This is referred to as the ‘two hyperbolas’ method. In particular, the difference between the range measurement from the sensor 12a to the reflected point
20, R1, and the range measurement from the sensor 12b to the reflected point 20, R0, defines the position of the reflection point 20 as being located anywhere on the hyperbola 26a with foci at the sensors 12a, 12b. That is, the value ΔR1 = R1 − R0 is defined. Also, the difference between the range measurement from the sensor 12c to the reflected point 20, R2, and the range measurement from the sensor 12b to the reflected point 20, R0, defines the position of the reflection point 20 as being located anywhere on the hyperbola 26b with foci at the sensors 12b, 12c. That is, the value ΔR2 = R2 − R0 is defined.
Using the general equation for a hyperbola, the coordinates of the reflection point 20 in Figures 6a and 6b may be expressed as follows:
xtg = (ΔR2 − ΔR1)·(d² + ΔR1·ΔR2) / (2d·(ΔR1 + ΔR2))

ytg = √(R0² − xtg²), where R0 = (2d² − ΔR1² − ΔR2²) / (2(ΔR1 + ΔR2))

(taking the sensor 12b at the origin, with the sensor 12a at +d and the sensor 12c at −d along the baseline).
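The three closed-form methods may be sketched as below. This Python illustration assumes the convention used in the reconstructed equations above, namely sensor 12b at the origin with sensors 12a and 12c at +d and −d along the (vertical) baseline; the spacing and target position are arbitrary example values.

```python
import math

D = 0.15  # half-baseline: sensors 12a, 12b, 12c at x = +D, 0, -D (x vertical)


def circle_hyperbola(r0, r1, r2, d=D):
    """'Circle and hyperbola': circle of radius r0 about the centre sensor
    12b, hyperbola with foci at sensors 12a (+d) and 12c (-d).
    With 12a at +d the textual +/- resolves to the sign of -(r1 - r2)."""
    dr = r1 - r2
    x = -dr * math.sqrt(4 * (r0 ** 2 + d ** 2) - dr ** 2) / (4 * d)
    y = math.sqrt(max(r0 ** 2 - x ** 2, 0.0))
    return x, y


def circle_ellipse(r0, r1, r2, d=D):
    """'Circle and ellipse': the magnitude of x comes from the sum
    r1 + r2; the sign is ambiguous and is resolved here, for the demo,
    from the sign of r1 - r2 (the half-plane rule)."""
    sr = r1 + r2
    x = sr * math.sqrt(max(4 * (r0 ** 2 + d ** 2) - sr ** 2, 0.0)) / (4 * d)
    x = math.copysign(x, -(r1 - r2))
    y = math.sqrt(max(r0 ** 2 - x ** 2, 0.0))
    return x, y


def two_hyperbolas(r0, r1, r2, d=D):
    """'Two hyperbolas': uses the range differences to the centre sensor."""
    dr1, dr2 = r1 - r0, r2 - r0
    x = (dr2 - dr1) * (d ** 2 + dr1 * dr2) / (2 * d * (dr1 + dr2))
    y = math.sqrt(max(r0 ** 2 - x ** 2, 0.0))
    return x, y


# Synthetic check: target at x = 0.05 m (vertical), y = 2.0 m (range).
xt, yt = 0.05, 2.0
r0 = math.hypot(xt, yt)          # range to centre sensor 12b
r1 = math.hypot(xt - D, yt)      # range to sensor 12a at +D
r2 = math.hypot(xt + D, yt)      # range to sensor 12c at -D
for method in (circle_hyperbola, circle_ellipse, two_hyperbolas):
    print(method.__name__, method(r0, r1, r2))  # all ~(0.05, 2.0)
```

All three methods recover the same synthetic target position, which is how the reconstructed expressions can be cross-checked against one another.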
It will be appreciated that the x coordinate in each of the above cases is a vertical coordinate providing an estimation of the vertical displacement between the reflection point 20 and the sensor 12a. As the vertical displacement between the sensor 12a and the surface 14 is known, the vertical height of the reflection point 20 may be calculated, for example by summing the calculated xtg value and the known vertical displacement between the sensor 12a and the surface 14.
The above description relating to Figures 2 to 6 describes the case of data relating to only a single reflection point 20 being received at the sensors 12a, 12b, 12c. As mentioned previously, data will be received relating to different reflection points on a single object as well as reflection points from different objects.
In some examples of the invention the methods described above can be combined in multiple ways. It is possible to combine all three geometries in a single method, and a single sensor can be used to define multiple geometries. In Figure 12, for example, sensors 12a and 12b form the foci for their own respective circles 24, 22, a hyperbola 26 and an ellipse 28. The more sensors there are within the array, the greater the number of combinations of foci for generating different geometries, and thus the greater the accuracy of the system.
When several objects are present in the vicinity of the vehicle 10, it is possible for incorrect detection of reflection points to be made. This is caused by crossing of position lines corresponding to different objects. The quantity of false or incorrect detections or measurements rises sharply with an increasing number of objects. The possible ways to exclude false measurements are, firstly, to increase the number of sensors and, secondly, to use the different methods described above, but measure the reflection point coordinates from the same positions. When the three sensors 12a, 12b, 12c are used to measure distance to two or more reflection points, the coordinates of the reflection points are estimated using range measurements from the two edge sensors 12a, 12c. The centre sensor 12b is then used to select the reflection point located at the correct distances from this sensor 12b.
In a multi-object environment each method of coordinate estimation detects real and false reflection points. Two point vectors are defined by the following:

G1 = [(x1(1), y1(1)), (x2(1), y2(1)), …, (xN1(1), yN1(1))]

G2 = [(x1(2), y1(2)), (x2(2), y2(2)), …, (xN2(2), yN2(2))]
The first vector is the result of the ‘circle and circle’ method and the second vector is the result of the ‘circle and hyperbola’ method.
Reflection points are located on the intersections of uncertainty areas (see Figure 3b). In order to separate real and false objects it is necessary to choose all combinations of elements of the first and the second vector and calculate the Euclidean distances between them:
dij = √((xi(1) − xj(2))² + (yi(1) − yj(2))²).
In the above equation, i is the point number from the first vector and j is the point number from the second vector. The real reflection points correspond to the vector elements of G1 (G2) that satisfy dij < Rij, where Rij is the radius of uncertainty of the corresponding intersection area.
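The real/false separation criterion may be sketched as follows. This is an illustrative Python version in which the candidate coordinates and the uncertainty radius are made-up example values, and a single scalar uncertainty radius is assumed in place of a per-intersection Rij:

```python
import math


def real_points(g1, g2, uncertainty_radius):
    """Keep points from g1 that lie within uncertainty_radius of some
    point in g2, i.e. candidates confirmed by both estimation methods.

    g1, g2: lists of (x, y) candidate coordinates produced by two
    different methods (e.g. 'circle and circle' vs 'circle and
    hyperbola').
    """
    confirmed = []
    for (x1, y1) in g1:
        for (x2, y2) in g2:
            if math.hypot(x1 - x2, y1 - y2) < uncertainty_radius:
                confirmed.append((x1, y1))
                break
    return confirmed


g1 = [(0.05, 2.0), (0.90, 1.10)]          # second entry is a false crossing
g2 = [(0.051, 2.002), (-0.40, 3.30)]
print(real_points(g1, g2, 0.05))          # [(0.05, 2.0)]
```

Only the candidate that both methods place at (nearly) the same coordinates survives; the false crossing appears in one vector but not the other.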
A method of reducing the number of detected false reflection points is described with reference to Figure 7. An increase in the number of objects in the vicinity of the vehicle 10 results in a sharp increase in the dimensions of vectors G1 and G2. In the ‘circle and circle’ method the dimension of G1 is equal to N², and in the ‘circle and hyperbola’ method the dimension of G2 is equal to 2N³, where N is the number of reflection points identified by the centre sensor 12b.
For each pair of primary measured parameters ΔR = R1 − R2 and R0, the ‘circle and hyperbola’ method gives two coordinate estimations, where R0, R1 and R2 are the linear distances from the reflection point 20 to the sensors 12b, 12a, 12c respectively. If the value ΔR = R1 − R2 is greater than zero, then the object is located in the right half-plane (as viewed in Figure 7); if less than zero, then the object is located in the left half-plane. By selecting the reflection point located in the correct half-plane, the number of false reflection points can be reduced by half.
When calculating the coordinates using the ‘circle and circle’ method, if the distance to the reflection point from the sensor 12a is R1, then the distance to the same object from the sensor 12c will be in the range ring limited by the maximum and minimum possible ranges Rmax and Rmin as shown in Figure 7. These are defined as the distances from the sensor 12c to the points of intersection of the circle of radius R1 with the edges of the detection zone.
Similarly, when calculating the coordinates using the ‘circle and hyperbola’ method, if the distance to the reflection point from the (centre) sensor 12b is R0, the distances to the same reflection point from the (edge) sensors 12a, 12c will be in the range rings limited by the maximum and minimum possible ranges Rmax1 and Rmin1, Rmax2 and Rmin2. These are defined as the distances from the sensors 12a, 12c to the points of intersection of the circle of radius R0 with the edges of the detection zone.
As a result of the above assumptions it is possible to reduce the dimensions of the vectors G1 and G2, as only coordinates lying in the respective rings need be considered when calculating the coordinates of a reflection point.
Once vectors G1 and G2 are obtained, the difference ΔR = R1 − R2 determines the direction, i.e. the angle, to the reflection point. By breaking up the detection areas of the sensors into azimuthal strips, the widths of which depend on a priori information on the error of range measurement, the area in which the reflection point is located can be localised. Thus the dimensions of the vectors G1 and G2 can be reduced.
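A minimal form of this kind of gating can be illustrated with the triangle inequality: two range measurements can only belong to the same reflection point if their difference does not exceed the separation between the sensors that made them. This is a simplification of the detection-zone-based range rings described above, with arbitrary example numbers:

```python
def consistent_pair(r1: float, r2: float, baseline: float) -> bool:
    """Necessary condition for two range measurements to belong to the
    same reflection point: their difference cannot exceed the sensor
    separation (triangle inequality).  Pairs failing this test need not
    be considered when forming the candidate vectors G1 and G2."""
    return abs(r1 - r2) <= baseline


# Sensor separation 0.30 m; the first pair is plausible, the second not.
pairs = [(2.0025, 2.0100), (2.0025, 3.4000)]
print([consistent_pair(r1, r2, 0.30) for r1, r2 in pairs])  # [True, False]
```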
As mentioned above, the target object 16 may have several reflection points from which scattered waves are reflected back to the sensors 12a, 12b, 12c if the bandwidth of the transmitted waves is sufficiently high. This is because the target object 16 has dimensions larger than the range and angle resolution of the sensors 12a, 12b, 12c.
Figure 8 shows the necessary signal frequency for separating two reflection or target points with different height and equal range. In Figure 8, the line labelled 1 is the necessary signal frequency for a range between the target and sensor of about 20m, the line labelled 2 is the necessary signal frequency for a range of about 50m, and the line labelled 3 is the necessary signal frequency for a range of about 100m. In particular, the number of reflection points depends on the target shape, the wavelength, the range and angle resolutions, and the observation angle.
When the target object 16 has a plurality of reflection points 20, the height of each reflection point 20 may be calculated using the methods described above with reference to Figures 3 to 6. The height of the reflection point having the greatest distance from the surface over which the vehicle 10 travels may be taken to be the height of the object 16. This reflection point is identified using the calculated angle for each of the identified reflection points. For example, the reflection point to be used to estimate object height may be the reflection point having the greatest angle of inclination from one of the sensors, assuming all reflection points are from a single target.
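The selection step can be sketched as below (illustrative Python; it assumes the confirmed points are given as (x, y) pairs with x the vertical coordinate relative to sensor 12a, and that the vertical offset of sensor 12a above the surface is known):

```python
def object_height(points, sensor_height: float) -> float:
    """Estimate object height as the vertical coordinate of the highest
    confirmed reflection point plus the known height of sensor 12a above
    the surface (all points assumed to belong to a single target)."""
    x_top = max(x for x, _ in points)
    return sensor_height + x_top


# Three reflection points on one target; sensor 12a is 0.40 m above ground.
points = [(0.02, 2.0), (0.11, 2.0), (-0.05, 2.0)]
print(object_height(points, 0.40))  # 0.51
```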
Figure 9 summarises a method 40 carried out by the processor 18 to estimate the height of the target object 16. As mentioned above, at step 42 the processor 18 receives range data from the sensors 12a, 12b, 12c indicative of the distance between the sensors and reflection points on objects (such as the target object 16) from which propagated waves are reflected.
For each of the detected reflection points, at step 44 the processor 18 uses the received range data at each of the sensors 12a, 12b, 12c to define geometric shapes (e.g. circle, ellipse, hyperbola) associated with the reflection point, as described above.
At step 46, the processor 18 determines the coordinates of each of the reflection points on a suitably defined coordinate system. In particular, this corresponds to the intersection of two or more of the calculated geometric shapes associated with a given reflection point, the intersection being calculated as described above.
At step 48, the processor 18 eliminates those reflection points relating to ‘fake’ objects. For example, the ‘real’ reflections points can be determined by applying two or more of the ‘circle and circle’, ‘circle and ellipse’ and ‘two hyperbolas’ methods (or any other suitable method) to the range data of a particular reflection point, and comparing the results of both methods, as described above. Also at step 48, the processor 18 selects which of a plurality of reflection points identified for a particular target object should be used to estimate the height of the target object. This selection is based on the determined coordinates or relative angular position of each reflection point, as described above.
Finally, at step 50 the height of the selected reflection point from the ground or surface is calculated. As mentioned above, this may be a simple summation of the calculated vertical coordinate of the selected reflection point with a known vertical displacement of one of the sensors from the surface over which the vehicle travels.
In the above-described embodiment, there are three radar sensors provided; however, any suitable number of sensors may be used, in particular at least two sensors.
The above embodiment describes the use of radar sensors to collect range data; however, any suitable type of sensor may be used, for example, sonar, LIDAR or ultrasonic sensors may be used.
In the above-described embodiment, the radar sensors 12a, 12b, 12c are positioned in a vertical plane relative to a surface over which the vehicle travels; however, this need not be the case. The sensors need only be positioned such that there is a non-zero vertical displacement between them.
In the above-described embodiment, the sensors 12a, 12b, 12c are spaced apart equidistantly; however, the distance and/or vertical displacement between a first pair of the sensors need not be equal to the distance between a second pair of the sensors.
The sensors 12a, 12b, 12c in the above embodiment are positioned at the front of the vehicle 10; however, sensors may alternatively or additionally be positioned at any suitable position on the vehicle 10, e.g. the side, rear or top, to estimate the height of an object located anywhere in the vicinity of the vehicle 10.
In practice, it is possible to use either individual transmit and receive devices or to share a common antenna.
Phase Comparison Method
Like the range triangulation method described above, the phase comparison method relies on using several sensors at different positions. In the phase comparison method, the phase difference between waves received at respective sensors that have been reflected from a target object is used in conjunction with the knowledge of the positions of the sensors relative to each other to calculate the position. In particular, the different path lengths travelled by the received signals results in a difference in phase between the two received signals. This difference is a function of the path length travelled.
Figure 10 shows a schematic arrangement for use when determining the height or vertical distance hn of the reflected point 20 from the surface 14. In particular, two sensors 12a, 12b are needed and, specifically, a signal of known relative phase is transmitted and the reflected waves are then received at the two sensors 12a, 12b and compared in order to determine object height. The sensor 12a is at a height H above the surface 14, and the sensor 12b is at a height d above the sensor 12a. The sensors 12a, 12b are a horizontal distance Ln from the reflection point 20, the sensor 12b is a linear distance R1 from the reflection point 20, and the sensor 12a is a linear distance R2 from the reflection point 20.
The values of d and H are fixed and known, the values of R1 and R2 are measured, and Ln and hn are unknown values to be calculated. For example, the distance d may be substantially equal to half of the wavelength of the transmitted signal.
Figure 11 shows the steps involved in the method 60 carried out by the processor 18 to determine the height of the target object 16 using the phase comparison method. At step 62, data reflected from the reflection point 20 is received by the sensors 12a, 12b. Similarly to the range triangulation method described above, as there is no angular information in the received signals, each range measurement can correspond to a point anywhere on a circle of radius R1 or R2, centred at the location of the corresponding sensor 12b or 12a, respectively.
The measured distances R1 and R2 may also be expressed as

R1 = √((H + d − hn)² + Ln²), R2 = √((H − hn)² + Ln²)
The phase of a signal reflected from the n-th range circle in a first sensor position may be expressed as:
φn = φrnd + φtg.tr + φtg.rc + φnoise, where:

φrnd – random initial phase;
φtg.tr – phase due to signal propagation from the transmitter to the target;
φtg.rc – phase due to signal propagation from the target to the receiver;
φnoise – phase caused by the presence of noise.
The signal phases caused by propagation may be expressed as:

φtg.tr = 2πR / λ, φtg.rc = 2πR / λ

At step 64, the difference in phase between the received signals is calculated. Mathematically, this involves multiplying one of the signals by the complex conjugate of the other. In particular, this may be expressed as
Ȧ1 · Ȧ2* = A1e^(jφ1) · A2e^(−jφ2) = A1·A2·e^(j(φ1 − φ2)) = A1·A2·e^(jΔφ)

where Ȧ1 and Ȧ2 are the complex signal samples received by the first and second sensors 12a, 12b, * denotes complex conjugation, A1 and A2 are the moduli of the complex signal samples of the first and second sensors 12a, 12b, and φ1 and φ2 are the phases of the complex signal samples of the first and second sensors 12a, 12b.
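The conjugate-multiplication step may be illustrated directly with Python's complex arithmetic (the sample amplitudes and phases below are arbitrary example values):

```python
import cmath


def phase_difference(a1: complex, a2: complex) -> float:
    """Phase difference between two complex signal samples, obtained by
    multiplying one sample by the complex conjugate of the other and
    taking the argument of the product."""
    return cmath.phase(a1 * a2.conjugate())


a1 = cmath.rect(1.0, 0.8)        # unit-amplitude sample with phase 0.8 rad
a2 = cmath.rect(0.5, 0.3)        # sample with phase 0.3 rad
print(phase_difference(a1, a2))  # ~0.5
```

Note that the amplitudes cancel out of the phase: only the argument of the product, φ1 − φ2, survives.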
The resulting phase difference up to a constant (random initial phase) depends on the distance to the target. In the case of one-dimensional phase retrieval the so-called Itoh algorithm may be used. For the case of two-dimensional phase retrieval the following algorithms can be used: Goldstein branch cut algorithm; Mask cut algorithm; or, Minimum norm methods (e.g. the method of least squares). Each of these methods will be well-known to the skilled person.
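For the one-dimensional case, the Itoh algorithm mentioned above amounts to wrapping each successive phase difference into a 2π-wide interval and integrating the wrapped differences. A minimal Python sketch:

```python
import math


def itoh_unwrap(phases):
    """One-dimensional Itoh phase unwrapping: wrap each successive phase
    difference into a 2*pi-wide interval around zero, then integrate."""
    def wrap(p):
        return (p + math.pi) % (2 * math.pi) - math.pi

    out = [phases[0]]
    for prev, cur in zip(phases, phases[1:]):
        out.append(out[-1] + wrap(cur - prev))
    return out


# A linearly increasing phase, observed modulo 2*pi:
true = [0.5 * k for k in range(10)]
wrapped = [(p + math.pi) % (2 * math.pi) - math.pi for p in true]
print(itoh_unwrap(wrapped))  # ~ [0.0, 0.5, 1.0, ...]
```

The algorithm assumes the true phase changes by less than π between adjacent samples, otherwise the wrapped differences are ambiguous.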
At step 66, a so-called reference vector is calculated. In particular, the reference vector is a predetermined profile that is stored in the processor 18 as a table against which the measured phase difference is then matched. Specifically, a predetermined phase difference is calculated using the above equations for each individual range value as a function of height. As the reference vector values change with height as well as range, the reference vector table may be thought of as a matrix of reference phase difference values corresponding to various target object heights along the rows and various target object ranges along the columns, or vice versa. The dimension of the reference vector is selected from the maximum expected target height, angular resolution, signal to noise ratio and accuracy of measurements. For a wideband signal the wavelength is not the same for each carrier frequency, so the signal phase at each carrier frequency is different (for equal range). The phase of the reflected signal is determined by the phases of all of the signal’s spectral harmonics.
The difference between the calculated phase difference from step 64 and each entry in the reference vector is calculated, and the smallest such difference is selected. The difference between the estimated phase difference and each element of the reference vector corresponding to the range ring can be calculated using the formula:
Δi = Δφ − Δφi, where Δi is the difference between the estimated phase difference Δφ and the i-th element Δφi of the reference vector, which is expected for a given range ring.
The smallest or minimum value, i.e. the closest match, is representative of the height of the target. Expressed differently, it is the minimum value of the error between the measured or calculated phase difference and the predetermined phase profile that represents height. The predetermined profile is a continuous function and so the comparison may not be exact, hence the nearest value is used.
At step 68, the target reflection point height is determined by referring back to the reference vector, with the estimated target object height being the height associated with the reference vector phase difference used to calculate the closest match.
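Steps 66 to 68 may be sketched as follows. This Python illustration makes several simplifying assumptions not stated in the text: a common transmit path so that the inter-sensor phase difference reduces to 2π(R1 − R2)/λ, a single fixed range ring, and example values for the wavelength, sensor height and sensor spacing.

```python
import math

LAM = 0.0025               # wavelength, m (~120 GHz, assumed)
H = 0.40                   # height of sensor 12a above the surface, m
D_SEP = 0.00125            # sensor spacing (~lambda/2), m


def predicted_dphi(h: float, ln: float) -> float:
    """Predicted inter-sensor phase difference for a reflection point at
    height h and horizontal distance ln (one-way path model assumed)."""
    r1 = math.hypot(H + D_SEP - h, ln)   # path to the upper sensor 12b
    r2 = math.hypot(H - h, ln)           # path to the lower sensor 12a
    return 2 * math.pi * (r1 - r2) / LAM


def estimate_height(measured_dphi: float, ln: float, heights) -> float:
    """Steps 66-68: compare the measured phase difference against a
    reference vector of predicted values for this range ring and return
    the closest-matching candidate height."""
    return min(heights, key=lambda h: abs(predicted_dphi(h, ln) - measured_dphi))


heights = [k / 100 for k in range(101)]         # candidate heights, 0..1 m
measured = predicted_dphi(0.25, 5.0)            # simulate a 0.25 m target
print(estimate_height(measured, 5.0, heights))  # 0.25
```

In a real system the reference table would be precomputed for every range ring and, for a wideband signal, for every spectral harmonic, as described above.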
Many modifications may be made to the above examples without departing from the scope of the present invention as defined in the accompanying claims.

Claims (24)

1. A system for use in a vehicle for estimating a vertical height of a target object in the vicinity of the vehicle, the system comprising:
a receiver configured to receive range data from two or more on-board vehicle sensors, each sensor having a non-zero displacement in a vertical direction from each of the other sensors, and the range data for each sensor being indicative of the distance between a target reflection point on the target object and said sensor; and a processor configured to compare the range data from each of the sensors and to calculate a vertical distance between the target reflection point on the target object and a surface on which the target object is located based on the comparison of the range data so as to estimate the vertical height of the target object, wherein the processor is configured to compare the range data by calculating two or more different two-dimensional geometric shapes based on the range data.
2. A system according to Claim 1, wherein:
the processor comprises an electronic processor having an electrical input for receiving the range data from the two or more vehicle sensors; and an electronic memory device electrically coupled to the electronic processor and having instructions stored therein, the processor being configured to access the memory device and execute the instructions stored therein such that it is operable to compare the range data from each of the sensors and to calculate a vertical distance between the target reflection point on the target object and a surface on which the target object is located based on the comparison of the range data so as to estimate the vertical height of the target object.
3. A system according to Claim 1 or Claim 2, wherein the processor is configured to compare the range data by calculating an intersection point between the calculated geometric shapes, the intersection point coinciding with the target reflection point on the target object.
4. A system according to any preceding claim, the two or more calculated geometric shapes being one or more of a circle, an ellipse, and a hyperbola.
5. A system according to any preceding claim, wherein at least one of the calculated geometric shapes is a circle centred at the location of one of the sensors with a radius equal to the distance between said sensor location and the target reflection point on the target object according to the received range data.
6. A system according to any preceding claim, wherein at least one of the calculated geometric shapes is an ellipse in which the locations of two of the two or more sensors correspond to foci of the ellipse.
7. A system according to any preceding claim, wherein at least one of the calculated geometric shapes is a hyperbola in which the locations of two of the two or more sensors correspond to foci of the hyperbola.
8. A system according to any preceding claim, wherein the receiver is configured to receive range data from at least three on-board vehicle sensors.
9. A system according to Claim 8, wherein the processor is configured to calculate an ellipse in which the locations of two of the three or more sensors correspond to foci of the ellipse and to calculate a circle in which another one of the three or more sensors correspond to the centre of the circle.
10. A system according to Claim 9, wherein the sensor whose location corresponds to the centre of the circle is located between the two sensors whose locations correspond to the foci of the ellipse.
11. A system according to Claim 8, wherein the processor is configured to calculate a hyperbola in which the locations of two of the three or more sensors correspond to foci of the hyperbola and to calculate a circle in which another one of the three or more sensors correspond to the centre of the circle.
12. A system according to Claim 11, wherein the sensor whose location corresponds to the centre of the circle is located between the two sensors whose locations correspond to the foci of the hyperbola.
13. A system according to Claim 8, wherein the processor is configured to calculate first and second hyperbolae in which the locations of two of the three or more sensors correspond to foci of the first and second hyperbolae, and wherein at least one of the foci of the first hyperbola is different from at least one of the foci of the second hyperbola.
14. A system according to any previous claim, wherein the processor is configured to calculate a measured phase difference between measured phases associated with the range data received at two of the sensors.
15. A system according to any previous claim, wherein the processor is configured to determine the target reflection point on the target object based on the received range data.
16. A system according to any previous claim, wherein at least one of the sensors is at least one of a radar sensor, a sonar sensor, a LIDAR sensor and an ultrasonic sensor.
17. A system according to any previous claim, wherein at least one of the sensors is a radar sensor arranged to transmit signals having a frequency of at least 120 GHz.
18. A system according to any previous claim, wherein the distance between each pair of the sensors is proportional to the wavelength of the waves transmitted by the sensors.
19. A system according to Claim 18, wherein the distance between one or more pairs of the sensors is substantially equal to half of the wavelength of the transmitted waves.
20. A system according to any previous claim, the received range data comprising data relating to reflection points other than the target reflection point, and the processor being configured to select the target reflection point from the other reflection points.
21. A system according to Claim 20, wherein selection of the target reflection point is based at least partially on the angular position of each reflection point relative to one of the sensors.
22. A method for use in a vehicle for estimating the height of a target object in the vicinity of the vehicle, the method comprising:
receiving range data from two or more on-board vehicle sensors, each sensor having a non-zero displacement in a vertical direction from each of the other sensors, and the range data for each sensor being indicative of the distance between a target reflection point on the target object and said sensor;
comparing the range data from each of the sensors; and calculating a vertical distance between the target reflection point on the target object and a surface on which the target object is located based on the comparison of the range data so as to estimate the vertical height of the target object.
23. A vehicle comprising a system according to any of Claims 1 to 21.
24. A vehicle according to Claim 23, comprising two or more on-board vehicle sensors, each sensor having a non-zero displacement in a vertical direction from each of the other sensors, wherein the sensors are for transmitting a signal to the target object in the vicinity of the vehicle and for receiving a reflected signal of the transmitted signal from the target object.
GB1807667.9A 2017-05-18 2018-05-11 A system for use in a vehicle Withdrawn GB2564232A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB1707975.7A GB201707975D0 (en) 2017-05-18 2017-05-18 A system for use in a vehicle

Publications (2)

Publication Number Publication Date
GB201807667D0 GB201807667D0 (en) 2018-06-27
GB2564232A true GB2564232A (en) 2019-01-09

Family

ID=59220544

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB1707975.7A Ceased GB201707975D0 (en) 2017-05-18 2017-05-18 A system for use in a vehicle
GB1807667.9A Withdrawn GB2564232A (en) 2017-05-18 2018-05-11 A system for use in a vehicle

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB1707975.7A Ceased GB201707975D0 (en) 2017-05-18 2017-05-18 A system for use in a vehicle

Country Status (2)

Country Link
DE (1) DE102018207362A1 (en)
GB (2) GB201707975D0 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019214365A1 (en) * 2019-09-20 2021-03-25 Robert Bosch Gmbh Height measurement using ultrasonic sensors
WO2025153865A1 (en) * 2024-01-17 2025-07-24 Robert Bosch GmbH Processing apparatus and processing method for acquiring a position of a mobile terminal relative to a vehicle

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1278076A2 (en) * 2001-07-13 2003-01-22 Valeo Schalter und Sensoren GmbH Distance measuring system
US20050110620A1 (en) * 2003-11-17 2005-05-26 Denso Corporation Driving assisting apparatus for vehicles
JP2010096650A (en) * 2008-10-17 2010-04-30 The Nippon Signal Co., Ltd. Radar system
US20100220550A1 (en) * 2009-02-27 2010-09-02 Nippon Soken, Inc. Obstacle detection apparatus and method for detecting obstacle
WO2012140769A1 (en) * 2011-04-14 2012-10-18 Toyota Motor Corporation Object detection device and method for vehicle
GB2512440A (en) * 2013-01-18 2014-10-01 Bosch Gmbh Robert Driver assistance system

Also Published As

Publication number Publication date
GB201807667D0 (en) 2018-06-27
GB201707975D0 (en) 2017-07-05
DE102018207362A1 (en) 2018-11-22

Similar Documents

Publication Publication Date Title
EP3663790B1 (en) Method and apparatus for processing radar data
US11828839B2 (en) Method and apparatus for operating radar
CN109490874B (en) Method for determining the suitability of a radar target as a location landmark
US10222463B2 (en) Systems and methods for 4-dimensional radar tracking
Qian et al. 3D point cloud generation with millimeter-wave radar
US12326500B2 (en) Method and device to process radar signal
US11340342B2 (en) Automotive radar using 3D printed luneburg lens
JP7051882B2 (en) A method for classifying objects using multipolar radar data, and a device suitable for the method.
US20160154099A1 (en) Object detection apparatus and road mirror
EP3460515B1 (en) Mapping for autonomous robotic devices
US10191148B2 (en) Radar system for vehicle and method for measuring azimuth therein
JP2019518946A (en) Radar sensor device for a motor vehicle, driver assistance system, motor vehicle and method for sensing an object
Shishanov et al. Height-finding for automotive THz radars
US11435442B2 (en) Method for capturing a surrounding region of a motor vehicle with object classification, control device, driver assistance system and motor vehicle
Hu et al. Automotive squint-forward-looking SAR: High resolution and early warning
US9285468B2 (en) Extended angular resolution in sensor arrays using secondary echoes
CN105103004A (en) Apparatus and method for determining the elevation angle in a radar system
US12078714B2 (en) Angular resolution refinement in a vehicle radar for object identification
GB2564232A (en) A system for use in a vehicle
Andres et al. 3D detection of automobile scattering centers using UWB radar sensors at 24/77 GHz
Phippen et al. Trilateration of targets using a 300GHz radar system
US12366650B2 (en) Method and system for high-integrity vehicle localization
US12259466B2 (en) System for extraction of a region of interest (ROI) from a composite synthetic aperture radar (SAR) system phase history
Brisken et al. Elevation estimation with a high resolution automotive radar
Phippen et al. 3D images of elevated automotive radar targets at 300GHz

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)