WO2020053165A1 - Method for determining at least one position parameter of an object in an environment of a vehicle, computer program product, driver assistance system and vehicle - Google Patents
Method for determining at least one position parameter of an object in an environment of a vehicle, computer program product, driver assistance system and vehicle
- Publication number
- WO2020053165A1 PCT/EP2019/074026 EP2019074026W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor data
- sensor
- vehicle
- position parameter
- fused
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/865—Combination of radar systems with lidar systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/06—Systems determining position data of a target
- G01S13/42—Simultaneous measurement of distance and other co-ordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/862—Combination of radar systems with sonar systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/66—Tracking systems using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/87—Combinations of systems using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
Definitions
- This object association is performed on the basis of synchronized sensor data 15, 16, 17, meaning that the sensor data 15, 16, 17 captured by each sensor 5, 6, 7 at slightly different times are assigned to common time steps.
- Each of the sensors 5, 6, 7 is associated with a corresponding error model. Based on this error model, for each measurement point measured by the respective sensor 5, 6, 7 a corresponding error range can be defined in step S2, as illustrated in the middle of Fig. 2. For each sensor 5, 6, 7 a representative measurement point P1, P2, P3 and the corresponding error range 10, 11, 12 is illustrated. Based on these respective measurement points P1, P2, P3 a fusion point 13 is calculated, especially considering thereby the respective error models, represented here by the respective error ranges 10, 11, 12. This fusion point 13 is finally associated with a corresponding error range 14, which can be calculated based on the respective error ranges 10, 11, 12 of the respective measurement points P1, P2, P3. As illustrated in step S2 of Fig. 2, the resulting error range 14 of the fusion point 13 is much smaller than the respective error ranges 10, 11, 12 of the respective measurement points P1, P2, P3. This means that, at least in most situations, the accuracy and reliability of the detection based on the fused sensor data 15, 16, 17 can be improved by this sensor data fusion. Such data fusion as described with regard to step S2 can be performed with all available sensor data 15, 16, 17 relating to the same object 3 and assigned to the same time step, which leads to a fused object 18 as output of step S3.
- Fig. 3 shows a schematic illustration of the process of correcting a position parameter, which is determined based on the fused object 18, wherein in this example the position parameter constitutes the determined distance between the vehicle 1 and the object 3.
- Fig. 3 on the left-hand side illustrates the distance d, which has been determined on the basis of the fused object 18.
- This distance d is now corrected to a distance d' in dependency of the raw sensor data 15 provided by the LIDAR sensors of the driver assistance system 2.
- These sensor data 15 are illustrated here in form of single measurement points, which relate to the boundary 19 (compare Fig. 1) of the vehicle 3 facing the vehicle 1. Because it can be assumed that these measurement points of the sensor data 15 provided by the LIDAR sensors 5 are highly accurate, a realigning step can be performed in step S5. Thereby, the fused object 18 is shifted in its position such that the boundary 20 of the fused object 18 matches the measured position of the object boundary 19, e.g. the fused object 18 can be shifted as a whole, resulting in a positionally corrected fused object 18' with a positionally corrected distance d' to the vehicle 1.
- All or at least some other object parameters, which have been determined based on the fusion result, namely the fused object 18, can remain unchanged, like the spatial extent of the fused object 18, its size, its kind, or other optionally determined parameters. Also the determined velocity or its classification as static or dynamic can remain unchanged; however, it is very advantageous to additionally correct the determined velocity, which is described in the following with regard to Fig. 4.
- Fig. 4 shows a schematic illustration of the correction process for correcting the velocity of the object 3 determined based on the fused object 18 according to an embodiment of the invention.
- The fused object 18 is associated with a velocity v, which is different from 0 and which also has been determined based on the fused sensor data 15, 16, 17.
- In step S6 the raw data 16 of the radar sensor 6, especially the raw sensor data 16 around the fused object 18, are isolated, and based on these sensor data 16 the velocity of the object 3 is determined, which can be done in a very accurate way.
- The velocity v of the fused object 18 is corrected to a velocity v' of 0 in step S7, as illustrated on the right-hand side in Fig. 4.
- The resolution of the radar data is extremely reliable and therefore allows static raw data to be distinguished from dynamic raw data.
- By isolating the raw data around the fused object 18 it can be determined very accurately whether or not the object 3 is moving, and if not, the absolute speed of that object 3 can be set to 0 and the classification state of the object 3 can be set to standing or stopped or static; a sketch of this isolation step is given in the code example after this list.
- This velocity correction can be performed alternatively or additionally to the distance correction as described with regard to Fig. 3.
- the velocity correction can be performed prior to the distance correction, temporally after the distance correction or simultaneously with the distance correction.
- a kind of hybrid architecture can be provided using fused sensor data on the one hand and raw data of at least one single sensor on the other hand, which is further illustrated in Fig. 5.
- Fig. 5 shows another schematic illustration of the determining of at least one position parameter of the object 3 on the basis of the fused sensor data 15, 16, 17 and the correction of the result in dependency of the raw sensor data 15, 16, 17, according to an embodiment of the invention.
- the respective sensors 5, 6, 7, which are of a different sensor type, provide corresponding sensor data 15, 16, 17.
- These sensor data 15, 16, 17 can be provided in form of corresponding object lists, like the detection results 3a, 3b, 3c as illustrated in Fig. 2, or such object lists can be created on the basis of the respective sensor data 15, 16, 17.
- These raw sensor data 15, 16, 17 can also be represented in form of an occupancy gridmap 21.
- Such a static occupancy gridmap 21 is particularly suitable for representing the freespace information provided by the camera 7 in form of the respective camera sensor data 17, the raw data 15 of the one or more LIDAR sensors 5, as well as the raw data 16 of the radar sensor 6 or radar sensors 6. This occupancy gridmap 21 can now be used to correct the result of the object fusion: a distance correction can be performed using the raw data 15 provided by the one or more LIDAR sensors 5, whereby the distance d as determined from the fused object 18 can be corrected to the corrected distance d' as final result.
- the velocity v as a result of the object fusion can be corrected on the basis of the radar data 16 to the corrected velocity v' as the final result.
- a corrected fused object 18' can be provided comprising respective corrected object parameters, especially the corrected position parameters d' and v'.
- The invention or its embodiments advantageously allow the inherent problems related to object fusion to be resolved by using the raw data from LIDAR and/or radar as input, on the basis of which the fusion result can be corrected.
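As referenced above, the following minimal sketch illustrates isolating the radar raw data around a fused object (step S6) and deriving a corrected velocity from them (step S7); the bounding-box gating, the margin and the data layout are illustrative assumptions, not the patent's prescribed implementation.

```python
import numpy as np

def isolate_radar_around_object(radar_detections, obj_center, obj_half_size, margin=0.5):
    """Return only the radar raw detections that fall inside the fused object's
    bounding box (plus a margin) in the vehicle coordinate system.

    radar_detections: array of shape (N, 3) with columns (x, y, radial_speed).
    """
    lower = obj_center - obj_half_size - margin
    upper = obj_center + obj_half_size + margin
    inside = np.all((radar_detections[:, :2] >= lower) &
                    (radar_detections[:, :2] <= upper), axis=1)
    return radar_detections[inside]

def radar_velocity_of_object(isolated, zero_threshold=0.2):
    """Steps S6/S7: if the isolated radar returns indicate no motion, report 0."""
    if isolated.size == 0:
        return None                      # no radar evidence, keep the fused value
    speed = float(np.median(np.abs(isolated[:, 2])))
    return 0.0 if speed < zero_threshold else speed

# Usage: three radar returns on the fused object, one clutter return elsewhere.
radar = np.array([[12.4, 0.1, 0.05], [12.6, -0.3, 0.02], [12.5, 0.4, 0.08],
                  [30.0, -3.5, 5.40]])
iso = isolate_radar_around_object(radar, np.array([12.5, 0.0]), np.array([2.3, 1.0]))
print(radar_velocity_of_object(iso))     # 0.0 -> fused velocity v is corrected to v'
```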
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
- Radar Systems Or Details Thereof (AREA)
Abstract
The invention relates to a method for determining a position parameter (d, d', v, v') of an object (3) in an environment (4) of a vehicle (1) on the basis of fused sensor data (18), wherein first and second sensor data (15, 16) relating to the object (3) are captured by means of at least one first and second sensor (5, 6) of the vehicle (1), respectively, the first and second sensor data (15, 16) are fused and in dependency of the fused first and second sensor data (15, 16) the position parameter (d, v) of the object (3) is determined. Further, the position parameter (d, v) is corrected in dependency of only the first sensor data (15, 16) and the result is provided as corrected position parameter (d', v').
Description
Method for determining at least one position parameter of an object in an environment of a vehicle, computer program product, driver assistance system and vehicle
The invention relates to a method for determining at least one position parameter of an object in an environment of a vehicle on the basis of fused sensor data, wherein first sensor data relating to the object are captured by means of at least one first sensor of the vehicle and second sensor data relating to the object are captured by means of at least one second sensor of the vehicle. Moreover, the captured first and second sensor data are fused and in dependency of the fused first and second sensor data the at least one first position parameter of the object is determined. The invention also relates to a corresponding computer program product, a driver assistance system and a vehicle.
Many driver assistance systems known from the prior art are based on detecting objects in the environment of the vehicle, especially static objects as well as dynamic objects, for example other road users, pedestrians, guardrails, and so on. Moreover, for detecting objects in the environment of a vehicle a variety of sensors can be used, like cameras, radars, and so on. In this context US 2017/0285161 A1 describes an object detection system comprising a radar sensor and a camera, wherein based on the range and the direction of the radar detection a range-map for the image captured by the camera is determined, and based on the range-map a detection zone in the image is defined and only the detection zone of the image is processed to determine an identity of the object. Alternatively the speed of the object in the image can be determined from range rate information provided by the radar sensor.
Generally, by combining information captured by different sensors, especially sensors of different types, a more comprehensive and accurate description of the environment can be provided. Thereby sensor data can not only complement each other, but sensor data can also be used to provide an object detection with higher accuracy and reliability.
More advanced driver assistance systems, such as adaptive cruise control, and especially fully automated vehicles require a high level of equipment in terms of sensors and electronic computation units, and several sensor technologies have to be used simultaneously to create a consistent representation of the environment.
Especially, object fusion using several sensors has to be performed. For that purpose, data fusion algorithms can be used, which can be divided into three main phases, namely data alignment (also called synchronization), association and fusion of the objects, as well as tracking. A method for tracking objects by using association between previously and newly detected objects, though not based on fused sensor data, is described in US 8,799,201 B2.
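To make the first phase more tangible, the following minimal sketch aligns asynchronous per-sensor measurements to a common time step using a simple constant-velocity prediction; the data layout and the prediction model are illustrative assumptions, not the method prescribed by the patent.

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    timestamp: float   # time of capture in seconds
    position: float    # longitudinal position of the detected object [m]
    velocity: float    # estimated longitudinal velocity [m/s]

def align_to_common_time(measurements, t_common):
    """Predict asynchronous sensor measurements to one common time step."""
    aligned = []
    for m in measurements:
        dt = t_common - m.timestamp
        # Constant-velocity prediction bridges the small capture-time offsets.
        aligned.append(Measurement(t_common, m.position + m.velocity * dt, m.velocity))
    return aligned

# Camera, radar and LIDAR captured the same object a few milliseconds apart.
raw = [Measurement(0.000, 12.90, -1.0),
       Measurement(0.012, 12.48, -1.1),
       Measurement(0.020, 12.42, -1.0)]
print([round(m.position, 2) for m in align_to_common_time(raw, 0.020)])
```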
However, there are also inherent problems related to such data fusion processes: First of all, when objects captured by different sensors are associated together, they are fused using a certain error model, which is associated with the sensors. The error model associated with each sensor provides good performance most of the time, but in some special situations the model associated with the sensors turns out to be wrong, and this has a negative impact on the fusion result. For example, measurement errors of the camera in case of a slope of the road can be larger than specified by the associated error model. Further, when tracking objects, filters like a Kalman filter are used, and such filtering usually induces a delay. So when those objects are processed, the delays may accumulate, which might lead to large errors as well. Furthermore, when an object is tracked, also a speed associated with this object is estimated. If the object has moved between the previous time step and the current time step, the speed is estimated to be non-zero. However, this can be an issue, because even when objects are static, they are usually subject to some small fluctuations in the measured distance. Thus the estimated object speed can be wrong and static vehicles are erroneously considered to have a relative speed, which then will generate some trajectories.
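The last problem can be illustrated with a small numerical sketch (the values are hypothetical): a parked vehicle whose fused distance fluctuates by a few centimetres between time steps is assigned a clearly non-zero relative speed by a naive difference-quotient estimate.

```python
# Hypothetical illustration: a parked vehicle whose fused distance
# fluctuates by a few centimetres between time steps.
dt = 0.05            # update interval in seconds (20 Hz, assumed)
d_prev = 12.43       # fused distance at previous time step in metres
d_curr = 12.38       # fused distance at current time step in metres

# A naive tracker estimates the relative speed from the difference quotient.
v_est = (d_curr - d_prev) / dt
print(f"estimated relative speed: {v_est:.2f} m/s")   # -1.00 m/s, i.e. about 3.6 km/h

# Although the object is static, a 5 cm fluctuation at 20 Hz already looks
# like a 1 m/s approach, so the object would wrongly be treated as dynamic.
```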
Therefore, it is an object of the present invention to provide a method, a computer program product, a driver assistance system and a vehicle, which allow for determining at least one position parameter of an object in an environment of the vehicle on the basis of fused sensor data in a more reliable way.
This object is solved by a method, a computer program product, a driver assistance system and a vehicle with the features according to the respective independent claims. Advantageous embodiments of the invention are presented in the dependent claims, the description and the drawings.
In the context of the invention a method is provided for determining at least one first position parameter of an object in an environment of a vehicle on the basis of fused sensor data, wherein first sensor data relating to the object are captured by means of at least one first sensor of the vehicle and second sensor data relating to the object are captured by means of at least one second sensor of the vehicle. Moreover, the captured
first and second sensor data are fused and in dependency of the fused first and second sensor data the at least one first position parameter of the object is determined.
Moreover, the at least one first position parameter is corrected in dependency of only the first sensor data and the result is provided as corrected at least one first position parameter.
The invention is based on the idea that every kind of sensor has its strengths and weaknesses with regard to determining certain properties or certain parameters of objects in an environment of the vehicle. For example, by means of a LIDAR sensor distances to objects can be measured in a highly accurate way, whereas velocities of objects can be measured very accurately for example by a radar sensor, and other object parameters can again be determined most reliably on the basis of fused sensor data. Thus, for example a certain position parameter provided by the sensor data fusion can be corrected on the basis of those sensor data captured by a certain sensor or certain sensor type, which is capable of capturing this certain position parameter in a much more accurate and reliable way than the other sensors or even than the sensor data fusion. Therefore, especially the strengths of a certain sensor with regard to the measurement of a certain position parameter can be used specifically to correct the result of the sensor data fusion for this specific position parameter. This allows for combining the advantages provided by the sensor data fusion, namely providing more comprehensive information about the environment and providing e.g. some other object parameters in a very accurate and reliable way, with specific strengths of at least one sensor type with regard to accuracy and precision of measuring at least one specific position parameter. Thereby, position parameters of objects in the environment of a vehicle can be detected in a much more accurate and reliable way.
Generally, a position parameter is a parameter which depends on the position of the object in the environment of the vehicle. This position parameter can be the position of the object itself, but also a velocity of the object or an acceleration or any further temporal derivative of the position of the object. Furthermore, besides the at least one first position parameter of the object, much more information about the object and properties of the object can be provided in dependency of the fused first and second sensor data, for example the spatial extent of the object, its direction of movement, its orientation, and also a classification of such an object can be performed based on the fused first and second sensor data, for example a classification of the kind of object. Furthermore, the correction does not have to be performed for each determined parameter of the object. Especially, only those position parameters, for which highly accurate sensor data can be provided by at least one of the sensors of the vehicle, can be corrected. Moreover, the first sensor and the second sensor of the vehicle preferably are configured as sensors of a different kind, especially using different measuring principles. Thereby, the environment can be described on the basis of the captured sensor data in the most comprehensive and accurate way. Besides the at least one first sensor, which preferably is of a first sensor type, and the at least one second sensor, which preferably is of a second sensor type different from the first sensor type, the vehicle can also comprise further sensors of further sensor types, the sensor data of which are fused, especially for a corresponding time step, and on the basis of which the at least one first position parameter is determined. Generally, for providing the first and second and optional further sensor data at least two of the following sensor types are used: a laser scanner, especially a LIDAR sensor, a radar sensor, a camera, an ultrasonic sensor. Also several sensors of the same type can be used. The at least one first sensor preferably is a LIDAR and/or a radar, which has especially great advantages, as described later on.
According to an advantageous embodiment of the invention the first sensor has a higher accuracy with respect to determining the at least one first position parameter than the at least one second sensor. Thus, by correcting the at least one first position parameter in dependency of only the first sensor data the result can be improved the most, and advantageously the corrected at least one first position parameter matches the real position parameter more accurately than the uncorrected first position parameter.
Moreover, each of the at least one first and at least one second sensor is associated with a measurement accuracy, especially in form of an error model, wherein the at least one first position parameter, which is determined in dependency of the fused first and second sensor data, is determined in dependency of each measurement accuracy. Each measurement accuracy in form of said error model can be provided by the manufacturer of the respective sensor and therefore be predefined. Such an error model may define an error range or also several error ranges depending on the direction. For example, for each direction in space a corresponding error range can be defined for each sensor. Also, an error range can be defined for only two directions in space, as in most cases it is sufficient to know the position or other properties of objects in the horizontal plane, namely perpendicular to the vehicle's vertical axis.
When calculating the fused object based on the fused sensor data, like determining the fused position of the object, this fused position can be determined such that the fused position coincides with the respective positions provided by each sensor within the respective error ranges. However, e.g. in case one of the error models fails, as described in the beginning, this might not always be possible. Generally, the fused position, or any other fused position parameter, can also be calculated as some kind of weighted average based on the sensor data provided by each sensor, wherein the weights are determined in dependency of the respective error models, such that those sensor data with a higher accuracy are weighted more strongly. Thereby, very good and accurate results can be achieved, at least in most cases, on the basis of such data fusion.
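As an illustration of such an error-model-weighted fusion, the following sketch combines per-sensor position measurements by inverse-variance weighting; the function and variable names as well as the Gaussian error model are illustrative assumptions rather than the patent's prescribed implementation.

```python
import numpy as np

def fuse_positions(measurements):
    """Fuse 2D position measurements by inverse-variance weighting.

    measurements: list of (position, sigma) tuples, where position is a
    2D point in the vehicle coordinate system and sigma holds the per-axis
    standard deviations of the sensor's error model.
    """
    positions = np.array([m[0] for m in measurements], dtype=float)
    variances = np.array([np.square(m[1]) for m in measurements], dtype=float)

    weights = 1.0 / variances               # more accurate sensors weigh more
    fused = (weights * positions).sum(axis=0) / weights.sum(axis=0)

    # The fused error range shrinks compared to each single sensor.
    fused_variance = 1.0 / weights.sum(axis=0)
    return fused, np.sqrt(fused_variance)

# Example: camera (coarse), radar (medium), LIDAR (fine) detections P1, P2, P3.
camera = (np.array([12.9, 0.4]), np.array([1.20, 0.60]))
radar  = (np.array([12.5, 0.1]), np.array([0.50, 0.80]))
lidar  = (np.array([12.4, 0.2]), np.array([0.10, 0.10]))

fused_point, fused_sigma = fuse_positions([camera, radar, lidar])
print(fused_point, fused_sigma)   # fused point lies closest to the LIDAR measurement
```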
According to a very advantageous embodiment of the invention the at least one first position parameter constitutes a distance between the vehicle and the object. As described in the beginning, in some special situations the error model associated with a specific sensor might fail, which especially might result in a very inaccurate determination of the distance of the object with respect to the vehicle based on the sensor data fusion. Also, cumulated delays, which are due to the usage of filters, result in a higher inaccuracy with respect to the determination of the distance between the object and the vehicle. In other words, most problems associated with the data fusion impact the accuracy of the determined distance between the object and the vehicle. So, according to this advantageous embodiment of the invention, these problems can be overcome by correcting the distance calculated in dependency of the fused first and second sensor data.
For this purpose it is very advantageous when the first sensor comprises a laser scanner, especially a LIDAR sensor. By means of a LIDAR sensor the distance information associated with objects in the environment of the vehicle can be provided in a very accurate way. Further, this accuracy is hardly influenced by any environmental condition, e.g. when the road has a certain slope. Thus, advantageously, the distance to the object, which is calculated based on the fused sensor data, can finally be corrected on the basis of the raw data provided by the LIDAR sensor, or by the LIDAR sensors in case the vehicle comprises several LIDAR sensors which provide sensor data related to the same object.
According to another advantageous embodiment of the invention the first position of the object, including a first position of a boundary of the object facing the vehicle, is determined relative to the vehicle in dependency of the fused first and second sensor data, wherein the first sensor data are provided in form of measurement points defining a measured second position of the object boundary relative to the vehicle, wherein the determined first position of the object is corrected by shifting the determined first position of the object such that the first position of the boundary matches the measured second position of the object boundary. This way, the above-named distance correction can be performed. Thus, by using the raw data from the LIDAR sensors, the measurement points provided by the LIDAR sensor, which define the position of the object boundary facing the vehicle, especially facing the LIDAR sensor, allow a realigning step to be performed, which is especially advantageous when the distance error model associated with a vehicle sensor fails. This also allows the delay generated by the tracking to be compensated. Therefore, even very difficult situations, which lead to a higher inaccuracy of the determination of object parameters, especially position parameters, based on the fused sensor data, can now be handled in a very accurate way.
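A minimal sketch of such a realigning step, assuming the fused object is represented by an axis-aligned box in the vehicle coordinate system and the LIDAR raw data by measurement points on the object boundary facing the vehicle (all names and the box representation are illustrative assumptions):

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class FusedObject:
    x_front: float      # longitudinal position of the boundary facing the ego vehicle [m]
    length: float       # longitudinal extent of the object [m]
    velocity: float     # estimated velocity [m/s], left untouched here

def realign_to_lidar(fused: FusedObject, lidar_points: np.ndarray) -> FusedObject:
    """Shift the fused object so that its near boundary matches the LIDAR points.

    lidar_points: array of shape (N, 2) with (x, y) measurement points that
    belong to the object boundary facing the ego vehicle.
    """
    # Measured position of the near boundary: closest LIDAR returns in x.
    measured_front = float(np.min(lidar_points[:, 0]))

    # Shift the whole object; extent, velocity and classification stay unchanged.
    shift = measured_front - fused.x_front
    return FusedObject(x_front=fused.x_front + shift,
                       length=fused.length,
                       velocity=fused.velocity)

# Usage: fused distance 12.9 m, LIDAR sees the rear of the vehicle ahead at ~12.4 m.
points = np.array([[12.41, -0.8], [12.39, 0.0], [12.43, 0.7]])
corrected = realign_to_lidar(FusedObject(12.9, 4.5, 0.0), points)
print(corrected.x_front)   # ~12.39, the corrected distance d'
```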
For this realignment, the first sensor data can be provided in form of an occupancy gridmap representation. The occupancy gridmap is a very suitable representation especially for sensor data provided by LIDAR sensors. Also other sensor data can advantageously be represented by such an occupancy gridmap, like those provided by a camera, a radar and/or an ultrasonic sensor. Each cell of such a gridmap is associated with a certain position region of the environment of the vehicle, and moreover each cell can be associated with at least one state, which is assigned to the respective cell based on the captured sensor data. Such states can be for example occupied or not occupied, meaning that at least part of an object is present within the position region of the environment which is associated with a cell classified as occupied, and no object is present in the position region of the environment associated with a cell classified as not occupied. Another possible state assigned to each cell could also be static, dynamic or unknown. When an object has been detected at a certain position and the object has been classified to be static or dynamic, or cannot be classified with respect to its dynamic properties, then also the corresponding associated cells can be classified correspondingly as static, dynamic or unknown. Also velocity information can be assigned to each cell depending on the determined velocities of objects detected at position regions associated with the respective cells of the occupancy gridmap. Therefore, a variety of advantageous information about the environment can easily be represented by means of an occupancy gridmap.
Moreover, to perform the above described distance correction it would be sufficient to provide an occupancy gridmap representation of the LIDAR sensor data, which provide position information about objects in the environment. Accordingly, gridmap states like occupied and not occupied in sufficiently high spatial resolution would be sufficient. However, it is especially advantageous to additionally provide velocity information of the object by means of a radar sensor, which is explained in the following, and therefore to include this additional velocity information in the gridmap representation, especially in form of the above described additional states static, dynamic and unknown.
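A minimal sketch of such an occupancy gridmap with occupancy, dynamic-state and velocity information per cell (cell size, thresholds and the API are illustrative assumptions):

```python
from enum import Enum

class CellState(Enum):
    FREE = 0
    OCCUPIED = 1

class DynState(Enum):
    UNKNOWN = 0
    STATIC = 1
    DYNAMIC = 2

class OccupancyGridMap:
    """Grid map in the vehicle frame; each cell covers cell_size x cell_size metres."""

    def __init__(self, width_m=100.0, height_m=100.0, cell_size=0.2):
        self.cell_size = cell_size
        self.cols = int(width_m / cell_size)
        self.rows = int(height_m / cell_size)
        # Per-cell occupancy, dynamic state and velocity estimate.
        self.occupancy = [[CellState.FREE] * self.cols for _ in range(self.rows)]
        self.dyn_state = [[DynState.UNKNOWN] * self.cols for _ in range(self.rows)]
        self.velocity = [[0.0] * self.cols for _ in range(self.rows)]

    def _index(self, x, y):
        # Map a point (x, y) in the vehicle frame to a cell index (row, col).
        return (int(y / self.cell_size) + self.rows // 2,
                int(x / self.cell_size) + self.cols // 2)

    def insert_lidar_point(self, x, y):
        r, c = self._index(x, y)
        self.occupancy[r][c] = CellState.OCCUPIED

    def insert_radar_detection(self, x, y, radial_speed, speed_threshold=0.3):
        r, c = self._index(x, y)
        self.occupancy[r][c] = CellState.OCCUPIED
        self.velocity[r][c] = radial_speed
        self.dyn_state[r][c] = (DynState.STATIC if abs(radial_speed) < speed_threshold
                                else DynState.DYNAMIC)
```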
Therefore, according to another advantageous embodiment of the invention, the at least one first position parameter is a velocity of the object. As also explained in the beginning, even though objects are static, due to small fluctuations in the determined distances based on the fused sensor data those objects can erroneously be classified as being dynamic and be associated with a certain relative speed. This can now advantageously be avoided by providing a correction of the velocity of the object.
For such a velocity correction preferably the sensor data of at least one radar sensor are used, as a radar sensor can usually provide velocity information about objects with very high accuracy. Therefore, according to another advantageous embodiment of the invention, the at least one first sensor is or comprises a radar sensor. Thus, also a classification of objects with regard to their dynamic properties, namely whether they are static or dynamic, can be provided with very high reliability and accuracy.
Such a velocity correction can for example be performed as follows: In case the velocity of the object determined by the radar sensor is zero and the velocity of the object determined in dependency of the fused first and second sensor data is different from zero, then the final corrected velocity of the object is set to zero. This allows for a very accurate and reliable determination of whether an object is static or not. This advantageous embodiment is based on the idea that in case an object moves and therefore is dynamic, small fluctuations do not compromise the determination of the velocity based on the fused sensor data so much, and especially do not influence the correct classification as a dynamic object at all. However, in case an object is static, even small fluctuations may cause a wrong classification of the object as dynamic. Thus, by isolating the raw radar sensor data around the fused object it can easily be determined correctly whether or not the object is moving, and when it is determined based on the radar sensor data that the object is not moving, then the absolute speed of this object can be set to zero and its state can be set to standing or stopped, so that fluctuations in the sensor data cannot lead to a false classification of such an object anymore. In other cases, when the velocity of the object as determined by the fused sensor data or the radar sensor data is unequivocally different from zero, no correction of the velocity has to be performed, though nevertheless also in this case a velocity correction can be performed.
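A minimal sketch of this correction rule (the zero threshold and the function names are illustrative assumptions):

```python
def correct_velocity(fused_velocity: float,
                     radar_velocity: float,
                     zero_threshold: float = 0.2) -> tuple[float, str]:
    """Override the fused velocity when the radar raw data indicate a static object.

    fused_velocity: velocity estimated from the fused sensor data [m/s]
    radar_velocity: velocity estimated from the isolated radar raw data [m/s]
    zero_threshold: below this magnitude the radar measurement counts as zero
    """
    if abs(radar_velocity) < zero_threshold and abs(fused_velocity) >= zero_threshold:
        # Radar says the object does not move: force the corrected velocity to zero
        # and mark the object as standing/stopped.
        return 0.0, "static"
    # Otherwise keep the fused estimate and its dynamic classification.
    return fused_velocity, "dynamic" if abs(fused_velocity) >= zero_threshold else "static"

# Usage: fused tracking reports 0.9 m/s due to distance fluctuations,
# while the radar raw data around the object show 0.0 m/s.
print(correct_velocity(0.9, 0.0))   # (0.0, 'static')
```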
The described distance correction and the described velocity correction can also be combined. This is very advantageous, as then very accurate distance information and very accurate dynamic information about detected objects in the environment of the vehicle can be provided. Therefore, according to another advantageous embodiment of the invention, additionally to the at least one first position parameter at least one second position parameter of the object is determined in dependency of the fused first and second sensor data, and the at least one second position parameter is corrected in dependency of only the second sensor data and the result is provided as corrected at least one second position parameter. So, for example, the first position parameter can be the distance of the object to the vehicle, which advantageously can be corrected by using the raw LIDAR sensor data, and the second position parameter can be the velocity of the detected object, which advantageously can be corrected by using the raw radar sensor data, especially as already described above. Thus, the overall detection accuracy can be enhanced.
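Putting the two corrections together, a possible post-fusion step could be sketched as follows, assuming the illustrative helpers realign_to_lidar and correct_velocity from the sketches above (and the FusedObject type) are available:

```python
def correct_fused_object(fused, lidar_points, radar_velocity):
    """Apply the LIDAR-based distance correction and the radar-based velocity
    correction to one fused object; all other fused attributes stay unchanged."""
    corrected = realign_to_lidar(fused, lidar_points)            # distance d -> d'
    v_corrected, dyn_state = correct_velocity(fused.velocity,    # velocity v -> v'
                                              radar_velocity)
    corrected.velocity = v_corrected
    return corrected, dyn_state
```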
The invention also relates to a computer program product comprising program code stored in a computer readable medium, and which when executed by a processor of an electronic control device causes the processor to perform the method according to the invention or its embodiments.
The invention also relates to a driver assistance system for determining at least one first position parameter of an object in the environment of a vehicle on the basis of fused sensor data, wherein the driver assistance system comprises at least one first sensor, which is configured to capture first sensor data relating to the object, and at least one second sensor, which is configured to capture second sensor data relating to the object. Moreover, the driver assistance system comprises a control unit, which is configured to fuse the captured first and second sensor data and to determine the at least one first position parameter of the object in dependency of the fused first and second sensor data. Moreover, the control unit is configured to correct the at least one first position parameter in dependency of only the first sensor data and to provide the result as corrected at least one first position parameter.
The invention also relates to a vehicle comprising a driver assistance system according to the invention or its embodiments.
The advantages described with regard to the method according to the invention and its embodiments correspondingly apply to the computer program product, the driver assistance system and the vehicle according to the invention. Moreover, the preferred embodiments described with regard to the method according to the invention constitute further advantageous corresponding embodiments of the computer program product, the driver assistance system and the vehicle according to the invention.
Further features of the invention are apparent from the claims, the figures and the description of figures. The features and feature combinations mentioned above in the description as well as the features and feature combinations mentioned below in the description of figures and/or shown in the figures alone are usable not only in the respectively specified combination, but also in other combinations without departing from the scope of the invention. Thus, implementations are also to be considered as encompassed and disclosed by the invention, which are not explicitly shown in the figures and explained, but arise from and can be generated by separate feature combinations from the explained implementations. Implementations and feature combinations are also to be considered as disclosed, which thus do not have all of the features of an originally formulated independent claim. Moreover, implementations and feature combinations are to be considered as disclosed, in particular by the implementations set out above, which extend beyond or deviate from the feature combinations set out in the back-references of the claims.
The figures show:
Fig. 1 a schematic illustration of a vehicle, especially in a top view, comprising a driver assistance system for determining at least one position parameter of an object in an environment of the vehicle according to an embodiment of the invention;
Fig. 2 a schematic illustration of a sensor data fusion process, which is performed by a control unit of the vehicle according to an embodiment of the invention;
Fig. 3 a schematic illustration of the process of correcting a position parameter, which is determined based on a result of the data fusion process, wherein in this example the position parameter constitutes the determined distance between the vehicle and the object, according to an embodiment of the invention;
Fig. 4 a schematic illustration of the process of correcting a position parameter, which is determined based on a result of the data fusion process, wherein in this example the position parameter constitutes the determined velocity of the object, according to an embodiment of the invention; and
Fig. 5 a schematic illustration of the determining of at least one position parameter of the object on the basis of the fused sensor data and the correction of the result in dependency of the raw sensor data, according to an embodiment of the invention.
Fig. 1 shows a schematic illustration of a vehicle 1, especially in a top view, comprising a driver assistance system 2 for determining at least one position parameter of an object 3, which in this example is configured as another vehicle 3 traveling ahead of the vehicle 1, in an environment 4 of the vehicle 1 according to an embodiment of the invention. For capturing the environment 4 including the vehicle 3 ahead, the driver assistance system 2 of the vehicle 1 comprises several sensors of different sensor types. In this example the driver assistance system 2 comprises several LIDAR sensors 5, several radar sensors 6 and several cameras 7. These sensors 5, 6, 7 can be arranged on the vehicle 1 in such a way that the environment can be captured within 360° around the vehicle 1 or only a subarea thereof, especially within a horizontal plane, which is illustrated here as the x-y-plane of the vehicle coordinate system 8. Furthermore, the driver assistance system 2 comprises a control unit 9. The sensor data captured by each of these sensors 5, 6, 7 are transmitted to the control unit 9 and are processed by this control unit 9. The sensor data captured by the LIDAR sensors 5 are denoted here by the reference signs 15, the sensor data captured by the radar sensors 6 are denoted by the reference signs 16 and the sensor data captured by the cameras 7 are denoted by the reference signs 17. The processing of the captured sensor data 15, 16, 17 performed by the control unit 9 is now explained in the following.
Fig. 2 shows a schematic illustration of a sensor data fusion process, which is performed by the control unit 9 according to an embodiment of the invention. The process starts in step S1, which illustrates the process of object association. In this step S1 the sensor data 15, 16, 17 of the different sensors 5, 6, 7 relating to one and the same object, in this case the vehicle 3 traveling ahead, are associated. In the example illustrated here, only the sensor data 15, 16, 17 of three sensors 5, 6, 7 of different types are shown.
Thereby, 3a denotes the detection result of one of the cameras 7, 3b the detection result of one of the radar sensors 6 and 3c the detection result of one of the LIDAR sensors 5, especially defined with regard to the vehicle coordinate system 8. These detection results 3a, 3b, 3c of each sensor 5, 6, 7 may include a variety of information about the detected object 3, like the spatial extent of the object 3, its position, its velocity, its size, and so on.
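Purely as an illustration of what such an object association in step S1 could look like in software, the following Python sketch assigns per-sensor detections to already known objects by nearest-neighbour gating. The function name, the data layout and the gating threshold are assumptions of this sketch and are not taken from the application.

```python
import math

def associate_detections(detections_per_sensor, tracked_objects, gate=2.0):
    """Assign each sensor detection to the closest known object (step S1 analogue).

    detections_per_sensor: dict sensor_id -> list of (x, y) positions in the
    vehicle coordinate system; tracked_objects: list of (x, y) reference
    positions, one per already known object; gate: maximum association
    distance in metres (illustrative value).
    """
    associations = {obj_idx: [] for obj_idx in range(len(tracked_objects))}
    for sensor_id, detections in detections_per_sensor.items():
        for det in detections:
            best_idx, best_dist = None, float("inf")
            for obj_idx, obj_pos in enumerate(tracked_objects):
                dist = math.hypot(det[0] - obj_pos[0], det[1] - obj_pos[1])
                if dist < best_dist:
                    best_idx, best_dist = obj_idx, dist
            if best_idx is not None and best_dist <= gate:
                associations[best_idx].append((sensor_id, det))
    return associations

# Example: camera, radar and LIDAR detections of the same preceding vehicle
dets = {"camera": [(20.3, 0.4)], "radar": [(19.8, 0.1)], "lidar": [(20.0, 0.2)]}
print(associate_detections(dets, [(20.0, 0.0)]))
```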
Further, this object association is performed on the basis of synchronized sensor data 15, 16, 17, meaning that the sensor data 15, 16, 17 captured by each sensor 5, 6, 7 at slightly different times are assigned to common time steps. Moreover, each of the sensors 5, 6, 7 is associated with a corresponding error model. Based on this error model, for each measurement point measured by each respective sensor 5, 6, 7 a corresponding error range can be defined in step S2, as illustrated in Fig. 2 in the middle. For each sensor 5, 6, 7 a representative measurement point P1, P2, P3 and the corresponding error range 10, 11, 12 is illustrated. Based on these respective measurement points P1, P2, P3 a fusion point 13 is calculated, especially considering thereby the respective error models, represented here by the respective error ranges 10, 11, 12. This fusion point 13 is also finally associated with a corresponding error range 14, which can be calculated based on the respective error ranges 10, 11, 12 of the respective measurement points P1, P2, P3. As illustrated in Fig. 2 in step S2, the resulting error range 14 of the fusion point 13 is much smaller than the respective error ranges 10, 11, 12 of the respective measurement points P1, P2, P3. This means that, at least in most situations, the accuracy and reliability of the detection based on the fused sensor data 15, 16, 17 can be improved by this sensor data fusion. Such data fusion as described with regard to step S2 can be performed with all the available sensor data 15, 16, 17 relating to the same object 3 and assigned to the same time step, which leads to a fused object 18 as output in step S3.
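One common way to compute such a fusion point and its reduced error range is inverse-variance weighting. The following minimal Python sketch uses simple isotropic variances as a stand-in for the error models and error ranges 10, 11, 12; this simplification, the function name and the numerical values are assumptions of the sketch, not statements about the concrete error models used here.

```python
def fuse_points(points, variances):
    """Inverse-variance weighted fusion of measurement points P1..Pn.

    points: list of (x, y) measurements of the same object from different
    sensors; variances: list of per-sensor position variances (a very
    simplified, isotropic stand-in for the per-sensor error models).
    Returns the fusion point and its fused variance.
    """
    weights = [1.0 / v for v in variances]
    w_sum = sum(weights)
    fx = sum(w * p[0] for w, p in zip(weights, points)) / w_sum
    fy = sum(w * p[1] for w, p in zip(weights, points)) / w_sum
    fused_variance = 1.0 / w_sum   # always smaller than each input variance
    return (fx, fy), fused_variance

# Camera is least accurate, LIDAR most accurate (illustrative variances in m^2)
fusion_point, fused_var = fuse_points(
    [(20.3, 0.4), (19.8, 0.1), (20.0, 0.2)], [1.0, 0.25, 0.05])
print(fusion_point, fused_var)
```

Because the fused variance is the reciprocal of the summed weights, it is necessarily smaller than any single input variance, which mirrors the smaller error range 14 of the fusion point 13 in Fig. 2.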
In most cases this sensor data fusion leads to very good results, meaning that the parameters of the fused object 18 coincide very well with the real parameters of the object 3. However, there are also some inherent problems of such data fusion, which may cause a higher inaccuracy with regard to the position parameters, which are determined based on the fused object 18. For example, one or more error models associated with the sensors 5, 6, 7 might fail. Especially, the error model 10 associated with a camera 7 fails in case the road comprises a slope. Moreover, the usage of some filters, like Kalman filters, for tracking induces a delay, which accumulates. These problems lead to higher inaccuracies, for example with regard to the determined distance between the object 3 and the vehicle 1. Moreover, especially when the object 3 does not move, small fluctuations in the distance determined based on the fused object 18 can lead to a wrong classification of this object 3 as moving. These negative impacts on the determination of the distance and the classification of the object 3 can advantageously be compensated as described in the following with regard to Fig. 3 and Fig. 4, which illustrate method steps S4, S5, S6, S7 performed by the control unit 9 subsequent to the method steps S1, S2, S3 as described with regard to Fig. 2.
Fig. 3 shows a schematic illustration of the process of correcting a position parameter, which is determined based on the fused object 18, wherein in this example the position parameter constitutes the determined distance between the vehicle 1 and the object 3.
Fig. 3 on the left-hand side illustrates the distance d, which has been determined on the basis of the fused object 18. This distance d is now corrected to a distance d’ in dependency of the raw sensor data 15 provided by the LIDAR sensors 5 of the driver assistance system 2. These sensor data 15 are illustrated here in form of single measurement points, which relate to the boundary 19 (compare Fig. 1) of the vehicle 3 facing the vehicle 1. Because it can be assumed that these measurement points of the sensor data 15 provided by the LIDAR sensors 5 are highly accurate, a realigning step can be performed in step S5. Thereby, the fused object 18 is shifted in its position such that the boundary 20 of the fused object 18 matches, e.g. on average, the measurement points provided by the sensor data 15 of the LIDAR sensors 5. Thereby the fused object 18 can be shifted as a whole, resulting in a positionally corrected fused object 18’ with a positionally corrected distance d’ to the vehicle 1. All or at least some other object parameters, which have been determined based on the fusion result, namely the fused object 18, can remain unchanged, like the spatial extent of the fused object 18, its size, its kind, or other optionally determined parameters. Also the determined velocity or its classification as static or dynamic can remain unchanged; however, it is very advantageous to correct the velocity as alternative or additional position parameter as well, as described now with regard to Fig. 4.
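As a rough illustration of this realigning step, the following Python sketch shifts the fused distance onto the mean of the raw LIDAR boundary points while leaving all other object parameters untouched. The dictionary layout, the function name and the simple averaging rule are assumptions of this sketch and are not taken from the application.

```python
def realign_to_lidar(fused_object, lidar_boundary_points):
    """Shift the fused object so that its vehicle-facing boundary matches the
    (assumed highly accurate) raw LIDAR points, analogous to step S5.

    fused_object: dict with at least 'distance' (longitudinal distance of the
    facing boundary to the ego vehicle) plus any other parameters, which stay
    untouched; lidar_boundary_points: list of longitudinal distances of raw
    LIDAR returns on the facing boundary of the object.
    """
    measured_distance = sum(lidar_boundary_points) / len(lidar_boundary_points)
    shift = measured_distance - fused_object["distance"]
    corrected = dict(fused_object)                 # size, extent, class etc. unchanged
    corrected["distance"] = fused_object["distance"] + shift   # d -> d'
    return corrected, shift

fused = {"distance": 19.4, "width": 1.8, "classification": "dynamic"}
print(realign_to_lidar(fused, [20.05, 19.98, 20.02]))
```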
Fig. 4 shows a schematic illustration of the correction process for correcting the velocity of the object 3 determined based on the fused object 18 according to an embodiment of the invention. As shown on the left-hand side of Fig. 4, the fused object 18 is associated with a velocity v, which is different from 0 and which has also been determined based on the fused sensor data 15, 16, 17. To perform the correction, first of all in step S6 the raw data 16 of the radar sensor 6, especially the raw sensor data 16 around the fused object 18, are isolated, and based on these sensor data 16 the velocity of the object 3 is determined, which can be done very accurately. In case it is determined based on the sensor data 16 of the radar sensor 6 that the object 3 does not move, then the velocity v of the fused object 18 is corrected to a velocity v’ of 0 in step S7, as illustrated on the right-hand side in Fig. 4. The radar data allow static raw data to be distinguished from dynamic raw data very reliably. By isolating the raw data around the fused object 18 it can be determined very accurately whether or not the object 3 is moving, and if not, the absolute speed of that object 3 can be set to 0 and the classification state of the object 3 can be set to standing or stopped or static. This velocity correction can be performed alternatively or additionally to the distance correction as described with regard to Fig. 3. Further, the velocity correction can be performed prior to the distance correction, temporally after the distance correction or simultaneously with the distance correction. As a conclusion, a kind of hybrid architecture can be provided using fused sensor data on the one hand and raw data of at least one single sensor on the other hand, which is further illustrated in Fig. 5.
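The isolation of the radar raw data around the fused object and the subsequent velocity reset could, for example, look like the following Python sketch. The region-of-interest size, the static-speed threshold and the field names are illustrative assumptions of this sketch, not values from the application.

```python
def correct_velocity(fused_object, radar_detections, roi=2.0, static_thresh=0.2):
    """Set the fused velocity v to v' = 0 when the raw radar data around the
    fused object indicate a static target (steps S6 and S7 analogue).

    radar_detections: list of dicts with 'x', 'y' (vehicle coordinates) and
    'speed' (ego-motion compensated absolute speed in m/s); roi: half-width of
    the isolation window around the fused object in metres; static_thresh:
    speed below which a raw detection counts as static.
    """
    x0, y0 = fused_object["x"], fused_object["y"]
    nearby = [d for d in radar_detections
              if abs(d["x"] - x0) <= roi and abs(d["y"] - y0) <= roi]
    corrected = dict(fused_object)
    if nearby and all(abs(d["speed"]) < static_thresh for d in nearby):
        corrected["velocity"] = 0.0             # v -> v' = 0
        corrected["classification"] = "static"  # standing / stopped object
    return corrected

fused = {"x": 20.0, "y": 0.0, "velocity": 0.6, "classification": "dynamic"}
radar = [{"x": 19.9, "y": 0.1, "speed": 0.05}, {"x": 20.2, "y": -0.2, "speed": 0.02}]
print(correct_velocity(fused, radar))
```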
Fig. 5 shows another schematic illustration of the determining of at least one position parameter of the object 3 on the basis of the fused sensor data 15, 16, 17 and the correction of the result in dependency of the raw sensor data 15, 16, 17, according to an embodiment of the invention. Here again, the respective sensors 5, 6, 7, which are of different sensor types, provide corresponding sensor data 15, 16, 17. These sensor data 15, 16, 17 can be provided in form of corresponding object lists, like the detection results 3a, 3b, 3c as illustrated in Fig. 2, or such object lists can be created on the basis of the respective sensor data 15, 16, 17. These sensor data 15, 16, 17 or the object lists are then fused in a fusion process as described before, the output of which is the fused object 18 with associated position parameters, like the distance d to the vehicle 1 and the velocity v of the object 3. These position parameters d, v can now advantageously be corrected by using the respective raw sensor data 15, 16, 17.
Those raw sensor data 15, 16, 17 can be represented in form of an occupancy gridmap 21. Such a static occupancy gridmap 21 is particularly suitable for representing the freespace information provided by the camera 7 in form of the respective camera sensor data 17, the raw data 15 of the one or more LIDAR sensors 5, as well as the raw data 16 of the radar sensor 6 or radar sensors 6. This occupancy gridmap 21 can now advantageously serve as input for the data correction process.
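A very reduced Python sketch of how raw sensor returns could be rasterised into such a static occupancy gridmap is given below. The cell size, the map extent and the binary occupancy values are assumptions of the sketch; a real gridmap would typically store occupancy probabilities derived from the respective sensor models.

```python
def build_occupancy_grid(raw_points, cell_size=0.5, extent=50.0):
    """Rasterise raw sensor returns (LIDAR, radar, camera freespace points)
    into a static occupancy grid around the ego vehicle.

    raw_points: iterable of (x, y) positions in the vehicle coordinate system;
    cell_size: edge length of a grid cell in metres; extent: half size of the
    mapped square area in metres.
    """
    n = int(2 * extent / cell_size)
    grid = [[0] * n for _ in range(n)]
    for x, y in raw_points:
        col = int((x + extent) / cell_size)
        row = int((y + extent) / cell_size)
        if 0 <= row < n and 0 <= col < n:
            grid[row][col] = 1                 # mark cell as occupied
    return grid

grid = build_occupancy_grid([(20.05, 0.2), (19.98, -0.1), (20.02, 0.0)])
print(sum(map(sum, grid)))   # number of occupied cells
```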
So, advantageously, for example a distance correction can be performed using the raw data 15 provided by the one or more LIDAR sensors 5, whereby the distance d as determined from the fused object 18 can be corrected to the corrected distance d’ as final result. Similarly, the velocity v as a result of the object fusion can be corrected on the basis of the radar data 16 to the corrected velocity v’ as the final result. Finally, a corrected fused object 18’ can be provided comprising respective corrected object parameters, especially the corrected position parameters d’ and v’. This result can now advantageously be used for performing one or more driver assistance functions, for example an adaptive cruise control function or even a fully automated driving function.
To conclude, the invention or its embodiments advantageously make it possible to resolve the inherent problems related to object fusion by using the raw data from LIDAR and/or radar as input, on the basis of which the fusion result can be corrected.
Claims
1. Method for determining at least one first position parameter (d, d’, v, v’) of an object (3) in an environment (4) of a vehicle (1) on the basis of fused sensor data (18), comprising the steps:
capturing first sensor data (15, 16) relating to the object (3) by means of at least one first sensor (5, 6) of the vehicle (1);
capturing second sensor data (16, 15) relating to the object (3) by means of at least one second sensor (6, 5) of the vehicle (1);
fusing the captured first sensor data (15, 16) and the second sensor data (16, 15); and
in dependency of the fused first sensor data (15, 16) and second sensor data (16, 15) determining the at least one first position parameter (d, v) of the object (3);
characterized by the step
correcting the at least one first position parameter (d, v) in dependency of only the first sensor data (15, 16) and providing the result as corrected at least one first position parameter (d’, v’).
2. Method according to claim 1,
characterized in that
the at least one first sensor (5, 6) has a higher accuracy with respect to the determination of the at least one first position parameter (d, d’, v, v’) than the at least one second sensor (6, 5).
3. Method according to one of the preceding claims,
characterized in that
each of the at least one first sensor and the at least one second sensor (5, 6) is associated with a measurement accuracy, especially in form of an error model (11, 12), wherein the at least one first position parameter (d, v), which is determined in dependency of the fused first and second sensor data (15, 16), is determined in dependency of each measurement accuracy.
4. Method according to one of the preceding claims,
characterized in that
the at least one first position parameter (d, d’, v, v’) constitutes a distance (d, d’) between the vehicle (1) and the object (3).
5. Method according to one of the preceding claims,
characterized in that
the at least one first sensor (5) comprises a laser scanner (5), especially a LIDAR sensor (5).
6. Method according to one of the preceding claims,
characterized in that
in dependency of the fused first and second sensor data (15, 16) a first position of the object (3) including a first position of a boundary (20) of the object (3) facing the vehicle (1) is determined relative to the vehicle (1), wherein the first sensor data (15) are provided in form of measurement points defining a measured second position of the object boundary (19) relative to the vehicle (1), wherein the determined first position of the object (3) is corrected by shifting the determined first position of the object (3) such that the first position of the boundary (20) matches the measured second position of the object boundary (19).
7. Method according to one of the preceding claims,
characterized in that
the first sensor data (15, 16) are provided in form of an occupancy gridmap representation (21).
8. Method according to one of the preceding claims,
characterized in that
the at least one first position parameter (v, v’) is a velocity (v, v’) of the object (3).
9. Method according to one of the preceding claims,
characterized in that
the at least one first sensor (6) comprises a radar sensor (6).
10. Method according to claim 9,
characterized in that
in case the velocity (v’) of the object (3) determined by the radar sensor (6) is zero and in case the velocity (v) of the object (3) determined in dependency of the fused first and second sensor data (15, 16) is different from zero, then the final corrected velocity (v’) of the object (3) is set to zero.
11. Method according to one of the preceding claims,
characterized in that
in dependency of the fused first and second sensor data (15, 16) at least one second position parameter (v, d) of the object (3) is determined and the at least one second position parameter (v, d) is corrected in dependency of only the second sensor data (16, 15) and the result is provided as corrected at least one second position parameter (v’, d’).
12. Computer program product comprising program code stored in a computer readable medium, and which when executed by a processor of an electronic control device (9) causes the processor to perform a method according to one of the preceding claims.
13. Driver assistance system (2) for determining at least one first position parameter (d, d’, v, v’) of an object (3) in an environment (4) of a vehicle (1) on the basis of fused sensor data (18), wherein
the driver assistance system (2) comprises at least one first sensor (5, 6), which is configured to capture first sensor data (15, 16) relating to the object (3);
the driver assistance system (2) comprises at least one second sensor (6, 5), which is configured to capture second sensor data (16, 15) relating to the object (3);
the driver assistance system (2) comprises a control unit (9), which is configured to fuse the captured first and second sensor data (15, 16); and
the control unit (9) is configured to determine the at least one first position parameter (d, v) of the object (3) in dependency of the fused first and second sensor data (15, 16);
characterized in that
the control unit (9) is configured to correct the at least one first position parameter (d, v) in dependency of only the first sensor data (15, 16) and to provide the result as corrected at least one first position parameter (d’, v’).
14. Vehicle (1) comprising a driver assistance system (2) according to claim 13.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102018122092.8 | 2018-09-11 | ||
DE102018122092.8A DE102018122092A1 (en) | 2018-09-11 | 2018-09-11 | Method for determining at least one position parameter of an object in an environment of a motor vehicle, computer program product, driver assistance system and motor vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020053165A1 true WO2020053165A1 (en) | 2020-03-19 |
Family
ID=67928833
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2019/074026 WO2020053165A1 (en) | 2018-09-11 | 2019-09-10 | Method for determining at least one position parameter of an object in an environment of a vehicle, computer program product, driver assistance system and vehicle |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE102018122092A1 (en) |
WO (1) | WO2020053165A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2025046916A1 (en) * | 2023-08-29 | 2025-03-06 | ソニーグループ株式会社 | Information processing device, information processing method, and program |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102021208349B3 (en) | 2021-08-02 | 2022-12-15 | Continental Autonomous Mobility Germany GmbH | Method and sensor system for merging sensor data and vehicle with a sensor system for merging sensor data |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6590521B1 (en) * | 1999-11-04 | 2003-07-08 | Honda Giken Gokyo Kabushiki Kaisha | Object recognition system |
US20130335569A1 (en) * | 2012-03-14 | 2013-12-19 | Honda Motor Co., Ltd. | Vehicle with improved traffic-object position detection |
US8799201B2 (en) | 2011-07-25 | 2014-08-05 | Toyota Motor Engineering & Manufacturing North America, Inc. | Method and system for tracking objects |
US20170285161A1 (en) | 2016-03-30 | 2017-10-05 | Delphi Technologies, Inc. | Object Detection Using Radar And Vision Defined Image Detection Zone |
- 2018
- 2018-09-11 DE DE102018122092.8A patent/DE102018122092A1/en active Pending
- 2019
- 2019-09-10 WO PCT/EP2019/074026 patent/WO2020053165A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
DE102018122092A1 (en) | 2020-03-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109099920B (en) | Sensor target accurate positioning method based on multi-sensor association | |
DK180393B1 (en) | DATA FUSION SYSTEM FOR A VEHICLE EQUIPPED WITH NON-SYNCHRONIZED PERCEPTION SENSORS | |
JP6256531B2 (en) | Object recognition processing device, object recognition processing method, and automatic driving system | |
CN109804270B (en) | Motor vehicle and method for 360 DEG environmental detection | |
CN111352413B (en) | Omnidirectional sensor fusion system and method and vehicle comprising same | |
KR101961571B1 (en) | Object recognition device using plurality of object detection means | |
JP6708730B2 (en) | Mobile | |
US11136034B2 (en) | Travel control method and travel control device | |
EP3671272A1 (en) | Vehicle sensor fusion based on fuzzy sets | |
KR20210061875A (en) | Method for detecting defects in the 3d lidar sensor using point cloud data | |
JP7596598B2 (en) | Method and apparatus for detecting decalibration of a sensor for detecting surroundings of a vehicle and vehicle | |
JP6490747B2 (en) | Object recognition device, object recognition method, and vehicle control system | |
KR20200068258A (en) | Apparatus and method for predicting sensor fusion target in vehicle and vehicle including the same | |
JP2019015606A (en) | Information processing method and information processing apparatus | |
WO2020053165A1 (en) | Method for determining at least one position parameter of an object in an environment of a vehicle, computer program product, driver assistance system and vehicle | |
EP3467545A1 (en) | Object classification | |
US20230419649A1 (en) | Method for Temporal Correction of Multimodal Data | |
JP2019190847A (en) | Stereo camera device | |
WO2019239775A1 (en) | Vehicle object sensing device | |
JP6075377B2 (en) | COMMUNICATION DEVICE, COMMUNICATION SYSTEM, COMMUNICATION METHOD, AND PROGRAM | |
CN103180877A (en) | Determination of the base width of a stereo image capture system | |
EP3971525B1 (en) | Self-position correction method and self-position correction device | |
CN109270523B (en) | Multi-sensor data fusion method and device and vehicle | |
KR102019383B1 (en) | Apparatus for tracking vehicle and operating method thereof | |
KR20240152931A (en) | Sensor error monitoring and detection in inertial measurement systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19766233, Country of ref document: EP, Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 19766233, Country of ref document: EP, Kind code of ref document: A1 |