WO2024104775A1 - Maschinell gelernte verkehrssituations-vervollständigung - Google Patents
Maschinell gelernte verkehrssituations-vervollständigung
- Publication number
- WO2024104775A1 (application PCT/EP2023/080306)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- model
- data set
- algorithm
- road users
- trajectories
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/66—Radar-tracking systems; Analogous systems
- G01S13/72—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
- G01S13/723—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
- G01S13/726—Multiple target tracking
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/66—Sonar tracking systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/93—Sonar systems specially adapted for specific applications for anti-collision purposes
- G01S15/931—Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/66—Tracking systems using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
- G01S7/4004—Means for monitoring or calibrating of parts of a radar system
- G01S7/4039—Means for monitoring or calibrating of parts of a radar system of sensor or antenna obstruction, e.g. dirt- or ice-coating
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/41—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
- G01S7/417—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section involving the use of neural networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0112—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
- G08G1/0133—Traffic data processing for classifying traffic situation
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0137—Measuring and analyzing of parameters relative to traffic conditions for specific applications
- G08G1/0145—Measuring and analyzing of parameters relative to traffic conditions for specific applications for active traffic flow control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
- G08G1/163—Decentralised systems, e.g. inter-vehicle communication involving continuous checking
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9323—Alternative operation using light waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9324—Alternative operation using ultrasonic waves
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
Definitions
- the invention relates to a method for generating a model or an algorithm for completing data about other road users in a traffic situation that is incompletely recorded by sensors from the ego perspective of a vehicle, as well as to a corresponding system for completing such data.
- the behavior of other road users is typically predicted, particularly with regard to their trajectories, that is, in terms of the actions and reactions of those road users.
- relevant scenarios are typically considered, which are used, for example, in a bird's eye view simulation (a simulation with information from a bird's perspective) to model traffic environments with acting and reacting road users.
- these traffic environment models are based on complete descriptions of scenarios, the information for which is obtained, for example, by using drones for traffic monitoring.
- data from a bird's eye view obtained, for example, by using unmanned aerial vehicles or other stationary traffic monitoring systems, are usually not available during regular operation of a vehicle.
- the object of the invention is therefore to enable incomplete traffic data, recorded in particular from the ego perspective of a road user (and thus subject to so-called individual shadowing, i.e. occlusion), to be used reliably by completing them.
- a first aspect of the invention relates to a method for generating a model or an algorithm for completing data about other road users in a traffic situation that is incompletely recorded by sensors from a first-person perspective of a vehicle, comprising the steps:
- the complete data set includes the trajectory of an ego vehicle and the trajectories of all other road users in the traffic situation within a given time window;
- the model or algorithm generated by the method according to the invention serves - when fully generated - to complete data about other road users in a traffic situation recorded by sensors from a first-person perspective of a vehicle.
- whether a model or an algorithm is used is a matter of interpretation.
- an artificial neural network, for example, can be used as a model.
- other estimators can also be applied, such as a hidden Markov model, in which the incomplete data are viewed as emissions of the system.
- an algorithm is a sequential series of instructions.
- a Kalman filter is an algorithm, not a model.
- although the model and the algorithm differ in the points mentioned above, they are functionally equivalent for providing a system suitable for completing data about other road users in a traffic situation recorded by sensors from the ego perspective of an (ego) vehicle.
- the traffic situation includes at least the trajectories of the vehicle under consideration, also referred to as ego vehicle, as well as of the other road users.
- the term trajectory is also justified if the ego vehicle or the other road users are not moving, since a trajectory carries time information; a static position can therefore likewise be described by a trajectory with time information.
- the complete data set therefore describes a traffic situation in which both the ego vehicle and the other road users find themselves.
- the ego vehicle is the vehicle under consideration which, in analogy to the training phase, could use the fully trained model or the fully designed algorithm in its regular operation to complete the data recorded by sensors from its ego perspective. It is not necessary for such a model or algorithm to already be provided for the ego vehicle in the complete data set: the complete data set only serves to train such a model or algorithm, and the system for executing the finished model or algorithm is provided for the later regular operation of a vehicle; it is not necessarily functionally included in the complete data set.
- the role of the ego vehicle in the complete data set defines the ego perspective to which the data of the complete data set, or of the already reduced data set, are to be aligned, as if the data had been recorded by sensors from this ego perspective. The reduced data set may need to be prepared as if its data had been recorded from the ego perspective of the vehicle executing the model or algorithm. It then does not matter whether the complete data set is first transformed (if not already in that form) as if it had been determined by sensors from the ego perspective of the ego vehicle in question and then reduced, or whether the complete data set is first reduced and the reduced data set is then transformed (if not already in that form) in the same way.
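The re-referencing into the ego perspective can be sketched as a simple rigid transform per time step. This is a minimal illustration, not the patent's implementation; the function names and the (x, y, heading) representation of poses are assumptions:

```python
import math

def to_ego_frame(point, ego_pos, ego_heading):
    """Transform a bird's-eye-view point (x, y) into the ego vehicle's
    local frame: translate by the ego position, then rotate so the
    ego heading points along the +x axis."""
    dx = point[0] - ego_pos[0]
    dy = point[1] - ego_pos[1]
    c, s = math.cos(-ego_heading), math.sin(-ego_heading)
    return (c * dx - s * dy, s * dx + c * dy)

def transform_trajectories(trajectories, ego_trajectory):
    """Re-express every road user's trajectory relative to the ego pose
    at each time step. Each trajectory is a list of (x, y) points;
    ego_trajectory is a list of ((x, y), heading) pairs per step."""
    out = []
    for traj in trajectories:
        out.append([to_ego_frame(p, ego_pos, heading)
                    for p, (ego_pos, heading) in zip(traj, ego_trajectory)])
    return out
```

A point one meter north of an ego vehicle heading north thus ends up one meter ahead (+x) in the ego frame, which is the consistency the training input requires.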
- the complete data set in its original form includes information from a bird's eye view.
- Possible training data sets from a drone perspective would be, for example, the HighD data set ("highD Dataset" from www.highd-dataset.com) or, in urban areas, the inD data set ("inD Dataset" from www.ind-dataset.com).
- the completeness of the data means that a so-called "ground truth" is available.
- a bird's eye view cannot be obtained from the ego perspective using the ego vehicle's own sensors. A transformation must therefore be carried out that references the data to the sensor view of the ego vehicle. This maintains consistency: the model or algorithm is trained on input data with the same reference and perspective as in the later operation of the vehicle, when the fully trained model or fully designed algorithm is applied.
- a transformation into the ego perspective of the ego vehicle is not necessary if already interpreted sensor data are used in the respective data set, both for training the model or designing the algorithm and during the subsequent operation of the fully trained model or fully designed algorithm, for example determined trajectories of other road users that are recognizable to the sensors of the ego vehicle.
- the input data of the model or algorithm no longer contains any information about the perspective with which the other road users were recorded. Rather, the perspective is dissolved and abstracted into information about the traffic situation.
- the reduction of the complete data set by one or more of the other road users to form a reduced data set is preferably carried out only to the extent that at least one other road user always remains in the reduced data set, so that the behavior of this remaining road user provides a data basis for estimating the behavior of the others.
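The reduction step can be sketched as enumerating subsets of the other road users while always keeping the ego trajectory and at least one other user. The dict-of-trajectories representation and all names here are illustrative assumptions, not the patent's data format:

```python
from itertools import combinations

def reduced_datasets(complete, min_remaining=1):
    """Enumerate reduced data sets by removing subsets of the other
    road users from a complete data set, always keeping at least
    `min_remaining` of them so a behavioral cue survives.
    `complete` maps a road-user id to its trajectory; the ego
    trajectory is stored under the reserved key 'ego' and is never
    removed. Keeping strictly fewer than all others guarantees that
    each result is a genuine reduction."""
    others = [k for k in complete if k != 'ego']
    results = []
    for keep in range(min_remaining, len(others)):
        for kept in combinations(others, keep):
            reduced = {'ego': complete['ego']}
            reduced.update({k: complete[k] for k in kept})
            results.append(reduced)
    return results
```

For a scene with three other road users this yields six reduced data sets (three with one user kept, three with two), i.e. a plurality of training samples from a single complete recording.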
- the model or algorithm is created by machine learning. In the case of artificial neural networks, this can be done by so-called "back propagation". In particular, machine learning is carried out in a form of supervised learning.
- the parameters (weights) - and in rare complex cases, optionally also the structure of the artificial neural network such as the number of levels (so-called "layers") - are adjusted.
- a prerequisite for creation is always that input data are specified together with associated output data, and that the mapping performed by the model or algorithm is adjusted, in particular iteratively, until the model or algorithm, given only the input data, independently arrives at output data that correspond to the specified ones.
- the fully trained model or the fully designed algorithm is then able to determine, from current real-world input data, corresponding output data that match reality as intended; in this case, the correct completion of data on other road users.
- machine learning is carried out with the specified output variables of the model or algorithm from the complete data set in such a way that data from the complete data set is used that is not present in the reduced data set.
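The supervised scheme (input from the reduced data set, target taken from the complete data set and absent from the reduced one) can be illustrated with a deliberately tiny one-dimensional stand-in fitted by gradient descent, the scalar analogue of backpropagation. The linear form and all names are assumptions for illustration only; the patent's model is far richer:

```python
def train_completion_model(pairs, lr=0.1, epochs=200):
    """Fit y = w * x + b so that a feature of an observed road user (x)
    predicts a feature of the removed road user (y), by minimizing the
    mean squared error via gradient descent. `pairs` holds (x, y)
    examples where y comes from the complete data set only."""
    w, b = 0.0, 0.0
    n = len(pairs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in pairs:
            err = (w * x + b) - y      # error against the held-out target
            gw += 2 * err * x / n      # d(loss)/dw
            gb += 2 * err / n          # d(loss)/db
        w -= lr * gw
        b -= lr * gb
    return w, b
```

Trained on examples generated from the relation y = 2x + 1, the parameters converge to approximately w = 2 and b = 1, mirroring how the full model learns to reproduce the removed trajectories.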
- the specified output variables include all data from the complete data set.
- the model or algorithm is also able to detect whether all relevant actors of the scenario are present in the data set.
- the reduced data sets, and in particular a complete data set reduced in several different ways, can be used for further purposes, in particular for training, validation and testing of the resulting models and algorithms.
- the model may comprise a large number of sub-models, for example by executing a separate, definable sub-model for each estimated additional road user and their trajectories. This large number of sub-models is nevertheless subsumed under the term "model" as used above and below. The same applies to the algorithm.
- machine learning makes use of implicitly known situations and patterns, such as the reaction of another road user that can be observed by the ego vehicle's sensors to a third road user that cannot be detected by the ego vehicle's sensors.
- a reaction can be, for example, swerving, giving way, or something similar.
- the presence and trajectory of the third road user can thus be deduced from the reaction of the other road user that is detected by the sensors - this corresponds to the completion of the data set, which in the present example is initially incomplete and initially only includes the other road user that can be observed by the ego vehicle's sensors.
- the same complete data set can be used to provide the initial data in each iteration, or the complete data set can be replaced to provide a new data basis for another large number of reduced data sets.
- the term "plausible" is used here because the model or algorithm is intended to complete the traffic situation realistically, so that the reconstructed, completed traffic situation would lead to the same data set as the one recorded by the vehicle. Nevertheless, the actually recorded situation can differ from the reconstructed one; for example, a real ego vehicle would slow down for crossing traffic regardless of whether the crossing traffic is a truck or a bicycle.
- a plurality of different reduced data sets are obtained from a single complete data set.
- the model or the algorithm comprises a plausibility check that is provided with an estimate of the respective trajectories of the road users, both those detectable and those not detectable by the ego vehicle's sensors. If the estimated trajectory of a sensor-detectable road user deviates from that road user's trajectory as determined from sensor data, the model or algorithm generates an estimated trajectory of a road user not detectable by sensors and adds it for completion. The plausibility check itself is adapted by machine learning when the model or algorithm is generated.
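The trigger condition of such a plausibility check can be sketched as a deviation test between estimated and measured trajectories. The mean point-wise distance metric, the threshold value, and the names are illustrative assumptions:

```python
def plausibility_check(sensed, estimated, threshold=1.0):
    """Compare a sensor-determined trajectory with the model's estimate
    for the same (detectable) road user. If the mean point-wise
    distance exceeds `threshold`, flag that an unseen road user is
    likely influencing the scene, i.e. a completion should be added.
    Both arguments are lists of (x, y) points of equal length."""
    deviation = sum(
        ((ex - sx) ** 2 + (ey - sy) ** 2) ** 0.5
        for (sx, sy), (ex, ey) in zip(sensed, estimated)
    ) / len(sensed)
    return deviation > threshold, deviation
```

A matching pair of trajectories yields no flag; a systematic offset (e.g. unexpected braking) exceeds the threshold and triggers the hypothesis of a hidden road user.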
- heuristics that remain unchanged during generation are implemented in the model or in the algorithm.
- the basic procedure for training the model or designing the algorithm is that one or more plausible trajectories are estimated for each road user on the basis of one or more models. If a road user deviates from these trajectories, i.e. behaves in a conspicuous or implausible manner, an attempt is made to convert this deviation into plausible behavior by adding further road users.
- the machine learning during the generation of the model or algorithm is carried out with the restriction that additional road users who cannot be detected by sensors are added to complete the model and are placed by estimation at locations that lie outside a sensory detection range of the ego vehicle.
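The placement restriction can be sketched as projecting any hypothesized, non-detectable road user to just outside the ego vehicle's sensor range. The circular range model, the one-meter margin, and the names are assumptions for illustration:

```python
import math

def place_outside_range(candidate, ego_pos, sensor_range, margin=1.0):
    """Ensure a hypothesized road user is placed outside the ego
    vehicle's (here: circular) sensory detection range. Positions
    already outside are kept; positions inside are pushed radially
    outward to sensor_range + margin."""
    dx = candidate[0] - ego_pos[0]
    dy = candidate[1] - ego_pos[1]
    dist = math.hypot(dx, dy)
    if dist > sensor_range:
        return candidate                      # already undetectable
    if dist == 0.0:
        # degenerate case: push straight ahead of the ego vehicle
        return (ego_pos[0] + sensor_range + margin, ego_pos[1])
    scale = (sensor_range + margin) / dist    # radial projection factor
    return (ego_pos[0] + dx * scale, ego_pos[1] + dy * scale)
```

This keeps the learned completion consistent with the constraint that the added road user was, by definition, not detected by the sensors.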
- a further aspect of the invention relates to a system for completing data about other road users in a traffic situation that is incompletely recorded by sensors from an ego perspective of a vehicle, wherein the system is designed to execute a model or an algorithm, wherein input data based on sensor information from an ego vehicle and comprising data about the trajectories of other road users, which may be incomplete, are used to execute the model or the algorithm, and a complete data set generated by plausible reconstruction is obtained as output data of the model or the algorithm, which data includes trajectories of the other road users for the current traffic situation that are not directly contained in the sensor information of the ego vehicle.
- the system is designed for a first ego vehicle and to link sensor information transmitted from a second ego vehicle by data fusion in order to expand the first ego vehicle's own sensor information and reduce the amount of data to be completed.
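The data fusion between two ego vehicles can be sketched as a union of their sensed road-user sets: everything seen by either vehicle needs no completion, shrinking the remainder to be estimated. The shared world frame and the conflict rule (first vehicle wins) are simplifying assumptions:

```python
def fuse_sensor_views(view_a, view_b):
    """Fuse the road users seen by two ego vehicles. Each view maps a
    road-user id to its trajectory, assumed here to be expressed in a
    shared world frame. Users present in either view are taken as
    sensed; on conflict, vehicle A's measurement is kept."""
    fused = dict(view_a)
    for uid, traj in view_b.items():
        fused.setdefault(uid, traj)
    return fused
```

Only road users absent from the fused view then remain candidates for plausible completion, which is exactly the reduction of the completion workload described above.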
- the system is designed to determine and output a plurality of possible plausibly completed data.
- each of the completed data sets is provided with the value of its respective calculated probability.
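Attaching probabilities to alternative completions can be sketched as normalizing raw plausibility scores. The scoring scheme and names are illustrative assumptions:

```python
def rank_completions(completions):
    """Normalize raw plausibility scores of alternative completions
    into probabilities and return them sorted, most likely first.
    `completions` is a list of (scenario, raw_score) pairs with
    positive scores."""
    total = sum(score for _, score in completions)
    ranked = [(scenario, score / total) for scenario, score in completions]
    ranked.sort(key=lambda pair: pair[1], reverse=True)
    return ranked
```

A downstream decision module can then consume all alternatives together with their probabilities, rather than a single point estimate.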
- the system is designed to carry out a plausibility analysis for plausible reconstruction, which includes a completeness analysis, wherein the completeness analysis checks whether the existing estimated number of other road users is sufficient to be able to plausibly explain the trajectories of all other road users, wherein the trajectories of all other road users include the trajectories of other road users determined by sensor data and the trajectories of other estimated existing road users estimated by completion.
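The completeness analysis reduces to the question of whether every conspicuously behaving road user has an assigned cause, sensed or hypothesized. This sketch of that check uses assumed names and a simplified cause mapping:

```python
def completeness_check(conspicuous_users, hypothesized_causes):
    """A scene explanation is complete when every conspicuously
    behaving road user has at least one cause assigned: another
    sensed road user or a hypothesized, non-detectable one.
    `conspicuous_users` lists ids whose behavior needs explaining;
    `hypothesized_causes` maps such an id to the id of its cause.
    Returns (is_complete, list_of_unexplained_users)."""
    missing = [u for u in conspicuous_users if u not in hypothesized_causes]
    return len(missing) == 0, missing
```

If the check fails, the remaining unexplained users indicate that the estimated number of road users is still insufficient and further completions must be added.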
- Fig. 1 A method for generating a model or an algorithm for completing data about other road users in a traffic situation that is incompletely recorded by sensors from an ego perspective of a vehicle according to an embodiment of the invention.
- Fig. 2 A system for completing data about other road users in a traffic situation that is incompletely recorded by sensors from an ego perspective of a vehicle according to an embodiment of the invention.
- the representations in the figures are schematic and not to scale.
- Fig. 1 shows a method for generating a model or an algorithm for completing data about other road users 3 in a traffic situation that is incompletely recorded by sensors from a first-person perspective of a vehicle.
- the steps of the method include:
- Fig. 2 shows a system in use for completing data about other road users 3 in a traffic situation that is incompletely recorded by sensors from an ego perspective of a vehicle.
- the system is based on the results of the method in Fig. 1 and involves executing a model, whereby input data based on sensor information from an ego vehicle 1 and including data about the trajectories of other road users 3, which may be incomplete, are used to execute the model.
- the ego vehicle 1 has a large number of sensors, such as lidar, radar, ultrasonic distance sensors, stereo cameras, etc.; on the one hand, the sensors all have finite ranges, and on the other hand, in some situations, such as the one shown in Fig. 2, parts of the scene are occluded.
- the other road user 3 located at the lower right is therefore not detected. However, another road user 3 approaching the right-hand road branch is detected by the sensors of the ego vehicle 1, so that its trajectory can be determined directly from the sensor data of the ego vehicle 1. From the trajectory executed by this other road user 3 approaching from the right, the ego vehicle 1 recognizes that it is reacting to something: it is braking and slowly approaching the road branch that, from its perspective, joins from the right.
- the trajectory of the additional road user 3 is therefore known to the ego vehicle 1 insofar as the initial data of the model include that this additional road user 3 is in the upper road from the perspective of the ego vehicle 1, with a downward direction of travel. This additional road user 3 thus appears in a right-of-way situation for the sensor-detectable other road user 3, and it can be expected that the additional road user 3 from above will soon be in an area relevant to the ego vehicle 1.
- two alternative completions thus arise (a bifurcation); both alternatives are output by the model and transferred to a decision module of the ego vehicle 1 to determine its own further behavior.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP23800768.6A EP4619961A1 (de) | 2022-11-17 | 2023-10-31 | Maschinell gelernte verkehrssituations-vervollständigung |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102022212266.6A DE102022212266A1 (de) | 2022-11-17 | 2022-11-17 | Maschinell gelernte Verkehrssituations-Vervollständigung |
| DE102022212266.6 | 2022-11-17 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024104775A1 true WO2024104775A1 (de) | 2024-05-23 |
Family
ID=88695417
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2023/080306 Ceased WO2024104775A1 (de) | 2022-11-17 | 2023-10-31 | Maschinell gelernte verkehrssituations-vervollständigung |
Country Status (3)
| Country | Link |
|---|---|
| EP (1) | EP4619961A1 (de) |
| DE (1) | DE102022212266A1 (de) |
| WO (1) | WO2024104775A1 (de) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018176000A1 (en) * | 2017-03-23 | 2018-09-27 | DeepScale, Inc. | Data synthesis for autonomous control systems |
| US20200353943A1 (en) * | 2019-05-07 | 2020-11-12 | Foresight Ai Inc. | Driving scenario machine learning network and driving environment simulation |
| WO2021178299A1 (en) * | 2020-03-04 | 2021-09-10 | Nec Laboratories America, Inc. | Multi-agent trajectory prediction |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102014203805B4 (de) * | 2014-03-03 | 2025-07-17 | Bayerische Motoren Werke Aktiengesellschaft | Fahrerassistenzsystem zur Erkennung eines sensorisch nicht erfassbaren Kollisionspartners |
| DE102017212277B3 (de) * | 2017-07-18 | 2018-09-06 | Robert Bosch Gmbh | Gefahrerkennung bei beabsichtigtem Spurwechsel |
| DE102020004341A1 (de) * | 2020-07-20 | 2020-11-19 | Daimler Ag | Verfahren zum automatischen Queren eines Kreuzungsbereichs mit einem Fahrzeug |
| DE102021003286A1 (de) * | 2021-06-25 | 2022-01-20 | Daimler Ag | Verfahren zum Betrieb eines Fahrzeuges |
- 2022-11-17: DE application DE102022212266.6A filed (published as DE102022212266A1); status: active, pending
- 2023-10-31: PCT application PCT/EP2023/080306 filed (published as WO2024104775A1); status: ceased
- 2023-10-31: EP application EP23800768.6A filed (published as EP4619961A1); status: active, pending
Also Published As
| Publication number | Publication date |
|---|---|
| EP4619961A1 (de) | 2025-09-24 |
| DE102022212266A1 (de) | 2024-05-23 |
Similar Documents
| Publication | Title |
|---|---|
| EP3970077B1 (de) | Method for training at least one algorithm for a control unit of a motor vehicle, computer program product, motor vehicle, and system |
| EP3765927B1 (de) | Method for generating a training data set for training an artificial intelligence module for a control device of a vehicle |
| EP3748453B1 (de) | Method and device for automatically executing a control function of a vehicle |
| DE102016212700A1 (de) | Method and system for controlling a vehicle |
| EP3543985A1 (de) | Simulating various traffic situations for a test vehicle |
| EP3616184A1 (de) | Method, computer program product, computer-readable medium, control unit, and vehicle comprising the control unit for determining a collective maneuver of at least two vehicles |
| DE102019216836A1 (de) | Method for training at least one algorithm for a control unit of a motor vehicle, computer program product, and motor vehicle |
| DE102017006338B4 (de) | Method for efficient validation and safe application of autonomous and semi-autonomous vehicles |
| DE102019203712A1 (de) | Method for training at least one algorithm for a control unit of a motor vehicle, computer program product, motor vehicle, and system |
| EP3748454A1 (de) | Method and device for automatically executing a control function of a vehicle |
| DE102021004426A1 (de) | Method for training an autonomous driving function |
| DE102022132917A1 (de) | Method and system for determining the criticality and controllability of scenarios for automated driving functions |
| WO2019206792A1 (de) | Method and device for converting an input image of a first domain into an output image of a second domain |
| WO2022251890A1 (de) | Method and system for testing a driver assistance system for a vehicle |
| DE112020007538T5 (de) | Driving assistance device, learning device, driving assistance method, driving assistance program, trained-model generation method, and data carrier with trained-model generation program |
| DE102019101613A1 (de) | Simulating various traffic situations for a test vehicle |
| DE102020202540A1 (de) | Method for training at least one algorithm for a control unit of a motor vehicle, computer program product, and motor vehicle |
| AT524822B1 (de) | Method for testing a driver assistance system of a vehicle |
| DE102022208519A1 (de) | Computer-implemented method and computer program for motion planning of an ego driving system in a traffic situation, computer-implemented method for motion planning of an ego driving system in a real traffic situation, control unit for an ego vehicle |
| WO2024104775A1 (de) | Machine-learned traffic situation completion |
| DE102021126820A1 (de) | Method and processing device for controlling a driving assistance function, and driving assistance system |
| DE102023124870B4 (de) | Method and system for extended automated calibration of functions of a driver assistance system |
| EP4673930A1 (de) | Limited agent model use in hybrid traffic simulation with real trajectories |
| DE102018216719A1 (de) | Keyframe-based autonomous vehicle operation |
| DE202017105656U1 (de) | Predictive measuring system, actuator control system, and device for operating the predictive measuring system and/or the actuator control system |
Legal Events
| Code | Title | Description |
|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23800768; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 2023800768; Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 2023800768; Country of ref document: EP; Effective date: 20250617 |
| WWP | Wipo information: published in national office | Ref document number: 2023800768; Country of ref document: EP |