
WO2020164090A1 - Trajectory prediction for driving strategy - Google Patents

Trajectory prediction for driving strategy

Info

Publication number
WO2020164090A1
WO2020164090A1 (application PCT/CN2019/075159)
Authority
WO
WIPO (PCT)
Prior art keywords
predictor
hybrid
predictors
static
trajectory
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2019/075159
Other languages
English (en)
Inventor
Gaowei Xu
Yao Ge
Maximilian DOEMLING
Dominik Notz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bayerische Motoren Werke AG
Original Assignee
Bayerische Motoren Werke AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bayerische Motoren Werke AG filed Critical Bayerische Motoren Werke AG
Priority to EP19914873.5A priority Critical patent/EP3924795A4/fr
Priority to PCT/CN2019/075159 priority patent/WO2020164090A1/fr
Priority to CN201980091906.XA priority patent/CN113454555A/zh
Publication of WO2020164090A1 publication Critical patent/WO2020164090A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/3407Route searching; Route guidance specially adapted for specific applications
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06Road conditions
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66Radar-tracking systems; Analogous systems
    • G01S13/72Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
    • G01S13/726Multiple target tracking
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/415Identification of targets based on measurements of movement associated with the target
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • G06N3/0442Recurrent networks, e.g. Hopfield networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/20Static objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/40Dynamic objects, e.g. animals, windblown objects
    • B60W2554/402Type
    • B60W2554/4026Cycles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/40Dynamic objects, e.g. animals, windblown objects
    • B60W2554/402Type
    • B60W2554/4029Pedestrians
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Definitions

  • the present disclosure relates in general to automated driving vehicles, and more particularly, to trajectory predictions for driving strategies.
  • An automated driving vehicle (ADV), also known as a driverless car, self-driving car, or robotic car, is a vehicle capable of sensing its environment and navigating with little or no human input.
  • ADVs use a variety of techniques to detect their surroundings, such as radar, laser light, GPS, odometry and computer vision.
  • Advanced control systems interpret sensory information to identify appropriate navigation paths, as well as obstacles and relevant signage.
  • HMM: hidden Markov model.
  • the present disclosure aims to provide a method, an apparatus and a vehicle for trajectory prediction.
  • a computer-implemented method for trajectory prediction comprises: obtaining sensor data for an object surrounding a vehicle from at least one sensor installed on the vehicle; feeding the obtained sensor data into a hybrid predictor consisting of a plurality of predictors each with a weight assigned by the hybrid predictor, wherein the plurality of predictors comprise at least a free space predictor and a roadmodel-based predictor; for a current time instant, each of the plurality of predictors giving its own predictions for a predetermined period of time from the current time instant based on historical sensor data from the at least one sensor; the hybrid predictor determining a reliability of each one of the plurality of predictors; the hybrid predictor outputting weighted hybrid predictions for the predetermined period of time from the current time instant based on the respective predictions given by each of the plurality of predictors and the reliability of each one of the plurality of predictors, with a greater weight assigned to the predictor that gave a better prediction in one or more previous time instants; and providing a predicted trajectory based on the weighted hybrid predictions.
  • a trajectory prediction apparatus comprising a sensor data obtaining module configured to obtain sensor data for an object surrounding a vehicle from at least one sensor installed on the vehicle; and a hybrid predictor consisting of a plurality of predictors each with a weight assigned by the hybrid predictor, wherein the plurality of predictors comprises at least a free space predictor and a roadmodel-based predictor, and the hybrid predictor is configured to: receive the sensor data from the sensor data obtaining module; for a current time instant, use the free space predictor and the roadmodel-based predictor to each give its own predictions for a predetermined period of time from the current time instant based on historical sensor data from the at least one sensor; determine which one of the plurality of predictors gave a better prediction that is closer to the actual movement of the object in one or more previous time instants; and output weighted hybrid predictions for the predetermined period of time from the current time instant based on the respective predictions given by each of the plurality of predictors, with a greater weight assigned to the predictor that gave the better prediction.
  • a vehicle comprising at least one sensor configured to capture sensing data for objects surrounding the vehicle; the trajectory prediction apparatus according to the second embodiment; and a decision module configured to make vehicle control decisions based on the trajectories for objects surrounding the vehicle predicted by the trajectory prediction apparatus.
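The vehicle embodiment above chains three components: sensors feed a trajectory prediction apparatus, whose output feeds a decision module. A rough, hypothetical sketch of that wiring in Python is given below; the stub classes, their method names, and the braking rule are illustrative placeholders only, not structures defined by the patent.

```python
class StubSensor:
    """Stands in for the at least one sensor: returns per-object position histories."""
    def read(self):
        return {"obj_1": [(0.0, 0.0), (1.0, 0.1), (2.0, 0.2)]}  # one surrounding object

class StubTrajectoryPredictionApparatus:
    """Stands in for the trajectory prediction apparatus (e.g. apparatus 200)."""
    def predict(self, history):
        # naive constant-velocity extrapolation over five future time instances
        (x0, y0), (x1, y1) = history[-2], history[-1]
        return [(x1 + k * (x1 - x0), y1 + k * (y1 - y0)) for k in range(1, 6)]

class StubDecisionModule:
    """Stands in for the decision module: turns predicted trajectories into a control decision."""
    def decide(self, trajectories):
        # toy rule: brake if any predicted point comes within 3 m of the ego origin
        too_close = any(x * x + y * y < 9.0 for traj in trajectories.values() for x, y in traj)
        return "brake" if too_close else "keep_speed"

sensor, apparatus, decision = StubSensor(), StubTrajectoryPredictionApparatus(), StubDecisionModule()
trajectories = {obj: apparatus.predict(hist) for obj, hist in sensor.read().items()}
print(decision.decide(trajectories))  # sensor -> trajectory prediction -> decision
```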
  • the invention can handle scenarios where no lanes exist, such as road junctions.
  • the invention can predict the future trajectories and kinematic states of agents that may not move in lanes, such as pedestrians, scooters, bicycles, etc., because the free space predictor plays a significant role in these scenarios.
  • the invention can alleviate the oscillation of predicted results when agents are moving at very low velocities or are even static, because the hybrid predictor has a static agent classifier.
  • the invention improves the prediction accuracy of future vehicle trajectories and it is more flexible and independent of road structures.
  • Fig. 1 illustrates an exemplary road environment.
  • Fig. 2 shows a schematic block diagram of a trajectory prediction apparatus in accordance with one embodiment of the present invention.
  • Fig. 3 is a partial enlarged view of the road environment of Fig. 1.
  • Fig. 4 is another partial enlarged view of the road environment of Fig. 1.
  • Fig. 5 is a flow diagram of a method for trajectory prediction, in accordance with one embodiment of the present invention.
  • Fig. 6 illustrates an exemplary vehicle according to an embodiment of the present invention.
  • Fig. 7 illustrates a general hardware environment wherein the present disclosure is applicable in accordance with an exemplary embodiment of the present disclosure.
  • the term “vehicle” used throughout the specification refers to cars of any type, including but not limited to sedans, vans, trucks, buses, or the like. For simplicity, the invention is described with respect to “car”.
  • the term “A or B” used throughout the specification refers to “A and B” as well as “A or B”, rather than meaning that A and B are exclusive, unless otherwise specified.
  • there are several methods of trajectory prediction in the prior art; among them, the more typical are the roadmodel-based trajectory prediction and the movement-history-based trajectory prediction.
  • the roadmodel-based trajectory prediction is based on existing road models, and thus such trajectory prediction is only suitable for areas where road models exist. Also, it is only applicable to automobiles. In addition, this kind of trajectory prediction assumes that cars comply with traffic rules, for example, that cars will travel along the centerlines of lanes, change direction according to the lane direction indicators, not change lanes arbitrarily or illegally, and the like.
  • the movement-history-based trajectory prediction is based on the movement of an object over a period of time (also referred to as the "time horizon") to predict future trajectories. How these two trajectory predictions work in a road environment will be explained with respect to Fig. 1.
  • Fig. 1 illustrates an exemplary road environment. Shown in Fig. 1 is an exemplary three-way intersection comprising a main road in a vertical direction and a branch in a horizontal direction. The main road has two lanes, one being a straight-only lane and the other being a right-turn-only lane for cars to turn right into the branch. Also shown in Fig. 1 is a car 102 that is traveling on the right-turn lane and is about to reach the intersection. At this moment, a roadmodel-based trajectory prediction is able to recognize that the car 102 is traveling on the right-turn lane due to its knowledge of the road model of this intersection.
  • the roadmodel-based trajectory prediction thus gives a prediction that the car 102 will travel along the trajectory shown as the dashed line 104, i.e., turning right at the intersection and driving into the branch.
  • the movement-history-based trajectory prediction is unaware of the road model of the intersection. It gives predictions only based on the movement history of an object over a period of time.
  • the previous two observation positions 102' and 102" of the car 102 are schematically illustrated in Fig. 1. It will be appreciated that only two previous positions are shown for simplicity, and in practice, more historical locations (e.g., 5 or 10 locations) may be necessary to predict the trajectory.
  • the movement-history-based trajectory prediction thus predicts that the car 102 will travel along the trajectory shown by the dashed line 106, i.e., continuing to travel straight ahead.
  • the roadmodel-based trajectory prediction can generally predict the future trajectory of the vehicle more accurately.
  • such roadmodel-based trajectory prediction can only be applied in areas with road models, it must be assumed that objects obey traffic rules, and such prediction is only for cars.
  • the movement-history-based trajectory prediction is much more flexible. It works without such assumptions and it is not limited to cars.
  • Its disadvantage is that, since its prediction is based only on historical trajectories, such prediction is actually more similar to summarizing a past trajectory (such as finding a common smooth curve for the location points) and extending this trajectory into the future. Therefore, for a motion trajectory with a sharp change (such as a right-angle turn), the trajectory prediction may deviate greatly from the actual trajectory, and the prediction becomes less accurate the further the predicted future time is from the current time.
  • Fig. 2 shows a schematic block diagram of a trajectory prediction apparatus 200 in accordance with one embodiment of the present invention.
  • the trajectory prediction apparatus 200 may include a sensor data obtaining module 202, a hybrid predictor 204, and a trajectory predicting module 206.
  • the sensor data obtaining module 202 is configured to obtain sensor data for an object surrounding a vehicle from at least one sensor installed on the vehicle.
  • the hybrid predictor 204 may include a plurality of existing predictors.
  • the hybrid predictor 204 may include at least one free space predictor 208 and one roadmodel-based predictor 210.
  • the hybrid predictor 204 may optionally include a static object classifier 212.
  • the free space predictor 208 may give, as its prediction result, the positions of points on the movement trajectory curve corresponding to a plurality of future time points.
  • the roadmodel-based predictor 210 may give a prediction of future locations of the object based on its knowledge of the road model where the current location of the object is.
  • the free space predictor 208 (labeled as Predictor 1) and the roadmodel-based predictor 210 (labeled as Predictor 2) may each give a prediction of positions at a plurality of time instances (t_c+1, t_c+2, t_c+3, t_c+4, ...) within a future period of time (for example, 5 seconds or 10 seconds). The number of time instances may depend on the size of the time horizon and the sampling accuracy. A data structure such as the one shown in Table 1 below can therefore be contemplated:
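Table 1 itself is not reproduced in this text. Purely as an illustration of the per-predictor, per-time-instance layout it describes, the structure could be held in a mapping like the following sketch; the names `PredictedPoint`, `predictions` and the dummy values are assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class PredictedPoint:
    """Predicted 2D position of the object at one future time instance."""
    t: int    # index of the future time instance: 1 -> t_c+1, 2 -> t_c+2, ...
    x: float
    y: float

# One row per predictor, one column per future time instance, as Table 1 suggests.
# A 5 s horizon sampled once per second gives five entries per predictor (dummy values).
predictions: Dict[str, List[PredictedPoint]] = {
    "free_space": [PredictedPoint(k, 10.0 + 5.0 * k, 2.0) for k in range(1, 6)],
    "road_model": [PredictedPoint(k, 10.0 + 4.0 * k, 2.0 + 0.8 * k * k) for k in range(1, 6)],
}

for name, points in predictions.items():
    print(name, [(p.x, p.y) for p in points])
```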
  • Fig. 3 is a partial enlarged view of the road environment of Fig. 1.
  • in Fig. 3 it is illustrated that the car 102 has reached the intersection and has just begun to turn to the right. It is assumed that the current time point corresponding to the current position of the car 102 is t_c, and its position is marked as G_c. In addition, it can be seen that the previous positions of the car 102 are G_c-1, G_c-2, G_c-3, G_c-4, G_c-5, ..., respectively.
  • the predicted values respectively given by the free space predictor 208 and the roadmodel-based predictor 210 are located at the corresponding positions shown in the figure. It can be seen that the free space predictor 208 considers that the car 102 is traveling in a straight line based on the current position G_c and the previous positions G_c-1, G_c-2, G_c-3, G_c-4, and G_c-5 of the car 102, and thus its predicted values reflect that the car 102 will continue traveling forward along the straight line, which is the trajectory 106.
  • since the roadmodel-based predictor 210 is aware that the car 102 is traveling on the right-turn lane, the predicted values given by it reflect that the car 102 will travel along the trajectory 104. Thus it can be seen that, of the predictions made at the past time instance t_c-1 for the current time t_c, the predicted value given by the roadmodel-based predictor 210 is closer to the actual value G_c, and therefore the roadmodel-based predictor 210 is currently the more accurate and reliable predictor.
  • the hybrid predictor 204 is configured to fuse the predictions from a plurality of predictors and output fused prediction results.
  • the fusion can be done by assigning a weight to each predictor and using the weighted value as the fused prediction.
  • the Predictor 1 may be assigned a weight w_1 and the Predictor 2 may be assigned a weight w_2.
  • the fused prediction results can thus be calculated as a weighted combination of the individual predictions, for example in the form sketched below:
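The formula itself appears only as an image in the original publication and is not reproduced in this text. Assuming the plain convex-combination form implied by the surrounding description, it would read (the hat notation and superscripts are introduced here only for this sketch):

```latex
\hat{G}_{t} = w_1\,\hat{G}^{(1)}_{t} + w_2\,\hat{G}^{(2)}_{t},
\qquad w_1 + w_2 = 1,\quad w_1, w_2 \ge 0,
```

where \hat{G}^{(1)}_{t} and \hat{G}^{(2)}_{t} are the positions predicted for the future time instance t by Predictor 1 (the free space predictor 208) and Predictor 2 (the roadmodel-based predictor 210), respectively, and \hat{G}_{t} is the fused prediction.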
  • the hybrid predictor 204 assigns the roadmodel-based predictor 210 a higher weight, i.e., w_2 > w_1, when predicting the trajectory for the future time instances t_c+1, t_c+2, t_c+3, ....
  • the specific values of w_1 and w_2 can be determined based on the differences between the prediction each predictor gave for the current time instant and the actual position G_c, respectively.
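The patent does not spell out the exact weight formula. One plausible sketch assumes weights inversely proportional to each predictor's recent error against the observed position G_c; the function names, the inverse-error rule and the numbers below are illustrative assumptions only.

```python
from typing import List, Tuple

def assign_weights(errors: List[float], eps: float = 1e-6) -> List[float]:
    """Turn per-predictor errors (distance between the position each predictor
    predicted for the current time instant and the actual position G_c) into
    normalized weights: the smaller the error, the larger the weight."""
    inverse = [1.0 / (e + eps) for e in errors]
    total = sum(inverse)
    return [v / total for v in inverse]

def distance(p: Tuple[float, float], q: Tuple[float, float]) -> float:
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

# Toy numbers: at t_c-1 the free space predictor predicted (14.8, 2.0) for the current
# instant, the roadmodel-based predictor predicted (13.9, 3.1), and the observed
# position G_c is (14.0, 3.0).
g_c = (14.0, 3.0)
errors = [distance((14.8, 2.0), g_c), distance((13.9, 3.1), g_c)]
w_1, w_2 = assign_weights(errors)   # w_2 > w_1 because Predictor 2 was closer
print(round(w_1, 3), round(w_2, 3))
```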
  • Fig. 4 is also a partial enlarged view of the road environment of Fig. 1. Different from Fig. 3, it is now assumed that the current time instance is t_c+1 and the car 102 has traveled a certain distance forward along the trajectory 104.
  • the free space predictor 208 considered that the car 102 was only slightly off the line based on the current position G_c+1 and the previous positions G_c, G_c-1, G_c-2, G_c-3, G_c-4, and G_c-5 of the car 102, and accordingly, the predicted positions it gave reflect that the car 102 would travel along the trajectory 106', which only slightly deviates from the original trajectory 106.
  • the roadmodel-based predictor 210 is aware that the car 102 is traveling on the right-turn lane, so the predicted values given by it reflect that the car 102 will continue to travel along the trajectory 104. Similar to the case of Fig. 3, the hybrid predictor 204 will assign the roadmodel-based predictor 210 a higher weight.
  • furthermore, considering the actual position G_c+1 and the predictions respectively given by the two predictors at time t_c-1: the prediction given by the free space predictor 208 is still on the dashed line 106, and its distance to the actual position G_c+1 is even larger than the distance between the prediction it gave at time t_c and G_c+1. Meanwhile, the predictions given at time t_c-1 and time t_c by the roadmodel-based predictor 210 approximately coincide with G_c+1. Therefore, the hybrid predictor 204 is further confident that the reliability of the roadmodel-based predictor 210 is higher at this moment, and thus the roadmodel-based predictor 210 will be given a greater weight.
  • the hybrid predictor 204 gives the fused prediction values, so that the trajectory predicting module 206 can provide predicted trajectories based on the fused prediction values.
  • the hybrid predictor 204 may also optionally include a static object classifier 212.
  • it is well known that current sensors are more or less unstable or inaccurate. For example, for an object that is stationary, the sensors' observation data for that object may jitter over time. Therefore, performing a movement trajectory prediction on the observation data of a static object is a waste of resources and may also lead to erroneous trajectory prediction.
  • the static object classifier 212 may be configured to classify an object as static or non-static by analysis of sensor data.
  • the static object classifier 212 may distinguish between static and non-static objects by, for example, setting thresholds for different types of sensor data, such as classifying an object as static when detecting that the speed of the object is below a certain predetermined threshold.
  • the static object classifier 212 may also analyze the distribution of sensor data over time, and classify an object as static when the data exhibits only a small fluctuation around a certain value (e.g., a velocity value of zero and an acceleration value of zero).
  • the hybrid predictor 204 outputs the current location of the static object as its prediction result, i.e., predicting that the static object will remain stationary.
  • the static object classifier 212 may optionally further smooth the observation data using a Kalman filter, and then feed it to the free space predictor 208 and the roadmodel-based predictor 210.
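A minimal sketch of the static classification and optional smoothing described above, assuming a simple speed threshold plus a position-fluctuation check, and a tiny constant-position Kalman update for damping jitter. The thresholds, function names and filter form are assumptions for illustration, not values from the patent.

```python
import numpy as np

def is_static(speeds: np.ndarray, positions: np.ndarray,
              speed_thresh: float = 0.3, pos_std_thresh: float = 0.2) -> bool:
    """Classify an object as static when its observed speed stays below a threshold
    and its recent positions only fluctuate slightly around one point."""
    low_speed = np.all(np.abs(speeds) < speed_thresh)
    small_jitter = np.all(np.std(positions, axis=0) < pos_std_thresh)
    return bool(low_speed and small_jitter)

def kalman_smooth_1d(measurements: np.ndarray, q: float = 1e-3, r: float = 0.05) -> np.ndarray:
    """Very small constant-position Kalman filter used only to damp sensor jitter
    before the data is fed to the predictors (process noise q, measurement noise r)."""
    x, p = float(measurements[0]), 1.0
    smoothed = []
    for z in measurements:
        p = p + q                      # predict
        k = p / (p + r)                # Kalman gain
        x = x + k * (z - x)            # update with measurement z
        p = (1.0 - k) * p
        smoothed.append(x)
    return np.array(smoothed)

# Jittery observations of a parked object
pos = np.array([[5.02, 1.01], [4.98, 0.97], [5.01, 1.03], [5.00, 0.99]])
spd = np.array([0.05, 0.02, 0.04, 0.03])
print(is_static(spd, pos))                   # True -> the predictors can be skipped
print(kalman_smooth_1d(pos[:, 0]).round(3))  # smoothed x coordinates
```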
  • Fig. 5 is a flow diagram of a method 500 for trajectory prediction in accordance with one embodiment of the present invention.
  • the method 500 begins at block 502, where sensor data for an object surrounding a vehicle may be obtained from at least one sensor installed on the vehicle.
  • the obtained sensor data may be fed into a hybrid predictor, at block 504.
  • the hybrid predictor of the present invention may consist of a plurality of predictors, each with a weight assigned by the hybrid predictor.
  • the plurality of predictors comprise at least a free space predictor, such as the free space predictor 208 in Fig. 2, and a roadmodel-based predictor, such as the roadmodel-based predictor 210.
  • each of the plurality of predictors may give its own predictions for a predetermined period of time from the current time instant.
  • the free space predictor 208 may give its predictions based on historical sensor data from the at least one sensor as previously described with regard to Fig. 3.
  • the roadmodel-based predictor 210 may also give its predictions mainly based on its knowledge of the road model, while it also uses the historical sensor data to determine the position of the object in the road model.
  • the hybrid predictor 204 determines the reliability of each of the plurality of predictors. As an example, this can be determined by evaluating which of the predictors gave a better prediction that is closer to the actual movement of the object in one or more previous time instants, as previously described with regard to Figs. 3 and 4.
  • the hybrid predictor may output weighted hybrid predictions for the predetermined period of time from the current time instant based on the respective predictions given by each of the plurality of predictors, with greater weights being assigned to the more reliable predictors.
  • outliers may be dropped from the weighted hybrid predictions.
  • an iterative multivariate Gaussian model is used to fuse all the predictions of different sources with various time horizons after dropping outliers iteratively.
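The patent names an iterative multivariate Gaussian model but gives no equations here. A hedged sketch of one way such a step could work is shown below: fit a Gaussian to the predictions collected for a given future time instance, drop the worst point whose Mahalanobis distance exceeds a threshold, refit, and finally take the weighted mean of the survivors. The threshold, the one-point-per-iteration rule and all names are assumptions.

```python
import numpy as np

def fuse_with_outlier_dropping(points: np.ndarray, weights: np.ndarray,
                               maha_thresh: float = 2.0) -> np.ndarray:
    """Fuse several predicted positions (N x 2) for one future time instance.

    Iteratively fits a multivariate Gaussian to the currently kept points, drops the
    single worst point whose Mahalanobis distance to the mean exceeds maha_thresh,
    and returns the weighted mean of the points that survive."""
    keep = np.ones(len(points), dtype=bool)
    while keep.sum() > 2:
        kept = points[keep]
        mean = kept.mean(axis=0)
        cov = np.cov(kept.T) + 1e-6 * np.eye(2)          # regularize for stability
        inv_cov = np.linalg.inv(cov)
        diff = kept - mean
        maha = np.sqrt(np.einsum("ij,jk,ik->i", diff, inv_cov, diff))
        worst = int(np.argmax(maha))
        if maha[worst] <= maha_thresh:
            break                                        # no outliers left
        keep[np.flatnonzero(keep)[worst]] = False        # drop the worst point
    w = weights[keep] / weights[keep].sum()
    return (points[keep] * w[:, None]).sum(axis=0)

# Predictions for the same future time instance coming from different predictors and
# from predictions made at different (older) time instants; the last one is an outlier.
pts = np.array([[20.1, 5.0], [20.3, 5.2], [19.9, 4.9],
                [20.0, 5.1], [20.2, 5.0], [19.8, 5.1], [35.0, -2.0]])
wts = np.array([0.2, 0.2, 0.15, 0.15, 0.1, 0.1, 0.1])
print(fuse_with_outlier_dropping(pts, wts).round(2))     # close to the cluster mean
```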
  • a predicted trajectory may be provided based on the weighted hybrid predictions. The method 500 ends.
  • in the case where the hybrid predictor 204 has a static object classifier 212, the obtained sensor data may first be fed into the static object classifier 212 to classify the object as static or non-static.
  • if the static object classifier 212 classifies the object as static, the predicted trajectory can be directly provided based on the static state of the object, which means the predictions of the plurality of predictors are skipped.
  • the static object classifier 212 may optionally smooth the sensor data, such as by using a Kalman filter, before feeding it into the plurality of predictors.
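Pulling blocks 502-512 together, a hedged end-to-end sketch of method 500 is given below; the function names, the static check and the toy predictors are placeholders standing in for the modules described above, not APIs defined by the patent.

```python
from typing import Callable, Dict, List, Tuple

Point = Tuple[float, float]

def predict_trajectory(
    history: List[Point],
    predictors: Dict[str, Callable[[List[Point]], List[Point]]],
    weights: Dict[str, float],
    is_static: Callable[[List[Point]], bool],
) -> List[Point]:
    """Method 500 in miniature: classify, let each predictor predict, then fuse."""
    if is_static(history):
        # Static short-circuit: the predicted trajectory is the current position
        # repeated; the individual predictors are skipped entirely.
        return [history[-1]] * 5
    # Each predictor gives its own future positions from the same history ...
    per_predictor = {name: fn(history) for name, fn in predictors.items()}
    horizon = min(len(traj) for traj in per_predictor.values())
    total_w = sum(weights.values())
    fused: List[Point] = []
    # ... and the hybrid output is their weighted combination per time instance.
    for k in range(horizon):
        x = sum(weights[n] * per_predictor[n][k][0] for n in per_predictor) / total_w
        y = sum(weights[n] * per_predictor[n][k][1] for n in per_predictor) / total_w
        fused.append((x, y))
    return fused

# Toy usage: a constant-velocity "free space" stand-in and a curving "road model" stand-in.
hist: List[Point] = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
preds = {
    "free_space": lambda h: [(h[-1][0] + i, 0.0) for i in range(1, 6)],
    "road_model": lambda h: [(h[-1][0] + i, 0.2 * i * i) for i in range(1, 6)],
}
print(predict_trajectory(hist, preds, {"free_space": 0.3, "road_model": 0.7},
                         lambda h: False))
```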
  • Fig. 6 illustrates an exemplary vehicle 600 according to an embodiment of the present invention.
  • the vehicle 600 may comprise at least one sensor 602 configured to capture sensing data for objects surrounding the vehicle, a trajectory prediction apparatus 604 for providing trajectory predictions, such as the trajectory prediction apparatus 200 in Fig. 2, and a decision module 606 configured to make vehicle control decisions based on the trajectories for objects surrounding the vehicle predicted by the trajectory prediction apparatus 604.
  • Fig. 7 illustrates a general hardware environment 700 wherein the present disclosure is applicable in accordance with an exemplary embodiment of the present disclosure.
  • the computing device 700 may be any machine configured to perform processing and/or calculations, and may be, but is not limited to, a workstation, a server, a desktop computer, a laptop computer, a tablet computer, a personal data assistant, a smart phone, an on-vehicle computer or any combination thereof.
  • the aforementioned system may be wholly or at least partially implemented by the computing device 700 or a similar device or system.
  • the computing device 700 may comprise elements that are connected with or in communication with a bus 702, possibly via one or more interfaces.
  • the computing device 700 may comprise the bus 702, and one or more processors 704, one or more input devices 706 and one or more output devices 708.
  • the one or more processors 704 may be any kind of processor, and may comprise but are not limited to one or more general-purpose processors and/or one or more special-purpose processors (such as special processing chips).
  • the input devices 706 may be any kind of device that can input information to the computing device, and may comprise but are not limited to a mouse, a keyboard, a touch screen, a microphone and/or a remote control.
  • the output devices 708 may be any kind of device that can present information, and may comprise but are not limited to a display, a speaker, a video/audio output terminal, a vibrator and/or a printer.
  • the computing device 700 may also comprise or be connected with non-transitory storage devices 710 which may be any storage devices that are non-transitory and can implement data stores, and may comprise but are not limited to a disk drive, an optical storage device, a solid-state storage, a floppy disk, a flexible disk, hard disk, a magnetic tape or any other magnetic medium, a compact disc or any other optical medium, a ROM (Read Only Memory) , a RAM (Random Access Memory) , a cache memory and/or any other memory chip or cartridge, and/or any other medium from which a computer may read data, instructions and/or code.
  • the non-transitory storage devices 710 may be detachable from an interface.
  • the non-transitory storage devices 710 may have data/instructions/code for implementing the methods and steps which are described above.
  • the computing device 700 may also comprise a communication device 712.
  • the communication device 712 may be any kind of device or system that can enable communication with external apparatuses and/or with a network, and may comprise but is not limited to a modem, a network card, an infrared communication device, a wireless communication device and/or a chipset such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication facilities and/or the like.
  • when the computing device 700 is used as an on-vehicle device, it may also be connected to external devices, for example, a GPS receiver, and sensors for sensing different environmental data, such as an acceleration sensor, a wheel speed sensor, a gyroscope and so on. In this way, the computing device 700 may, for example, receive location data and sensor data indicating the travelling situation of the vehicle.
  • other facilities, such as an engine system, a wiper, an anti-lock braking system or the like, may also be connected to the computing device 700.
  • non-transitory storage device 710 may have map information and software elements so that the processor 704 may perform route guidance processing.
  • the output devices 708 may comprise a display for displaying the map, the location mark of the vehicle and also images indicating the travelling situation of the vehicle.
  • the output devices 708 may also comprise a speaker or an interface with an earphone for audio guidance.
  • the bus 702 may include but is not limited to Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus. Particularly, for an on-vehicle device, the bus 702 may also include a Controller Area Network (CAN) bus or other architectures designed for application on an automobile.
  • the computing device 700 may also comprise a working memory 714, which may be any kind of working memory that may store instructions and/or data useful for the working of the processor 704, and may comprise but is not limited to a random access memory and/or a read-only memory device.
  • Software elements may be located in the working memory 714, including but not limited to an operating system 716, one or more application programs 718, drivers and/or other data and codes. Instructions for performing the methods and steps described above may be comprised in the one or more application programs 718, and the units of the aforementioned trajectory prediction apparatus 200 may be implemented by the processor 704 reading and executing the instructions of the one or more application programs 718.
  • the executable codes or source codes of the instructions of the software elements may be stored in a non-transitory computer-readable storage medium, such as the storage device (s) 710 described above, and may be read into the working memory 714 possibly with compilation and/or installation.
  • the executable codes or source codes of the instructions of the software elements may also be downloaded from a remote location.
  • the present disclosure may be implemented by software with necessary hardware, or by hardware, firmware and the like. Based on such understanding, the embodiments of the present disclosure may be embodied in part in a software form.
  • the computer software may be stored in a readable storage medium such as a floppy disk, a hard disk, an optical disk or a flash memory of the computer.
  • the computer software comprises a series of instructions to make the computer (e.g., a personal computer, a service station or a network terminal) execute the method or a part thereof according to respective embodiment of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)

Abstract

Some examples of the present invention relate to a method, an apparatus, and a vehicle for trajectory prediction. The method comprises: obtaining sensor data for an object surrounding a vehicle from at least one sensor installed on the vehicle (502); feeding the obtained sensor data into a hybrid predictor consisting of a plurality of predictors each with a weight assigned by the hybrid predictor, wherein the plurality of predictors comprise at least a free space predictor and a roadmodel-based predictor (504); for a current time instant, each of the plurality of predictors giving its own predictions for a predetermined period of time from the current time instant based on historical sensor data from the at least one sensor (506); the hybrid predictor determining a reliability of each one of the plurality of predictors (508); the hybrid predictor outputting weighted hybrid predictions for the predetermined period of time from the current time instant based on the respective predictions given by each of the plurality of predictors and the reliability of each one of the plurality of predictors, with a greater weight assigned to the predictor that gave the better prediction at the previous time instant (510); and providing a predicted trajectory based on the weighted hybrid predictions (512).
PCT/CN2019/075159 2019-02-15 2019-02-15 Prédiction de trajectoire servant à une stratégie de conduite Ceased WO2020164090A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP19914873.5A EP3924795A4 (fr) 2019-02-15 2019-02-15 Prédiction de trajectoire servant à une stratégie de conduite
PCT/CN2019/075159 WO2020164090A1 (fr) 2019-02-15 2019-02-15 Prédiction de trajectoire servant à une stratégie de conduite
CN201980091906.XA CN113454555A (zh) 2019-02-15 2019-02-15 用于驾驶策略的轨迹预测

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/075159 WO2020164090A1 (fr) 2019-02-15 2019-02-15 Prédiction de trajectoire servant à une stratégie de conduite

Publications (1)

Publication Number Publication Date
WO2020164090A1 true WO2020164090A1 (fr) 2020-08-20

Family

ID=72044174

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/075159 Ceased WO2020164090A1 (fr) 2019-02-15 2019-02-15 Prédiction de trajectoire servant à une stratégie de conduite

Country Status (3)

Country Link
EP (1) EP3924795A4 (fr)
CN (1) CN113454555A (fr)
WO (1) WO2020164090A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112733270A (zh) * 2021-01-08 2021-04-30 浙江大学 车辆行驶轨迹预测和轨迹偏离危险度评估的系统与方法
CN113342005A (zh) * 2021-08-04 2021-09-03 北京三快在线科技有限公司 一种无人驾驶设备的横向控制方法及装置
CN115062202A (zh) * 2022-06-30 2022-09-16 重庆长安汽车股份有限公司 驾驶行为意图及轨迹的预测方法、装置、设备及存储介质
CN115140034A (zh) * 2022-06-27 2022-10-04 阿里巴巴达摩院(杭州)科技有限公司 碰撞风险检测方法、装置及设备
CN113844446B (zh) * 2021-10-14 2023-08-15 安徽江淮汽车集团股份有限公司 融合长短程的车辆轨迹预测方法

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115827082A (zh) * 2022-08-26 2023-03-21 中国银联股份有限公司 一种应用系统的弹性伸缩方法以及弹性伸缩系统
CN116299418B (zh) * 2022-11-28 2025-10-28 天津大学 基于多运动学模型融合的测距设备后端数据融合方法及其应用

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180074505A1 (en) * 2016-09-14 2018-03-15 Qualcomm Incorporated Motion planning and intention prediction for autonomous driving in highway scenarios via graphical model-based factorization
CN108475057A (zh) * 2016-12-21 2018-08-31 百度(美国)有限责任公司 基于车辆周围的情境预测车辆的一个或多个轨迹的方法和系统
US20180284810A1 (en) * 2017-03-28 2018-10-04 Continental Teves Ag & Co. Ohg Method for establishing a cooperation partner for executing a driving maneuver and a system
US20180374359A1 (en) * 2017-06-22 2018-12-27 Bakhi.com Times Technology (Beijing) Co., Ltd. Evaluation framework for predicted trajectories in autonomous driving vehicle traffic prediction
CN109131346A (zh) * 2017-06-27 2019-01-04 通用汽车环球科技运作有限责任公司 用于预测自主车辆中的交通模式的系统和方法
US20190025841A1 (en) * 2017-07-21 2019-01-24 Uber Technologies, Inc. Machine Learning for Predicting Locations of Objects Perceived by Autonomous Vehicles

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2562060B1 (fr) * 2011-08-22 2014-10-01 Honda Research Institute Europe GmbH Procédé et système de prédiction de comportement de mouvement d'un objet de trafic cible
DE102015221920A1 (de) * 2015-11-09 2017-05-11 Bayerische Motoren Werke Aktiengesellschaft Verfahren, Computerprogrammprodukt, Vorrichtung, und Fahrzeug umfassend die Vorrichtung zum Steuern einer Trajektorienplanung eines Egofahrzeugs
US10394245B2 (en) * 2016-11-22 2019-08-27 Baidu Usa Llc Method and system to predict vehicle traffic behavior for autonomous vehicles to make driving decisions
CN106950956B (zh) * 2017-03-22 2020-02-14 合肥工业大学 融合运动学模型和行为认知模型的行车轨迹预测系统

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180074505A1 (en) * 2016-09-14 2018-03-15 Qualcomm Incorporated Motion planning and intention prediction for autonomous driving in highway scenarios via graphical model-based factorization
CN108475057A (zh) * 2016-12-21 2018-08-31 百度(美国)有限责任公司 基于车辆周围的情境预测车辆的一个或多个轨迹的方法和系统
US20180284810A1 (en) * 2017-03-28 2018-10-04 Continental Teves Ag & Co. Ohg Method for establishing a cooperation partner for executing a driving maneuver and a system
US20180374359A1 (en) * 2017-06-22 2018-12-27 Bakhi.com Times Technology (Beijing) Co., Ltd. Evaluation framework for predicted trajectories in autonomous driving vehicle traffic prediction
CN109131346A (zh) * 2017-06-27 2019-01-04 通用汽车环球科技运作有限责任公司 用于预测自主车辆中的交通模式的系统和方法
US20190025841A1 (en) * 2017-07-21 2019-01-24 Uber Technologies, Inc. Machine Learning for Predicting Locations of Objects Perceived by Autonomous Vehicles

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3924795A4 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112733270A (zh) * 2021-01-08 2021-04-30 浙江大学 车辆行驶轨迹预测和轨迹偏离危险度评估的系统与方法
CN112733270B (zh) * 2021-01-08 2022-06-24 浙江大学 车辆行驶轨迹预测和轨迹偏离危险度评估的系统与方法
CN113342005A (zh) * 2021-08-04 2021-09-03 北京三快在线科技有限公司 一种无人驾驶设备的横向控制方法及装置
CN113844446B (zh) * 2021-10-14 2023-08-15 安徽江淮汽车集团股份有限公司 融合长短程的车辆轨迹预测方法
CN115140034A (zh) * 2022-06-27 2022-10-04 阿里巴巴达摩院(杭州)科技有限公司 碰撞风险检测方法、装置及设备
CN115140034B (zh) * 2022-06-27 2026-01-13 阿里巴巴达摩院(杭州)科技有限公司 碰撞风险检测方法、装置及设备
CN115062202A (zh) * 2022-06-30 2022-09-16 重庆长安汽车股份有限公司 驾驶行为意图及轨迹的预测方法、装置、设备及存储介质
CN115062202B (zh) * 2022-06-30 2024-10-01 重庆长安汽车股份有限公司 驾驶行为意图及轨迹的预测方法、装置、设备及存储介质

Also Published As

Publication number Publication date
EP3924795A4 (fr) 2022-12-21
CN113454555A (zh) 2021-09-28
EP3924795A1 (fr) 2021-12-22

Similar Documents

Publication Publication Date Title
JP6831420B2 (ja) 自動運転車の軌跡候補を評価するための方法
WO2020164090A1 (fr) Prédiction de trajectoire servant à une stratégie de conduite
US11851081B2 (en) Predictability-based autonomous vehicle trajectory assessments
US11945434B2 (en) Delay decision making for autonomous driving vehicles in response to obstacles based on confidence level and distance
US10816984B2 (en) Automatic data labelling for autonomous driving vehicles
US10824153B2 (en) Cost design for path selection in autonomous driving technology
US11731612B2 (en) Neural network approach for parameter learning to speed up planning for complex driving scenarios
US20200331476A1 (en) Automatic lane change with minimum gap distance
CN111857118B (zh) 对停车轨迹分段以控制自动驾驶车辆停车
CN108099918B (zh) 用于确定自主车辆的命令延迟的方法
EP3694756B1 (fr) Système de planification de stationnement vertical fondé sur une courbe en spirale pour véhicules à conduite autonome
US11260880B2 (en) Map-less and localization-less lane following method for autonomous driving of autonomous driving vehicles on highway
JP2019206327A (ja) 自動運転車両の軌道の生成方法
JP2019527862A (ja) 自律走行のための地図画像に基づく交通予測
CN113424209B (zh) 使用深度学习多预测器融合和贝叶斯优化的轨迹预测
JP2019192234A (ja) 複数のキューを利用したオブジェクト追跡
JP2019131177A (ja) 複数のスレッドを使用して自動運転車両に用いられる基準線を生成するための方法及びシステム
CN116745195A (zh) 车道外安全驾驶的方法和系统
KR102359497B1 (ko) 단일 차량 동작용으로 설계된 자율 주행 시스템에 따른 차량 플래툰 구현
US11136023B2 (en) Method for determining exiting intersection of moving objects for autonomous driving vehicles
CN111103876A (zh) 自动驾驶车辆的基于雷达通信的扩展感知
CN111684379B (zh) 自动驾驶车辆的三点转弯的最优规划器切换方法
US11242057B2 (en) Method for optimizing three-point turn of autonomous driving vehicles
CN111801638B (zh) 基于枚举的自动驾驶车辆的三点转弯规划

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19914873

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019914873

Country of ref document: EP

Effective date: 20210915