
GB2580908A - Method and device for selective active and passive sensing in automated driving applications - Google Patents


Info

Publication number
GB2580908A
GB2580908A GB1901102.2A GB201901102A GB2580908A GB 2580908 A GB2580908 A GB 2580908A GB 201901102 A GB201901102 A GB 201901102A GB 2580908 A GB2580908 A GB 2580908A
Authority
GB
United Kingdom
Prior art keywords
vehicle
drones
active
sensing
environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1901102.2A
Other versions
GB201901102D0 (en)
Inventor
Watzenig Daniel
Fuchs Anton
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kompetenzzentrum das Virtuelle Fahrzeug Forchungs GmbH
Original Assignee
Kompetenzzentrum das Virtuelle Fahrzeug Forchungs GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kompetenzzentrum das Virtuelle Fahrzeug Forchungs GmbH filed Critical Kompetenzzentrum das Virtuelle Fahrzeug Forchungs GmbH
Priority to GB1901102.2A priority Critical patent/GB2580908A/en
Publication of GB201901102D0 publication Critical patent/GB201901102D0/en
Publication of GB2580908A publication Critical patent/GB2580908A/en
Withdrawn legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18Propelling the vehicle
    • B60W30/182Selecting between different operative modes, e.g. comfort and performance modes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U80/00Transport or storage specially adapted for UAVs
    • B64U80/80Transport or storage specially adapted for UAVs by vehicles
    • B64U80/86Land vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/10Rotorcrafts
    • B64U10/13Flying platforms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/10UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • B64U2201/102UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS] adapted for flying in formations
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/10UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • B64U2201/104UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS] using satellite radio beacon positioning systems, e.g. GPS
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U70/00Launching, take-off or landing arrangements
    • B64U70/90Launching from or landing on platforms
    • B64U70/92Portable platforms

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method and device for selecting between active and passive sensing means in an automated vehicle using a set of drones, the method comprising: a risk assessment procedure based on superordinate control station information or in-vehicle sensor data (1, Fig.1) to classify between safe operation with active sensor principles and operation requiring drones (2); the deactivation of active in-vehicle sensing and communication devices; launching at least three drones (4), wherein the number of drones is based on vehicle speed, the environment, the number of potential tracks, or the degree of necessity to conceal the vehicle's position; establishing an ad-hoc network between the drones (5); sensing the vehicle's environment through camera-based and far-ranging active sensors (6); observing and estimating the vehicle's position by triangulation; and exchanging the exact vehicle position plus further relevant information regarding vehicle risks between the drones and the vehicle (7).

Description

Method and Device for Selective Active and Passive Sensing in Automated Driving Applications
Background of the invention
Highly automated and autonomous driving are comparatively new mobility technologies. Their field of application ranges from comfortable passenger transportation on road and rail to efficient 24/7 transportation of goods. All highly automated and autonomous driving strategies and operation principles have in common that (at least) three types of actions have to be carried out:
- Sense (i.e. determination of the environment, especially the road and potential obstacles and restrictions ahead of the vehicle)
- Plan (i.e. deriving the manoeuvre plan for driving the vehicle based on all available information on the environment)
- Act (i.e. executing actions such as accelerating, braking, steering, etc.)
For sensing the environment in automated and autonomous driving, active principles such as radar or lidar are typically used, where electromagnetic or sound waves are transmitted and the reflected, received signal is evaluated. These active systems are detectable and back-traceable to a certain extent and can unintentionally reveal the exact position of a vehicle. Active sensor principles are therefore not favourable for all applications or situations, especially for military applications.
An alternative technical approach is desirable in which the sensing action is, at least for certain situations or phases, not carried out actively by the vehicle. The present invention describes a method and device that avoid active environment sensing from the vehicle by establishing ad-hoc networks of drone-based external observers.
It is one intention of the present invention to provide a method and a device for extending the available information about the environment of a vehicle over a wider range than in-vehicle sensors are capable of providing.
State of the Art
In several state-of-the-art technologies, typically reserved for military, police and emergency services, either satellites or high-flying drones are used to obtain and provide information on the environment to ground vehicles. This information is mainly camera-image based (including infrared) and typically provided by a single satellite or drone.
For many applications these procedures require preparation time and cannot be applied to many vehicles at the same time due to the restricted availability of satellites or drones in a proper operating position.
Description of the Invention
The vehicle to be operated in highly automated and autonomous driving mode is equipped with cameras and on-board short-/mid-/far-ranging active sensors such as ultrasound, infrared, radar and lidar, as well as with a set of drones that can be launched from the vehicle. The vehicle provides communication devices (including a laser-based optical communication path). The drones are powered by batteries or all sorts of fuel (e.g. hydrogen for a fuel cell, other fuels) and also carry cameras, far-ranging active sensors and communication devices allowing for communication between drones and from the drones to the vehicle. The drones observe the vehicle position via camera and observe the scene mainly in front of the vehicle from an elevated position. The absolute position of each drone is provided via GPS technology and correction methods for the GPS signal, which all belong to the state of the art.
The method of the present invention comprises the following steps: In a first step the vehicle is operated using on-board far-ranging active sensors.
In a second step, the threat concerning detection and localisation of the vehicle due to the use of active sensing is assessed using available information of a superordinate control station, data from in-vehicle sensors as well as all other available and observable data. The outcome of this step is a classification into a case with little risk (phase 1) and a case with significant risk for the vehicle (phase 2).
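The patent does not prescribe how this classification is computed. As a minimal illustrative sketch (the indicator names, the [0, 1] scoring and the threshold are assumptions, not from the source), the phase decision could be a threshold over the worst of several threat indicators:

```python
def classify_phase(indicators, threshold=0.5):
    """Classify into phase 1 (little risk, active sensing acceptable)
    or phase 2 (significant risk, drones required).

    indicators: dict mapping a source name (superordinate control
    station, in-vehicle sensors, other observable data) to a threat
    score in [0, 1]. Taking the max is a conservative fusion rule:
    a single alarming source is enough to trigger phase 2.
    """
    risk = max(indicators.values())
    return 2 if risk >= threshold else 1
```

For example, `classify_phase({"control_station": 0.8, "in_vehicle": 0.1})` would place the vehicle in phase 2 and trigger the drone-based procedure below.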
In a third step (for phase 2 only), the vehicle is prepared for switching off the active sensing devices, i.e. the on-board far-ranging active sensors and communication are deactivated.
In a fourth step, a set of drones (minimum of three) is launched from the vehicle. The number of drones depends on the vehicle's driving speed, the (topology of the) environment (e.g. urban area, mountain area, etc.), the number of potential tracks to be scouted and the degree of necessity to conceal the vehicle's position (i.e. additional drones at various random positions around the vehicle are used to distract from the present vehicle position).
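The patent names the factors but not their weighting; the thresholds and increments in this sketch are invented purely for illustration:

```python
def drone_count(speed_kmh, environment, n_tracks, conceal_level):
    """Heuristic size of the launched drone set.

    speed_kmh:     vehicle driving speed
    environment:   e.g. "urban" or "mountain" (topology of the area)
    n_tracks:      number of potential tracks to be scouted
    conceal_level: number of extra decoy drones placed at random
                   positions to distract from the vehicle position
    """
    n = 3                                 # minimum set per the method
    if speed_kmh > 60:
        n += 1                            # faster vehicle, longer lookahead
    if environment in ("urban", "mountain"):
        n += 1                            # occluded topology needs more coverage
    n += max(0, n_tracks - 1)             # at least one drone per extra track
    n += conceal_level                    # decoys around the vehicle
    return n
```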
In a fifth step, the drones establish an ad-hoc network. Based on automated image analysis (e.g. the least number of detected potential risks for each drone), the least threatened drone takes over the role of a temporary master drone. The role of the master drone can be switched at certain time steps; this hopping procedure makes it harder to detect the most relevant master drone. To keep communication bandwidth low, it can be arranged that only the current master drone transmits selected information to the vehicle.
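The election and hopping rules can be sketched as follows; the deterministic tie-break and the fixed hopping interval are assumptions added so that every drone reaches the same decision independently, not details from the source:

```python
def elect_master(risk_counts):
    """Elect the least-threatened drone as temporary master.

    risk_counts: dict mapping drone id to the number of potential
    risks detected in that drone's automated image analysis. Ties
    are broken by drone id so the result is deterministic.
    """
    return min(risk_counts, key=lambda d: (risk_counts[d], d))


def hop_master(risk_counts, step, interval=10):
    """Rotate the master role every `interval` time steps ("hopping"),
    cycling through drones ordered by threat level so that no single
    drone remains master long enough to be singled out."""
    ranking = sorted(risk_counts, key=lambda d: (risk_counts[d], d))
    return ranking[(step // interval) % len(ranking)]
```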
In a sixth step, all drones use their far-ranging active sensors to sense the environment in the relevant directions. During this action, the position of the vehicle is observed by at least three drones to estimate the vehicle position with minimum uncertainty based on the triangulation principle. If more than one path for the routing of the vehicle is possible and suitable, drones follow these potential individual paths (at least one drone per path) in advance of the vehicle to obtain the best possible scene description with long-ranging sensors and cameras and thus reduce the risks for the vehicle.
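The patent only states that three or more drones triangulate the vehicle with minimum uncertainty. One concrete sketch, assuming range measurements rather than camera bearings and with all names invented here, is linearized trilateration in the ground plane:

```python
import math

def trilaterate(p, d):
    """Ground-plane vehicle position from three drone positions and
    measured ranges, via linearized trilateration.

    p: three (x, y) drone positions (e.g. GPS projected to a local plane)
    d: the three measured drone-to-vehicle distances
    Subtracting the first circle equation |v - p0|^2 = d0^2 from the
    other two yields a 2x2 linear system A @ v = b, solved here by
    Cramer's rule.
    """
    (x0, y0), (x1, y1), (x2, y2) = p
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = d[0]**2 - d[1]**2 + x1**2 - x0**2 + y1**2 - y0**2
    b2 = d[0]**2 - d[2]**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a11 * a22 - a12 * a21   # zero iff the drones are collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```

The collinearity caveat in the comment is why the drones must hold spatially spread positions: the geometry, not just the count, determines the uncertainty of the estimate.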
In a seventh step, information on the exact vehicle position as well as relevant extracted information on e.g. potential risks (mines, roadside bombs, obstacles, persons, other vehicles, etc.) is exchanged between the drones in an encrypted manner and transmitted in a processed and compressed protocol to the vehicle. All relevant information and features extracted on drone level are exchanged within the ad-hoc network, which hence provides redundancy against the failure or breakdown of any single drone.
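The source specifies encrypted, compressed exchange but no particular protocol. A minimal sketch of the framing, using only standard-library compression and message authentication (a cipher is deliberately omitted, and the function names and pre-shared key are assumptions):

```python
import hashlib
import hmac
import json
import zlib

def pack_report(report, key):
    """Compress a drone report and append an HMAC-SHA256 tag so the
    receiver can detect tampering or corruption in transit."""
    payload = zlib.compress(json.dumps(report, sort_keys=True).encode())
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return payload + tag

def unpack_report(blob, key):
    """Verify the 32-byte tag, then decompress and decode the report.
    Raises ValueError if authentication fails."""
    payload, tag = blob[:-32], blob[-32:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        raise ValueError("report failed authentication")
    return json.loads(zlib.decompress(payload))
```

Because every drone exchanges the same packed reports within the ad-hoc network, any surviving drone can reconstruct the full picture, which is the redundancy property the step describes.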
Steps six and seven are repeated continuously.
In an eighth step, a control signal is sent from the vehicle to the master drone to initiate a change of phase (especially from the phase 2 risk level back to phase 1) or other relevant actions (e.g. the return of some or all drones to the vehicle). This control signal is preferably transmitted via a laser-based optical communication channel directed from the vehicle to the master drone; the focused optical path reduces the risk of detection and localisation as far as possible.
The present invention also comprises a device for selective active/passive sensing in automated driving applications. This device comprises:
- A vehicle able to carry a set of drones (minimum of three). The drones can be launched from the vehicle and can also return and be received by the vehicle on demand. The vehicle is equipped with a communication module to receive encrypted information transmitted wirelessly (radio signal or optical communication path) from the drones and to send control signals to at least one of the drones (the master drone).
- A set of at least three drones that are battery- or fuel-powered and comprise at least one camera to observe the position of the vehicle and the environment, a GPS-based localisation unit, far-ranging active sensors such as lidar or radar for environment sensing, a computation unit in a data processing module to run data analysis and feature extraction (including artificial intelligence algorithms and processes to determine information), and a communication module to send and receive encrypted information wirelessly (radio signal or optical communication path) to/from other drones, to send information to the vehicle and to receive a control signal from the vehicle.
The invention comprises a launch platform for the drones and can comprise a quick-launch platform, where drones are shot into the air by means of compressed air, a catapult or a propellant, e.g. by using a set of pipes to accelerate the drones inside a ballistic hood through the pipe setup.
Each of the drones comprises several modules:
1. A communication module to receive and transmit signals wirelessly.
2. A sensing module with at least a GPS-based position sensing unit, at least one camera (either fixed or controllable in various directions) to take video images of the vehicle and the environment, and at least one far-ranging active sensor to measure distance profiles (such as sensors based on radar, lidar, time-of-flight camera, etc.). Information and control signals to control and adjust the sensors of the sensing module can be received from the vehicle or other drones via the communication module.
3. A data processing module with on-board computing capacity to extract features and information, fuse relevant sensor data, and provide data analytics and data processing to obtain environmental data and data on the vehicle. The data processing module receives data from the sensing module and feeds back control signals to adjust its sensors. Processed and prepared data is transmitted via the communication module.
4. A flight control module, which receives control signals from the vehicle or other drones directly, or derives flight control information from processed sensing data (e.g. flight manoeuvres to follow the vehicle's position and not lose sight of the vehicle).
5. A drive and power module to operate and power the propulsion of the drones.
For a better understanding of the invention, figures are given. These figures show:
Fig. 1: a flow diagram of the presented method
Fig. 2: a possible embodiment of the invention where vehicle on-board far-ranging sensors are used to operate it in highly automated and autonomous driving mode and where the drones are carried by the vehicle (phase 1)
Fig. 3: a possible embodiment of the invention where all active far-ranging sensors and the communication module in the vehicle are deactivated and a set of drones in an ad-hoc network observe the vehicle position and environment and transmit relevant information to the vehicle as a passive receiving unit (phase 2)
Fig. 4: a possible embodiment of the invention where the vehicle sends a control signal to one (master drone) or all of the drones
Fig. 5: a possible embodiment of the invention where drones follow possible planned routes ahead of the vehicle to reduce the risk for the vehicle
Fig. 6: an overview of the modules of an individual drone with the interaction of modules
GB1901102.2A 2019-01-28 2019-01-28 Method and device for selective active and passive sensing in automated driving applications Withdrawn GB2580908A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1901102.2A GB2580908A (en) 2019-01-28 2019-01-28 Method and device for selective active and passive sensing in automated driving applications

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1901102.2A GB2580908A (en) 2019-01-28 2019-01-28 Method and device for selective active and passive sensing in automated driving applications

Publications (2)

Publication Number Publication Date
GB201901102D0 GB201901102D0 (en) 2019-03-13
GB2580908A true GB2580908A (en) 2020-08-05

Family

ID=65655934

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1901102.2A Withdrawn GB2580908A (en) 2019-01-28 2019-01-28 Method and device for selective active and passive sensing in automated driving applications

Country Status (1)

Country Link
GB (1) GB2580908A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116171962B (en) * 2023-03-23 2024-03-08 广东省农业科学院植物保护研究所 Efficient targeted spray regulation and control method and system for plant protection unmanned aerial vehicle

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018009190A1 (en) * 2016-07-07 2018-01-11 Ford Global Technologies, Llc Vehicle-integrated drone
GB2559753A (en) * 2017-02-16 2018-08-22 Continental Automotive Gmbh Fusion of images from drone and vehicle
WO2018156139A1 (en) * 2017-02-24 2018-08-30 Ford Global Technologies, Llc Drone-based tracking



Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)