
GB2559753A - Fusion of images from drone and vehicle - Google Patents


Info

Publication number
GB2559753A
GB2559753A GB1702515.6A GB201702515A GB2559753A GB 2559753 A GB2559753 A GB 2559753A GB 201702515 A GB201702515 A GB 201702515A GB 2559753 A GB2559753 A GB 2559753A
Authority
GB
United Kingdom
Prior art keywords
drone
vehicle
image data
control unit
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1702515.6A
Other versions
GB201702515D0 (en)
Inventor
Kalyazin Nikita
Riley Tom
Pandey Abhinav
Plowman Robin
Sun Duo
Jin Yonggang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Automotive GmbH
Original Assignee
Continental Automotive GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive GmbH filed Critical Continental Automotive GmbH
Priority to GB1702515.6A priority Critical patent/GB2559753A/en
Publication of GB201702515D0 publication Critical patent/GB201702515D0/en
Publication of GB2559753A publication Critical patent/GB2559753A/en
Withdrawn legal-status Critical Current


Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/09: Arrangements for giving variable traffic instructions
    • G08G 1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G 1/0967: Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G 1/096766: Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 10/00: Type of UAV
    • B64U 10/10: Rotorcrafts
    • B64U 10/13: Flying platforms
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/16: Anti-collision systems
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/16: Anti-collision systems
    • G08G 1/161: Decentralised systems, e.g. inter-vehicle communication
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/16: Anti-collision systems
    • G08G 1/165: Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/16: Anti-collision systems
    • G08G 1/166: Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2101/00: UAVs specially adapted for particular uses or applications
    • B64U 2101/30: UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B64: AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U: UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2201/00: UAVs characterised by their flight controls
    • B64U 2201/20: Remote controls

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Traffic Control Systems (AREA)

Abstract

A driver assistance system (100) comprises a drone (110), which communicates wirelessly with a communication unit (120), and a control unit (130) which is configured to provide information to a driver of a vehicle based on information acquired by the drone about a vicinity of the vehicle. The system may comprise a camera (140, 150) for each of the drone and vehicle, and the control unit may generate a fused surround view image using the image data obtained by these cameras (Fig. 2). The fused image may be generated in real time and as the drone flies. The image data may be combined to give a three-dimensional image. The control unit may be configured to change the perspective of the fused image, or may be configured to select a sub-set of the image data received from the drone for data fusion.

Description

(54) Title of the Invention: Fusion of images from drone and vehicle
Abstract Title: A driver assistance system using a drone to transmit information about the vicinity of a vehicle, and which may fuse image data from the drone and the vehicle (57) A driver assistance system (100) comprises a drone (110), which communicates wirelessly with a communication unit (120), and a control unit (130) which is configured to provide information to a driver of a vehicle based on information acquired by the drone about a vicinity of the vehicle. The system may comprise a camera (140, 150) for each of the drone and vehicle, and the control unit may generate a fused surround view image using the image data obtained by these cameras (Fig. 2). The fused image may be generated in real time and as the drone flies.
The image data may be combined to give a three-dimensional image. The control unit may be configured to change the perspective of the fused image, or may be configured to select a sub-set of the image data received from the drone for data fusion.
Fig. 1 (driver assistance system 100; control unit 130)
[Drawings: Figures GB2559753A_D0001 to GB2559753A_D0007]
Application No. GB1702515.6
RTM
Date: 9 August 2017
Intellectual
Property
Office
The following terms are registered trade marks and should be read as such wherever they occur in this document:
WiFi
Zigbee
Bluetooth
WiMax
UMTS
LTE
Galileo
Intellectual Property Office is an operating name of the Patent Office www.gov.uk/ipo
Fusion of images from drone and vehicle
The invention relates to a driver assistance system for acquiring vicinity information of a vehicle by a drone, a vehicle comprising the driver assistance system, a method, a program element, and a computer-readable medium.
Since the beginning of the development of motor vehicles, systems for assisting the driver have been developed. Important milestones include ABS, parking assist and cruise control. Especially in the last ten years, numerous driver assistance systems have been developed. During a parking procedure, driver assistance systems can support the driver, so that the vehicle does not get damaged or even parks completely autonomously. One such system can generate a surround view of the car, so that the driver can see the vicinity of the vehicle. The driver may thereby also see obstacles which he would not have seen by looking through the windows of the vehicle.
It would be advantageous to receive detailed information about the vicinity of a vehicle.
The object of the present invention is solved with the subject matter of the independent claims, wherein further embodiments are incorporated in the dependent claims.
A first aspect of the invention relates to a driver assistance system for a vehicle. The driver assistance system comprises a drone configured to acquire information about a vicinity of the vehicle, a communication unit configured to receive the information from the drone via a wireless communication link, and a control unit configured to provide assistance information to the driver of the vehicle based on the information acquired by the drone.
Thus, the driver assistance system includes a separate, external device, i.e. a drone. The drone may be configured to fly over, above, in front of or next to the vehicle. Further, the drone may be configured to acquire vicinity information of the vehicle. The vicinity information may comprise, for example, images, infrared images, radar images, distances, parking spots, obstacles and/or traffic information. The drone may be able to send the acquired information to the vehicle, and may even send it to several different vehicles simultaneously. A communication unit of the driver assistance system may be configured to receive the acquired vicinity information from the drone via a wireless communication link. The communication unit may be located in the vehicle and may be connected to a control unit of the driver assistance system. The control unit may be configured to provide assistance information to the driver of the vehicle based on the vicinity information acquired by the drone. The control unit may further be configured to fuse information acquired by the drone with information acquired by one or more sensors of the vehicle. According to another embodiment of the invention, the drone may be configured to receive the information acquired by the sensors of the vehicle and to fuse it with the information acquired by the drone itself. It may also be possible that the control unit of the vehicle fuses the information acquired by the drone with that acquired by the sensors of the vehicle and sends this fused information to the drone. The control unit of the driver assistance system and the communication unit may be configured to receive vicinity information from several different drones, so that the control unit can provide a surround view image based on vicinity information acquired by more than one drone.
Furthermore, the drone may be configured to navigate and determine its location based on the acquired vicinity information of the vehicle. This may be advantageous if positioning via a global positioning system (GPS) is not accurate. The positioning of the vehicle may likewise be improved by the information acquired by the drone. The control unit may be configured to control the flight of the drone, such that the control unit directs the drone to fly to a certain area, e.g. to find a parking spot, to determine the length and/or the cause of a traffic jam, or to fly to a specific obstacle to acquire more information about it.
A drone, in a technological context, is an unmanned aircraft. Drones are more formally known as unmanned aerial vehicles (UAVs) or unmanned aircraft systems (UASes). Essentially, a drone is a flying robot. The aircraft may be remotely controlled, or may fly autonomously through software-controlled flight plans in their embedded systems, working in conjunction with onboard sensors and GPS.
The wireless transmission of the vicinity information may be carried out via Bluetooth, WiFi (e.g. WiFi 802.11a/b/g/n or WiFi 802.11p), ZigBee or WiMax, or cellular radio systems such as GPRS, UMTS or LTE. It is also possible to use other transmission protocols. The mentioned protocols offer the advantage of standardization.
It should be noted that, in the context of the present invention, GPS is representative of all Global Navigation Satellite Systems (GNSS), e.g. GPS, Galileo, GLONASS (Russia), Compass (China) or IRNSS (India).
According to an embodiment of the invention the driver assistance system further comprises at least one camera for the vehicle configured to acquire first image data and at least one camera for the drone, wherein the information acquired by the drone comprises second image data. The control unit is configured to fuse the first image data with the second image data, to receive a fused surround view image.
Thus, according to this embodiment the drone may comprise a camera which provides image data, and the vehicle may also comprise a camera to acquire image data. The images from the camera of the vehicle may be termed first image data, and the images from the camera of the drone may be termed second image data. The control unit may be configured to fuse the first image data with the second image data to obtain a fused surround view image of the vehicle. The vehicle may comprise more than one camera, for example four cameras, so that the vehicle can generate a surround view image. The control unit may be configured to analyse the second image data from the drone and augment the first image data with the second image data or parts thereof. Further, the control unit may be configured to use specific details of the second image data for fusing with the first image data.
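The fusion step described above can be illustrated with a minimal sketch. The patent does not specify an algorithm; one common assumption is that the drone's view can be related to the vehicle's surround-view frame by a planar homography (valid for the ground plane), after which the two images are combined pixel-wise. The function names and the nearest-neighbour sampling are illustrative choices.

```python
import numpy as np

def warp_homography(src, H, out_shape):
    """Inverse-warp image `src` into an output frame via homography H
    (nearest-neighbour sampling; pixels mapping outside `src` stay 0)."""
    h_out, w_out = out_shape
    ys, xs = np.mgrid[0:h_out, 0:w_out]
    pts = np.stack([xs.ravel(), ys.ravel(),
                    np.ones(h_out * w_out)]).astype(float)
    src_pts = np.linalg.inv(H) @ pts   # output pixel -> source pixel
    src_pts /= src_pts[2]
    mx = np.round(src_pts[0]).astype(int)
    my = np.round(src_pts[1]).astype(int)
    valid = (mx >= 0) & (mx < src.shape[1]) & (my >= 0) & (my < src.shape[0])
    out = np.zeros(out_shape, dtype=src.dtype)
    out[ys.ravel()[valid], xs.ravel()[valid]] = src[my[valid], mx[valid]]
    return out

def fuse(first, second_warped, coverage_mask):
    """Keep vehicle-camera pixels where they cover the scene; fill the
    rest of the surround view from the warped drone image."""
    return np.where(coverage_mask, first, second_warped)
```

In a real system the homography would come from the drone's estimated pose relative to the vehicle; here it is simply an input.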
In an embodiment of the invention the control unit may be configured to use image data and other vicinity information acquired by the drone to provide assistance information to the driver of the vehicle, e.g. the second image data together with distance information for a specific obstacle.
According to an embodiment of the invention the control unit is configured to generate the fused surround view image during the flight of the drone, and may even be configured to do this in real time.
Real time is the operation of a computer system in which programs for data processing are constantly ready for operation, such that the processing results are available within a predetermined period of time, for example 1, 10, 20 or 50 ms. Depending on the application, the data may be generated according to a temporally random distribution or at predetermined times.
It may be advantageous if the control unit can provide a surround view image in real time, and if it creates the surround view image based on the second image data even while the drone is flying. The control unit could also be configured to generate a surround view image when the drone is not flying, or to use non-real-time image data acquired by the drone: for example, the drone flies to the beginning of a traffic jam to measure its length, then flies back to the vehicle with the information about the length of the traffic jam, and this information is provided to the driver of the vehicle.
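The real-time requirement stated above (results within a predetermined period, e.g. 50 ms) can be sketched as a per-frame deadline check. The 50 ms budget is taken from the examples in the description; the function names and the drop/flag policy are assumptions, not the patent's method.

```python
import time

DEADLINE_S = 0.050  # example budget from the description: 50 ms

def process_frame(frame, handler, deadline_s=DEADLINE_S):
    """Run `handler` on one frame and report whether the real-time
    budget was met; a real system might drop or degrade late frames."""
    start = time.monotonic()
    result = handler(frame)
    elapsed = time.monotonic() - start
    return result, elapsed <= deadline_s
```

A caller could, for instance, fall back to the last fused image whenever the second element of the returned pair is False.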
According to another embodiment of the invention the control unit is configured to generate a 3D image from the combination of the first image data and the second image data.
Thus, the control unit may be configured to generate a 3D image for the driver of the vehicle. To create the 3D image, the control device may fuse the first image data acquired by the camera of the vehicle and the second image data acquired by the camera of the drone. For an accurate 3D image generation, the control unit may acquire information about the position of the drone and the camera of the drone relative to the camera of the vehicle. A 3D image may be more realistic for the driver, so that the driver can handle the information shown by the driver assistance system in a better way. Therefore, the driver assistance system is more useful and accepted by the driver of the vehicle.
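One standard way (not specified in the patent) to recover 3D structure from the two views is linear triangulation given the relative camera poses, which is why the control unit would need the position of the drone's camera relative to the vehicle's camera. A minimal direct-linear-transform sketch, assuming known 3x4 projection matrices:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover a 3D point from its pixel
    coordinates x1, x2 in two cameras with projection matrices P1, P2."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous 3D point is the null vector of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

With noise-free correspondences this recovers the point exactly; in practice the matches between drone and vehicle images would be noisy and a robust estimator would be layered on top.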
According to an embodiment of the invention the control unit is configured to change the perspective of the fused image.
In another embodiment of the invention the control unit may be configured to change the perspective of the fused surround view image based on the first image data and the second image data. In other words, the content can be shown from the view of a virtual camera location, and thus in a more natural way to the driver, so that the driver can make better use of the content shown by the driver assistance system. The perspective may also be changed dynamically based on the situation and the preferences of the driver.
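A common model for such a perspective change (again, the patent names no specific method) is to rotate a virtual camera about its centre, which transforms the image by the homography H = K R K⁻¹ for intrinsics K and rotation R. A sketch with hypothetical names:

```python
import numpy as np

def virtual_view_homography(K, R):
    """Homography that re-renders a view under a pure rotation R of a
    virtual camera with intrinsic matrix K."""
    return K @ R @ np.linalg.inv(K)

def reproject(pt, H):
    """Map a pixel (u, v) through homography H."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[0] / p[2], p[1] / p[2]
```

For perspective changes that also translate the virtual camera, depth information (e.g. from the 3D fusion above) would be needed rather than a single homography.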
According to another embodiment of the invention the control unit is configured to select a sub-set of the second image data for data fusion.
Thus, the control unit may be configured to analyse the content of the second image data acquired by the camera of the drone. Based on this analysis, the control unit may augment the first image data with specific details from the second image data: for example, an obstacle may be present only in the second image data because it is not in the field of view of the cameras of the vehicle. The control unit may then add this specific obstacle to the first image data, and the first image data with the added obstacle could be displayed to the driver of the vehicle.
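The sub-set selection can be sketched with boolean masks on a common grid: keep only the drone detections that fall outside the vehicle cameras' coverage, and copy only those pixels into the first image data. How detections and coverage are obtained is left open by the patent; the masks here are assumed inputs.

```python
import numpy as np

def select_missing_regions(drone_obstacles, vehicle_coverage):
    """Sub-set of drone detections lying outside the field of view of
    the vehicle's cameras (both given as boolean masks on one grid)."""
    return drone_obstacles & ~vehicle_coverage

def augment(first_image, drone_image, selection):
    """Copy only the selected drone pixels into the vehicle image."""
    out = first_image.copy()
    out[selection] = drone_image[selection]
    return out
```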
According to a second aspect of the invention a vehicle comprising the driver assistance system is provided.
The vehicle may, for example, be a motor vehicle, such as a car, bus, truck or motorcycle, a rail vehicle, a ship or an aircraft, such as a helicopter or an airplane.
According to a third aspect of the invention a method for fusing a surround view image is provided, comprising:
Acquiring vicinity information of a vehicle by a drone;
Transmitting the vicinity information from the drone via a wireless communication link to a communication unit of the vehicle;
Providing assistance information to the driver of the vehicle by a control unit based on the information acquired by the drone.
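The three method steps above can be sketched as a pipeline. The `sense`, `send` and `process` interfaces are hypothetical placeholders for whatever the drone, communication unit and control unit actually implement.

```python
def acquire(drone):
    """Step 1: the drone acquires vicinity information."""
    return drone.sense()

def transmit(info, link):
    """Step 2: the vicinity information is sent over the wireless link
    to the communication unit of the vehicle."""
    return link.send(info)

def assist(info, control_unit):
    """Step 3: the control unit derives assistance information."""
    return control_unit.process(info)

def run(drone, link, control_unit):
    return assist(transmit(acquire(drone), link), control_unit)
```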
According to another aspect, there is provided a computer program element for controlling the driver assistance system as previously described which, when executed by a processing unit, is adapted to perform the method steps as previously described.
There is also provided a computer-readable medium on which the computer program element as previously described is stored.
Advantageously, the benefits provided by any of the above aspects equally apply to all of the other aspects and vice versa.
The above aspects and examples will become apparent from and be elucidated with reference to the embodiments described hereinafter.
Exemplary embodiments will be described in the following with reference to the following drawings:
Fig. 1 shows a schematic set up of the driver assistance system with a drone, according to an embodiment of the invention.
Fig. 2 shows a schematic set up of the field of view of the cameras of the vehicle and of the drone.
Fig. 3 shows a flow chart for a method for a driver assistance system according to an embodiment of the invention.
Fig. 1 shows the driver assistance system 100. The driver assistance system 100 comprises a drone 110, a camera 140 on the drone 110, a control unit 130, a camera 150 on the vehicle 200 and a communication unit 120. The communication unit 120 exchanges vicinity data between the drone 110 and the vehicle 200. The drone 110 is configured to fly over/above, in front of or next to the vehicle 200. Further, the drone 110 is configured to acquire vicinity information of the vehicle 200. This vicinity information may comprise image data, distances, parking spots and/or traffic information. The drone 110 is further configured to send the acquired vicinity information via a wireless communication link to the communication unit 120 of the driver assistance system 100. The wireless communication link may be, for example, WiFi, LTE, UMTS, 3G or Bluetooth. The control unit 130 of the driver assistance system 100 is configured to provide assistance information to the driver of the vehicle 200 based on the information acquired by the drone 110. The information acquired by the drone 110 may help the driver while driving the vehicle 200, i.e. the driver can focus on the street or traffic and the vehicle 200 can be protected from damage. The drone 110 is able to change its height and position during the flight, and is therefore able to change the perspective and the scope of the image data acquired by the camera 140 of the drone 110. According to an embodiment of the invention the drone 110 is configured to fly ahead of the vehicle 200 and therefore sees what is in front of the vehicle 200, e.g. whether a traffic jam is present and its length. In another embodiment of the invention the drone 110 is configured to fly near to or around the vehicle 200 and locate a parking space for the vehicle 200. An advantage of the drone 110 may be that it can see over walls and around corners, i.e. it can fly to a position or change its field of view to get another perspective. Further, the drone 110 may also acquire vicinity information when the vehicle 200 stands still or is blocked in traffic.
The control unit 130 is configured to process the vicinity information acquired by the drone 110 and, based on this information, provide assistance information to the driver of the vehicle 200. Further, the control unit 130 is configured to use data from the vehicle 200 and its sensors together with data acquired by the drone 110 and, based on these data, to provide assistance information to the driver. The control unit 130 may also be configured to fuse the data acquired by the vehicle 200 with the data acquired by the drone 110. In an embodiment of the invention, the vicinity information acquired by the drone 110 comprises image data. With this image data and image data acquired by a camera 150 of the vehicle 200, the control unit 130 of the driver assistance system 100 is configured to generate a surround view image. This surround view image is generated by fusion of the image data acquired by the camera 140 of the drone 110 with the image data acquired by the camera 150 of the vehicle 200. The control unit 130 is further configured to generate the surround view image in real time and while the drone 110 is flying. The control unit 130 is also configured to generate a 3D image based on image data from the camera 150 of the vehicle 200 and image data from the camera 140 of the drone 110. The control unit 130 is further configured to dynamically change the perspective of the resulting image shown to the driver, based on the image data acquired by the camera 150 of the vehicle 200 and the image data acquired by the camera 140 of the drone 110.
According to an embodiment of the invention the drone 110 can be located, positioned and controlled based on the image data acquired by the drone 110. In other words, the position control of the drone 110 can be improved by using the image data captured by the drone, and not only GPS information. The control unit 130 may control the drone 110 by leading it to a specific location and/or area.
In Fig. 2, image data from the camera 140 of the drone 110 and image data from the camera 150 of the vehicle 200 are shown; in particular, the fields of view of the cameras 140, 150 are shown. The image data acquired by the vehicle 200 is generated by four cameras, symbolised by the rectangle within the vehicle 200 in the middle. The image data acquired by the camera 140 of the drone 110 is symbolised by the big rectangle. The control unit 130 is configured to fuse both sets of image data into one fused surround view image and to display the resulting fused surround view image to the driver of the vehicle 200. The advantage of fusing the image data acquired by the camera 140 of the drone 110 with the image data acquired by the camera 150 of the vehicle 200 may be that the driver can be provided with additional information, including vicinity information that can only be acquired from another perspective, in particular a more elevated one, i.e. while flying at a certain height.
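Modelling both fields of view as axis-aligned rectangles, as Fig. 2 does, the extra coverage contributed by the drone can be computed directly. This is only an illustration of the geometric idea; real camera footprints are not rectangular.

```python
def rect_area(r):
    """Area of an axis-aligned rectangle (x0, y0, x1, y1)."""
    return max(0.0, r[2] - r[0]) * max(0.0, r[3] - r[1])

def intersection(a, b):
    """Intersection rectangle of a and b (may be empty/degenerate)."""
    return (max(a[0], b[0]), max(a[1], b[1]),
            min(a[2], b[2]), min(a[3], b[3]))

def added_coverage(drone_fov, vehicle_fov):
    """Extra area the drone camera contributes beyond the vehicle
    cameras, with both fields of view modelled as rectangles."""
    return rect_area(drone_fov) - rect_area(intersection(drone_fov, vehicle_fov))
```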
Fig. 3 shows a schematic flow chart of a method for a driver assistance system. In step 301, vicinity information of a vehicle is acquired by a drone. In step 302, the acquired information is transmitted from the drone to the communication unit of the driver assistance system. In step 303, assistance information, based on the vicinity information acquired by the drone, is provided to the driver of the vehicle.

Claims (10)

Claims
1. Driver assistance system (100) for a vehicle, comprising: a drone (110) configured to acquire information about a vicinity of the vehicle;
a communication unit (120) configured to receive the information from the drone (110) via a wireless communication link;
a control unit (130) configured to provide assistance information to a driver of the vehicle based on the information acquired by the drone (110).
2. Driver assistance system (100) according to claim 1, further comprising:
at least one camera (150) for the vehicle configured to acquire first image data;
at least one camera (140) for the drone (110) configured to acquire second image data;
wherein the control unit (130) is configured to fuse the first image data with the second image data, to receive a fused surround view image.
3. Driver assistance system (100) according to claim 2, wherein the control unit (130) is configured to generate the fused surround view image in real time and during the flight of the drone (110).
4. Driver assistance system (100) according to any one of claims 2 or 3, wherein the control unit (130) is configured to generate a 3D image from the combination of the first image data and the second image data.
5. Driver assistance system (100) according to any one of the preceding claims, wherein the control unit (130) is configured to change the perspective of the fused image.
6. Driver assistance system (100) according to any one of claims 2 to 5, wherein the control unit (130) is configured to select a sub-set of the second image data for data fusion.
7. Vehicle (200) with a driver assistance system (100) according to any one of the preceding claims.
8. Method for a driver assistance system, comprising:
Acquiring (301) vicinity information of a vehicle by a drone;
Transmitting (302) the vicinity information from the drone via a wireless communication link to a communication unit of the vehicle;
Providing (303) assistance information to the driver of the vehicle by a control unit based on the information acquired by the drone.
9. Computer program element for a driver assistance system according to any one of claims 1 to 6 and/or a vehicle according to claim 7, which when executed by a processor causes the driver assistance system to carry out the method according to claim 8.
10. Computer readable medium, on which a computer program element according to claim 9 is stored.
Application No: GB1702515.6 Examiner: Dr Maurice Blount
GB1702515.6A 2017-02-16 2017-02-16 Fusion of images from drone and vehicle Withdrawn GB2559753A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1702515.6A GB2559753A (en) 2017-02-16 2017-02-16 Fusion of images from drone and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1702515.6A GB2559753A (en) 2017-02-16 2017-02-16 Fusion of images from drone and vehicle

Publications (2)

Publication Number Publication Date
GB201702515D0 GB201702515D0 (en) 2017-04-05
GB2559753A true GB2559753A (en) 2018-08-22

Family

ID=58486788

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1702515.6A Withdrawn GB2559753A (en) 2017-02-16 2017-02-16 Fusion of images from drone and vehicle

Country Status (1)

Country Link
GB (1) GB2559753A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110281926A (en) * 2019-06-27 2019-09-27 淮安信息职业技术学院 The method and system of mountain-area road-curve sight information identification and traffic safety early warning
WO2020139582A1 (en) * 2018-12-27 2020-07-02 Continental Automotive Systems, Inc. Surround view by drones
GB2580908A (en) * 2019-01-28 2020-08-05 Kompetenzzentrum Das Virtuelle Fahrzeug Forschungsgesellschaft Mbh Method and device for selective active and passive sensing in automated driving applications
EP3729402A4 (en) * 2019-03-08 2020-11-25 SZ DJI Technology Co., Ltd. METHOD OF SHARING MAPPING DATA BETWEEN AN UNMANNED AIRCRAFT AND A GROUND VEHICLE
CN113696815A (en) * 2021-10-27 2021-11-26 江苏日盈电子股份有限公司 Interaction method and interaction system for multi-rotor unmanned aerial vehicle and vehicle
US11709073B2 (en) 2019-03-08 2023-07-25 SZ DJI Technology Co., Ltd. Techniques for collaborative map construction between an unmanned aerial vehicle and a ground vehicle

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110874578B (en) * 2019-11-15 2023-06-20 北京航空航天大学青岛研究院 Unmanned aerial vehicle visual angle vehicle recognition tracking method based on reinforcement learning

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007135659A2 (en) * 2006-05-23 2007-11-29 Elbit Systems Electro-Optics Elop Ltd. Clustering - based image registration
US20080158256A1 (en) * 2006-06-26 2008-07-03 Lockheed Martin Corporation Method and system for providing a perspective view image by intelligent fusion of a plurality of sensor data
FR2986647A3 (en) * 2012-02-07 2013-08-09 Renault Sas MOTOR VEHICLE ASSOCIATED WITH AN OBSERVATION DRONE
US20140257595A1 (en) * 2013-03-05 2014-09-11 Navteq B.V. Aerial image collection
EP3069995A1 (en) * 2015-03-18 2016-09-21 LG Electronics Inc. Unmanned aerial vehicle and method of controlling the same
US20160282864A1 (en) * 2015-03-24 2016-09-29 Southwest Research Institute Unmanned Ground/Aerial Vehicle System Having Autonomous Ground Vehicle That Remotely Controls One or More Aerial Vehicles
CN106251697A (en) * 2016-10-18 2016-12-21 珠海格力电器股份有限公司 Method, device and system for searching parking space

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007135659A2 (en) * 2006-05-23 2007-11-29 Elbit Systems Electro-Optics Elop Ltd. Clustering - based image registration
US20080158256A1 (en) * 2006-06-26 2008-07-03 Lockheed Martin Corporation Method and system for providing a perspective view image by intelligent fusion of a plurality of sensor data
FR2986647A3 (en) * 2012-02-07 2013-08-09 Renault Sas MOTOR VEHICLE ASSOCIATED WITH AN OBSERVATION DRONE
US20140257595A1 (en) * 2013-03-05 2014-09-11 Navteq B.V. Aerial image collection
EP3069995A1 (en) * 2015-03-18 2016-09-21 LG Electronics Inc. Unmanned aerial vehicle and method of controlling the same
US20160282864A1 (en) * 2015-03-24 2016-09-29 Southwest Research Institute Unmanned Ground/Aerial Vehicle System Having Autonomous Ground Vehicle That Remotely Controls One or More Aerial Vehicles
CN106251697A (en) * 2016-10-18 2016-12-21 珠海格力电器股份有限公司 Method, device and system for searching parking space

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020139582A1 (en) * 2018-12-27 2020-07-02 Continental Automotive Systems, Inc. Surround view by drones
US11577647B2 (en) 2018-12-27 2023-02-14 Continental Autonomous Mobility US, LLC Surround view by drones
GB2580908A (en) * 2019-01-28 2020-08-05 Kompetenzzentrum Das Virtuelle Fahrzeug Forschungsgesellschaft Mbh Method and device for selective active and passive sensing in automated driving applications
EP3729402A4 (en) * 2019-03-08 2020-11-25 SZ DJI Technology Co., Ltd. METHOD OF SHARING MAPPING DATA BETWEEN AN UNMANNED AIRCRAFT AND A GROUND VEHICLE
US11709073B2 (en) 2019-03-08 2023-07-25 SZ DJI Technology Co., Ltd. Techniques for collaborative map construction between an unmanned aerial vehicle and a ground vehicle
US11721225B2 (en) 2019-03-08 2023-08-08 SZ DJI Technology Co., Ltd. Techniques for sharing mapping data between an unmanned aerial vehicle and a ground vehicle
US12222218B2 (en) 2019-03-08 2025-02-11 SZ DJI Technology Co., Ltd. Techniques for collaborative map construction between an unmanned aerial vehicle and a ground vehicle
CN110281926A (en) * 2019-06-27 2019-09-27 淮安信息职业技术学院 The method and system of mountain-area road-curve sight information identification and traffic safety early warning
CN110281926B (en) * 2019-06-27 2020-09-29 淮安信息职业技术学院 Method and system for curve information identification and driving safety early warning of mountain highway
CN113696815A (en) * 2021-10-27 2021-11-26 江苏日盈电子股份有限公司 Interaction method and interaction system for multi-rotor unmanned aerial vehicle and vehicle

Also Published As

Publication number Publication date
GB201702515D0 (en) 2017-04-05

Similar Documents

Publication Publication Date Title
GB2559753A (en) Fusion of images from drone and vehicle
US12485892B2 (en) Method for displaying lane information and apparatus for executing the method
CN111670339B (en) Technology for collaborative map building between unmanned aerial vehicles and ground vehicles
CN111448476B (en) Technology to share mapping data between unmanned aerial vehicles and ground vehicles
US10989562B2 (en) Systems and methods for annotating maps to improve sensor calibration
US10401501B2 (en) Autonomous vehicle sensor calibration system
US10685571B2 (en) Vehicle dispatch system, autonomous driving vehicle, and vehicle dispatch method
CN110268356B (en) Leading unmanned aerial vehicle's system
JP5473304B2 (en) Remote location image display device, remote control device, vehicle control device, remote control system, remote control method, remote control program, vehicle control program, remote location image display method, remote location image display program
US10890928B2 (en) Flying vehicle navigation system and flying vehicle navigation method
US20170166222A1 (en) Assessment of human driving performance using autonomous vehicles
CN112601693B (en) Solution for monitoring and planning the movement of a vehicle
JP2010250478A (en) Driving assistance device
DE112018004691T5 (en) INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING PROCESS, PROGRAM AND MOVING BODY
US9709414B2 (en) Personalized suggestion of automated driving features
US9987927B2 (en) Method for operating a communication device for a motor vehicle during an autonomous drive mode, communication device as well as motor vehicle
US11145112B2 (en) Method and vehicle control system for producing images of a surroundings model, and corresponding vehicle
JP2016045825A (en) Image display system
KR102749769B1 (en) Exposure control device, exposure control method, program, photographing device, and moving body
CN111540192A (en) Control of activation thresholds for vehicle safety systems
WO2023102911A1 (en) Data collection method, data presentation method, data processing method, aircraft landing method, data presentation system and storage medium
US11447154B2 (en) Vehicle travel system
CN118235016A (en) Map features delivered via augmented reality (AR)
WO2019113399A1 (en) Systems and methods for road surface dependent motion planning
CN108521802A (en) Control method of unmanned aerial vehicle, control terminal and unmanned aerial vehicle

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)