
WO2018184963A3 - Direct vehicle detection as 3D bounding boxes using neural network image processing - Google Patents

Direct vehicle detection as 3D bounding boxes using neural network image processing

Info

Publication number
WO2018184963A3
Authority
WO
WIPO (PCT)
Prior art keywords
neural network
image processing
vehicle detection
bounding boxes
input image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2018/058033
Other languages
French (fr)
Other versions
WO2018184963A2 (en)
Inventor
Karsten Behrendt
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Priority to EP18714245.0A priority Critical patent/EP3607489B1/en
Priority to US16/500,155 priority patent/US11216673B2/en
Priority to CN201880036861.1A priority patent/CN110678872A/en
Priority to KR1020197029041A priority patent/KR102629651B1/en
Publication of WO2018184963A2 publication Critical patent/WO2018184963A2/en
Publication of WO2018184963A3 publication Critical patent/WO2018184963A3/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/64Three-dimensional objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/24Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00Conjoint control of vehicle sub-units of different type or different function
    • B60W10/18Conjoint control of vehicle sub-units of different type or different function including control of braking systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00Conjoint control of vehicle sub-units of different type or different function
    • B60W10/20Conjoint control of vehicle sub-units of different type or different function including control of steering systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/14Adaptive cruise control
    • B60W30/143Speed control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18Propelling the vehicle
    • B60W30/18009Propelling the vehicle related to particular drive situations
    • B60W30/18163Lane change; Overtaking manoeuvres
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0223Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/217Validation; Performance evaluation; Active pattern learning techniques
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0464Convolutional networks [CNN, ConvNet]
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/09Supervised learning
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/091Active learning
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/22Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G06V10/235Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition based on user input or interaction
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G06V10/449Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
    • G06V10/451Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters with interaction between the filter responses, e.g. cortical complex cells
    • G06V10/454Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/20Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Transportation (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Automation & Control Theory (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Human Computer Interaction (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

Systems and methods of detecting and tracking one or more vehicles in a field of view of an imaging system using neural network processing. An electronic controller receives an input image from a camera mounted on the host vehicle. The electronic controller applies a neural network configured to output a definition of a three-dimensional bounding box based at least in part on the input image. The three-dimensional bounding box indicates a size and a position of a detected vehicle in the field of view of the input image. The three-dimensional bounding box includes a first quadrilateral shape outlining a rear or front of the detected vehicle and a second quadrilateral shape outlining a side of the detected vehicle.
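The abstract describes the network's output as two adjoined quadrilaterals in image space: one outlining the rear or front face of the detected vehicle, one outlining its visible side. As an illustration only (the patent does not specify a data format; all class names, conventions, and coordinates below are hypothetical), a minimal Python sketch of such a representation might look like:

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in image pixel coordinates

@dataclass
class Quadrilateral:
    """Four corner points of one face, in image coordinates."""
    corners: List[Point]

@dataclass
class BoundingBox3D:
    """A 3D bounding box expressed as two adjoined quadrilaterals."""
    face: Quadrilateral  # outlines the rear or front of the detected vehicle
    side: Quadrilateral  # outlines the visible side of the detected vehicle

    def image_extent(self) -> Tuple[Point, Point]:
        """Axis-aligned 2D extent covering both quadrilaterals:
        a rough proxy for the detection's position and size in the image."""
        pts = self.face.corners + self.side.corners
        xs, ys = zip(*pts)
        return (min(xs), min(ys)), (max(xs), max(ys))

# Example: a vehicle seen from behind and to the left,
# so the rear face and the left side are both visible.
box = BoundingBox3D(
    face=Quadrilateral([(120, 80), (180, 80), (180, 140), (120, 140)]),
    side=Quadrilateral([(60, 90), (120, 80), (120, 140), (60, 130)]),
)
print(box.image_extent())  # ((60, 80), (180, 140))
```

Compared with a single axis-aligned 2D box, the two-quadrilateral form preserves the vehicle's orientation: the shared edge between the two faces marks the vehicle's nearest vertical corner, from which depth and heading can be estimated.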
PCT/EP2018/058033 2017-04-04 2018-03-29 Direct vehicle detection as 3d bounding boxes using neural network image processing Ceased WO2018184963A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP18714245.0A EP3607489B1 (en) 2017-04-04 2018-03-29 Direct vehicle detection as 3d bounding boxes using neural network image processing
US16/500,155 US11216673B2 (en) 2017-04-04 2018-03-29 Direct vehicle detection as 3D bounding boxes using neural network image processing
CN201880036861.1A CN110678872A (en) 2017-04-04 2018-03-29 Direct vehicle detection as a 3D bounding box by using neural network image processing
KR1020197029041A KR102629651B1 (en) 2017-04-04 2018-03-29 Direct vehicle detection with 3D bounding boxes using neural network image processing

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762481346P 2017-04-04 2017-04-04
US62/481,346 2017-04-04

Publications (2)

Publication Number Publication Date
WO2018184963A2 WO2018184963A2 (en) 2018-10-11
WO2018184963A3 (en) 2018-12-20

Family

ID=61827750

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/058033 Ceased WO2018184963A2 (en) 2017-04-04 2018-03-29 Direct vehicle detection as 3d bounding boxes using neural network image processing

Country Status (5)

Country Link
US (1) US11216673B2 (en)
EP (1) EP3607489B1 (en)
KR (1) KR102629651B1 (en)
CN (1) CN110678872A (en)
WO (1) WO2018184963A2 (en)

Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102018206751A1 * 2018-05-02 2019-11-07 Continental Automotive Gmbh Contour recognition of a vehicle based on measurement data of an environment sensor system
US11176415B2 (en) * 2018-05-09 2021-11-16 Figure Eight Technologies, Inc. Assisted image annotation
US12277406B2 (en) 2018-08-10 2025-04-15 Nvidia Corporation Automatic dataset creation using software tags
US11816585B2 (en) * 2018-12-03 2023-11-14 Tesla, Inc. Machine learning models operating at different frequencies for autonomous vehicles
JP2022034086A (en) * 2018-12-07 2022-03-03 ソニーセミコンダクタソリューションズ株式会社 Information processing apparatus, information processing method, and program
US11494979B2 (en) 2019-01-04 2022-11-08 Qualcomm Incorporated Bounding box estimation and lane vehicle association
CN113811886B (en) * 2019-03-11 2024-03-19 辉达公司 Intersection detection and classification in autonomous machine applications
DE102019204139A1 (en) 2019-03-26 2020-10-01 Robert Bosch Gmbh Training for artificial neural networks with better utilization of the learning data sets
EP3716137A1 (en) * 2019-03-27 2020-09-30 Visteon Global Technologies, Inc. Systems and methods for estimating the position of a target vehicle
US11823460B2 (en) * 2019-06-14 2023-11-21 Tusimple, Inc. Image fusion for autonomous vehicle operation
US20210027546A1 (en) * 2019-07-22 2021-01-28 Scale AI, Inc. Techniques for labeling cuboids in point cloud data
US20220262142A1 (en) * 2019-08-14 2022-08-18 Intel Corporation Automatic generation of 3d bounding boxes from multi-camera 2d image data
US12347216B2 (en) * 2019-09-12 2025-07-01 Koninklijke Philips N.V. Interactive endoscopy for intraoperative virtual annotation in vats and minimally invasive surgery
JP7139300B2 (en) * 2019-10-03 2022-09-20 本田技研工業株式会社 Recognition device, recognition method, and program
KR102871465B1 (en) * 2020-01-02 2025-10-16 엘지전자 주식회사 Enhancing performance of local device
DE102020103741A1 (en) * 2020-02-13 2021-08-19 Car.Software Estonia As Method for spatial characterization of at least one vehicle image
JP7495178B2 (en) * 2020-04-14 2024-06-04 株式会社Subaru Vehicle driving support device
KR102270198B1 (en) * 2020-06-08 2021-06-28 주식회사 에스아이에이 Method for object detection based on anchor-free rpn
US11798210B2 (en) * 2020-12-09 2023-10-24 Salesforce, Inc. Neural network based detection of image space suitable for overlaying media content
CN112633258B (en) * 2021-03-05 2021-05-25 天津所托瑞安汽车科技有限公司 Target determination method and device, electronic equipment and computer readable storage medium
RU2767831C1 (en) * 2021-03-26 2022-03-22 Общество с ограниченной ответственностью "Яндекс Беспилотные Технологии" Methods and electronic devices for detecting objects in the environment of an unmanned vehicle
EP4102466A4 (en) * 2021-04-26 2023-05-17 Beijing Baidu Netcom Science Technology Co., Ltd. OBJECT COLLISION DETECTION METHOD AND APPARATUS, DEVICE, AND STORAGE MEDIA
DE102021205094A1 (en) 2021-05-19 2022-11-24 Robert Bosch Gesellschaft mit beschränkter Haftung Quality check of training data for image classifiers
DE102021205271A1 (en) 2021-05-21 2022-11-24 Robert Bosch Gesellschaft mit beschränkter Haftung Quality check of training data for classification models for semantic segmentation of images
CN113435318B (en) * 2021-06-25 2025-02-25 上海商汤临港智能科技有限公司 Neural network training, image detection, driving control method and device
DE102021118065A1 (en) 2021-07-13 2023-01-19 Connaught Electronics Ltd. Method for generating three-dimensional information in a three-dimensional environment, computer program product, computer-readable storage medium and assistance system
US12043278B2 (en) * 2021-07-23 2024-07-23 Rivian Ip Holdings, Llc Systems and methods for determining drivable space
US11663807B2 (en) 2021-08-05 2023-05-30 Ford Global Technologies, Llc Systems and methods for image based perception
EP4131174A1 (en) * 2021-08-05 2023-02-08 Argo AI, LLC Systems and methods for image based perception
US11966452B2 (en) 2021-08-05 2024-04-23 Ford Global Technologies, Llc Systems and methods for image based perception
US12304515B2 (en) 2022-06-14 2025-05-20 Ford Global Technologies, Llc Vehicle path adjustment
WO2023245635A1 (en) * 2022-06-24 2023-12-28 Intel Corporation Apparatus and method for object detection
US12499657B2 (en) * 2022-09-26 2025-12-16 Micron Technology, Inc. Video stream augmentation using a deep learning device
JP2024109318A (en) * 2023-02-01 2024-08-14 トヨタ自動車株式会社 Driving Support Devices
EP4411670A1 (en) * 2023-02-03 2024-08-07 Aptiv Technologies AG Data structure for efficient training of semantic segmentation models
KR102870700B1 (en) * 2023-10-26 2025-10-14 주식회사 스카이오토넷 Cms camera device in large commercial vehicles and method for operating thereof
US12322162B1 (en) * 2024-05-09 2025-06-03 Geotab Inc. Systems and methods for training vehicle collision and near-miss detection models

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130322691A1 (en) * 2012-06-01 2013-12-05 Ricoh Company, Ltd. Target recognition system and target recognition method executed by the target recognition system
US8874267B1 (en) * 2012-06-20 2014-10-28 Google Inc. Avoiding blind spots of other vehicles
US20160054452A1 (en) * 2014-08-20 2016-02-25 Nec Laboratories America, Inc. System and Method for Detecting Objects Obstructing a Driver's View of a Road
US20160125249A1 (en) * 2014-10-30 2016-05-05 Toyota Motor Engineering & Manufacturing North America, Inc. Blur object tracker using group lasso method and apparatus
WO2017027030A1 (en) * 2015-08-12 2017-02-16 Hewlett Packard Enterprise Development Lp Retraining a machine classifier based on audited issue data

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8842163B2 (en) 2011-06-07 2014-09-23 International Business Machines Corporation Estimation of object properties in 3D world
US9466215B2 (en) * 2012-03-26 2016-10-11 Robert Bosch Gmbh Multi-surface model-based tracking
US9070202B2 (en) 2013-03-14 2015-06-30 Nec Laboratories America, Inc. Moving object localization in 3D using a single camera
US9396553B2 (en) 2014-04-16 2016-07-19 Xerox Corporation Vehicle dimension estimation from vehicle images
US10410096B2 (en) 2015-07-09 2019-09-10 Qualcomm Incorporated Context-based priors for object detection in images
US10029622B2 (en) 2015-07-23 2018-07-24 International Business Machines Corporation Self-calibration of a static camera from vehicle information
US10424064B2 (en) * 2016-10-18 2019-09-24 Adobe Inc. Instance-level semantic segmentation system


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS: "Direct manipulation interface", 27 December 2016 (2016-12-27), XP002781196, Retrieved from the Internet <URL:https://en.wikipedia.org/w/index.php?title=Direct_manipulation_interface&oldid=756947614> [retrieved on 20180518] *
ARSALAN MOUSAVIAN ET AL: "3D Bounding Box Estimation Using Deep Learning and Geometry", ARXIV.ORG, 1 December 2016 (2016-12-01), pages 1 - 9, XP055474731, Retrieved from the Internet <URL:https://arxiv.org/pdf/1612.00496v1.pdf> [retrieved on 20180514] *
BO LI: "3D Fully Convolutional Network for Vehicle Detection in Point Cloud", 16 January 2017 (2017-01-16), XP055474739, Retrieved from the Internet <URL:https://arxiv.org/pdf/1611.08069.pdf> [retrieved on 20180514] *
XIAOZHI CHEN ET AL: "Multi-view 3D Object Detection Network for Autonomous Driving", ARXIV.ORG, 23 November 2016 (2016-11-23), XP055474745, Retrieved from the Internet <URL:https://arxiv.org/pdf/1611.07759v1.pdf> [retrieved on 20180514] *

Also Published As

Publication number Publication date
EP3607489A2 (en) 2020-02-12
KR20190132404A (en) 2019-11-27
US20200349365A1 (en) 2020-11-05
US11216673B2 (en) 2022-01-04
WO2018184963A2 (en) 2018-10-11
CN110678872A (en) 2020-01-10
KR102629651B1 (en) 2024-01-29
EP3607489B1 (en) 2023-05-24

Similar Documents

Publication Publication Date Title
WO2018184963A3 (en) Direct vehicle detection as 3d bounding boxes using neural network image processing
CA2975139C (en) Stereo camera system for collision avoidance during aircraft surface operations
CN106462996B (en) Method and device for displaying vehicle surrounding environment without distortion
US9102269B2 (en) Field of view matching video display system
WO2020146491A3 (en) Using light detection and ranging (lidar) to train camera and imaging radar deep learning networks
WO2020056431A8 (en) System and method for three-dimensional (3d) object detection
CA3027899C (en) Ground plane detection for placement of augmented reality objects
EP4266085A3 (en) Fusion-based object tracker using lidar point cloud and surrounding cameras for autonomous vehicles
EP3444748A3 (en) Automated detection and avoidance system
EP3293488A3 (en) System and method of simulataneously generating a multiple lane map and localizing a vehicle in the generated map
CN105718853B (en) Obstacle detection device and obstacle detection method
KR20190095592A (en) Method and Apparatus for Vehicle Detection Using Lidar Sensor and Camera
MY165967A (en) Moving-body-detecting device and moving-body-detecting system
CA3155737A1 (en) Systems and methods for providing for the processing of objects in vehicles
WO2015177643A3 (en) Systems and methods for braking a vehicle based on a detected object
EP2937757A3 (en) Methods and systems for object detection using multiple sensors
EP3103695A3 (en) Driver assistance apparatus and control method for the same
MX2017012837A (en) Rear obstacle detection and distance estimation.
EP3163506A1 (en) Method for stereo map generation with novel optical resolutions
JP2016530639A5 (en)
MX350354B (en) Three-dimensional object detection device.
EP4420930A3 (en) Image processing device, image processing method, and image processing system
WO2017158167A3 (en) A computer implemented method and systems for tracking an object in a 3d scene
WO2004028169A3 (en) Stereo night vision system for vehicles
MX344875B (en) Three-dimensional object detection device.

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18714245

Country of ref document: EP

Kind code of ref document: A2

ENP Entry into the national phase

Ref document number: 20197029041

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018714245

Country of ref document: EP

Effective date: 20191104