WO2014090245A1 - Method and device for trafficability analysis (Verfahren und Vorrichtung zur Befahrbarkeitsanalyse) - Google Patents
Method and device for trafficability analysis
- Publication number
- WO2014090245A1 (PCT/DE2013/200336)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- trafficability
- image data
- vehicle
- segmentation
- pixels
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
Definitions
- The invention relates to a method and a device for trafficability analysis which are particularly suitable for use in driver assistance systems.
- Camera-based driver assistance systems which recognize the course of the vehicle's own lane on the basis of lane markings are now established in the market, and their use is already required by law in certain application areas.
- These driver assistance systems recognize the course of the markings of the vehicle's own lane and of the neighboring lanes and estimate from them the position of the vehicle relative to the lane markings. Unintentional lane departure can thus be detected early, and the system can initiate a suitable reaction, e.g. warning the driver before the departure or preventing it by actuating the steering.
- Driver assistance systems which not only warn of or prevent a lane departure, but are intended to support the driver, for example in an evasive maneuver, need more information about the possible path of the vehicle than is determined by the above-mentioned systems that detect only lane markings. If, for example, the goal of a driver assistance system is to prevent an accident by suitable automatic evasion, such a system requires, in addition to information about the vehicle's own lane, reliable information about whether a possible avoidance path is passable at all, so that the evasion does not damage the vehicle more than the accident would in the case of non-evasion. The determination of such information is referred to herein as trafficability analysis.
- The object of the present invention is to propose a method and a device for trafficability analysis. This object is achieved by the subject matter of the independent claims. Further embodiments of the invention emerge from the dependent claims.
- Image data is understood to mean not only data generated by a camera-based system, but also data from all environment detection systems, for example radar- or lidar-based systems, which can supply data about an environment.
- The detection of different areas is performed on the basis of an estimated ground plane of the environment, whereby computing time can be saved and, as a rule, faster analysis results are obtained.
- Driving activities recognized in the trafficability analysis are taken into account in the different areas, as a result of which a more reliable analysis result can be obtained.
- An embodiment of the invention relates to a trafficability analysis method using a computer, comprising the steps of: receiving image data of an environment in front of a vehicle, analyzing the image data to recognize different areas in an environment image, and analyzing the detected different areas with respect to their trafficability by the vehicle. Analyzing the image data to detect different regions may include the following steps:
- A method based on environment images acquired with a plurality of camera optics or optical paths may be employed, or a method based on recordings with one camera optic at various positions utilizing motion stereo may be employed.
- The segmentation of the estimated ground plane on the relevant pixels can be performed on the basis of color, saturation, intensity and/or texture information.
- The segmentation of the estimated ground plane on the relevant pixels can also be performed on the basis of variance information of the calculated positions of the pixels in space.
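- To make this pixel-restricted segmentation concrete, the following is a minimal sketch in Python. The patent names no concrete algorithm or library; OpenCV, the HSV color features, the two-cluster k-means and all function and parameter names are illustrative assumptions.

```python
import cv2
import numpy as np

def segment_ground_pixels(bgr_image, ground_mask, n_clusters=2):
    """Cluster only the pixels flagged as lying on the estimated ground plane.

    Returns a label image: -1 for non-ground pixels, 0..n_clusters-1 otherwise.
    """
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV).astype(np.float32)
    ys, xs = np.nonzero(ground_mask)            # only ground-plane pixels
    if len(ys) < n_clusters:                    # not enough ground pixels
        return np.full(ground_mask.shape, -1, dtype=np.int32)
    features = hsv[ys, xs]                      # (N, 3): hue, saturation, value
    # k-means on the reduced pixel set, far fewer points than the full image
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, _ = cv2.kmeans(features, n_clusters, None, criteria, 5,
                              cv2.KMEANS_PP_CENTERS)
    label_img = np.full(ground_mask.shape, -1, dtype=np.int32)
    label_img[ys, xs] = labels.ravel()
    return label_img
```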
- The analysis of the detected different areas with regard to their trafficability by the vehicle may include the detection of obstacles, in particular of raised objects, and/or the detection of driving activities.
- A driving activity is understood to mean, in particular, that a vehicle other than one's own is currently driving in an area or has already driven there.
- The driving activity can include information about the direction of travel of the other vehicle, whereby oncoming or cross traffic can be taken into account.
- An area can also be excluded from trafficability. This applies, for example, to an area with oncoming traffic: even if driving there would in principle be possible, actually driving there carries a great risk of a head-on collision.
- The recognition of driving activities may include receiving and evaluating data from a camera-, radar- and/or lidar-based object recognition, and/or receiving and evaluating an object list generated by a camera-, radar- and/or lidar-based object recognition.
- The recognition of driving activities can also comprise a long-term observation of driving activities to increase the recognition reliability, the transfer of a trafficability classification to similar image areas, and/or a dynamic exclusion of trafficability when a danger is detected by the observation of driving activity.
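- As an illustration of how such an object list could be evaluated, here is a minimal sketch; the TrackedObject fields, the area membership-test interface and the oncoming-traffic heuristic are assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    x: float             # position in vehicle coordinates (m)
    y: float
    heading_deg: float   # direction of travel relative to the own vehicle

def mark_driving_activity(areas, objects, oncoming_threshold_deg=120.0):
    """areas: dict mapping an area id to a callable (x, y) -> bool that tests
    membership; returns per-area flags for observed driving activity."""
    activity = {aid: {"driven": False, "oncoming": False} for aid in areas}
    for obj in objects:
        for aid, contains in areas.items():
            if contains(obj.x, obj.y):
                activity[aid]["driven"] = True
                # oncoming traffic: heading roughly opposite to our own
                if abs(obj.heading_deg) > oncoming_threshold_deg:
                    activity[aid]["oncoming"] = True
    return activity
```

- An area flagged as "oncoming" could then be dynamically excluded from trafficability as described above, even though it is geometrically drivable.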
- A further embodiment of the invention relates to a trafficability analysis apparatus using a computer, having first means for receiving image data of an environment in front of a vehicle, second means for analyzing the image data to detect different areas in an environment image, and third means for analyzing the detected different areas with regard to their trafficability by the vehicle.
- The second and third means may each be designed to carry out the corresponding steps of a method according to the invention as described above.
- Another embodiment of the invention relates to a driver assistance system having a device according to the invention as described herein. Further advantages and possible applications of the present invention will become apparent from the following description in conjunction with the embodiments illustrated in the drawings.
- Fig. 1 shows a flow chart of an embodiment of a method for trafficability analysis according to the invention.
- Fig. 2 shows an example of an environment image in front of a vehicle, captured with a digital camera and segmented by a trafficability analysis method according to the invention.
- Fig. 3 shows another example of an environment image in front of a vehicle, captured with a digital camera and segmented by a trafficability analysis method according to the invention.
- Fig. 4 shows a block diagram of an embodiment of an apparatus for trafficability analysis according to the invention.
- The flowchart shown in Fig. 1 represents a program executed by a computer. It serves to analyze image data with regard to trafficability by the vehicle, for example in order to be able to quickly determine a suitable alternative route in the event of an evasive maneuver. The image data are generated, for example, by a stereo-vision camera which captures images of the surroundings in front of a vehicle and may belong to a camera-based driver assistance system.
- In step S10, digital image data of the surroundings in front of the vehicle are received from the stereo-vision camera for the trafficability analysis, for example via a dedicated image data transmission line, a vehicle bus or a radio link.
- The received image data are then analyzed in the subsequent steps S12-S20 to detect different regions in the environment image.
- In steps S22-S24, the different areas identified in the preceding steps are analyzed with regard to their trafficability by the vehicle, and trafficable areas are recognized.
- The areas recognized as passable can then be output for processing by a driver assistance system which is intended to assist the driver in an evasive maneuver and to signal drivable alternative routes to him.
- A drivable area can typically be determined by analyzing changing or consistent textures, e.g. by detecting the transition from an asphalt surface to turf at the roadside.
- However, an assessment of the trafficability of the areas adjacent to the vehicle's own roadway from image information alone is often not reliably possible. An adjacent lane could, for example, have a different surface than the vehicle's own lane and yet appear in the image like an unpaved strip of sand.
- This problem of separating and detecting different areas also occurs with the stereo-vision methods often used today, which calculate a spatial (3D) coordinate for each pixel of images captured with a 3D camera. With the aid of these methods, it is in principle possible to separate raised objects from the ground plane, as indicated by way of example in Fig. 2.
- In Fig. 2, the raised objects 20 and 22 can be separated from the ground plane 12.
- If, however, the ground plane 12 comprises, for example, an asphalt road 14 bounded by meadows (right and left side areas 16 and 18), a reliable separation of these areas into trafficable and non-trafficable is often not possible with such methods.
- A separation of different areas even within a plane such as the ground plane 12 could in principle be carried out by means of a color-, intensity- or texture-based segmentation of mono images, i.e. a separation of areas within the ground plane 12 such as the asphalt road 14, the adjacent meadows 16 and 18, and the obstacles 20 (Fig. 2) and 30, 32 (Fig. 3) which do not protrude beyond the horizon line of the ground plane 12.
- A major disadvantage of this approach, however, is the high computational effort required, which has so far argued against its use in series production, especially in driver assistance systems.
- The method according to the invention therefore combines, in the following steps, a texture-based segmentation with stereo vision in order to obtain the advantages of both methods at reduced computational cost.
- The segmentation can be carried out only on the area which cannot be further subdivided by the stereo-vision method (only the ground plane 12 in Fig. 2 instead of the complete framed area including the objects).
- A segmented ground plane is obtained in which passable areas are separated from non-passable areas, the computational outlay being reduced compared to a segmentation of the full image.
- In step S12, the positions of pixels in space are calculated from a plurality of images captured by the stereo-vision camera with the aid of a stereo-vision approach. Based on these space points, the ground plane 12 is estimated in the next step S14. With the aid of the estimated ground plane, the pixels relevant for the segmentation of the ground plane 12 are determined in step S16. On these pixels, a segmentation of the ground plane 12 is performed in step S18. Since the number of pixels to be segmented is significantly lower than in the original image, the computational outlay for the segmentation step S18 is significantly reduced. The result is a segmentation of the ground plane 12 which provides additional information about passable and non-passable areas.
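- The patent does not prescribe how the ground plane is estimated from the space points in step S14; a RANSAC plane fit is a common choice and is sketched below under that assumption. The function names, the tolerance values and the pixel-mask interface for step S16 are illustrative.

```python
import numpy as np

def fit_ground_plane_ransac(points, n_iters=200, inlier_tol=0.05, seed=0):
    """Estimate a plane n.p + d = 0 from Nx3 space points (step S14) via RANSAC."""
    rng = np.random.default_rng(seed)
    best_inliers, best_model = None, None
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                      # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ sample[0]
        inliers = np.abs(points @ normal + d) < inlier_tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (normal, d)
    return best_model, best_inliers

def select_relevant_pixels(points_3d, pixel_coords, image_shape, tol=0.05):
    """Step S16: boolean mask of the pixels whose 3D point lies on the plane."""
    (normal, d), _ = fit_ground_plane_ransac(points_3d, inlier_tol=tol)
    on_plane = np.abs(points_3d @ normal + d) < tol
    mask = np.zeros(image_shape, dtype=bool)
    rows, cols = pixel_coords[on_plane].T    # (row, col) per space point
    mask[rows, cols] = True
    return mask
```

- The mask returned by select_relevant_pixels could then be passed to a feature-based segmentation such as the one sketched earlier, realizing step S18 on the reduced pixel set.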
- The segmentation of the selected pixels can be performed on the basis of color, intensity or texture.
- As additional information for the segmentation, for example, the variance of the height of the space points can be used (the variance is, for example, higher for a meadow next to the road than for a flat road surface), or a small height deviation can be used, as in the sketch below.
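- A minimal sketch of the height-variance cue follows; the grid size, the choice of the z axis as height and all interfaces are assumptions.

```python
import numpy as np

def height_variance_grid(points_3d, pixel_coords, image_shape, cell=16):
    """Per-cell variance of space-point heights; rough vegetation such as a
    meadow typically yields a higher variance than a flat road surface."""
    h = (image_shape[0] + cell - 1) // cell
    w = (image_shape[1] + cell - 1) // cell
    s, sq, n = np.zeros((h, w)), np.zeros((h, w)), np.zeros((h, w))
    rows = pixel_coords[:, 0] // cell
    cols = pixel_coords[:, 1] // cell
    z = points_3d[:, 2]                      # assumed height axis
    np.add.at(s, (rows, cols), z)            # accumulate sums per cell
    np.add.at(sq, (rows, cols), z * z)
    np.add.at(n, (rows, cols), 1)
    mean = np.divide(s, n, out=np.zeros_like(s), where=n > 0)
    return np.divide(sq, n, out=np.zeros_like(sq), where=n > 0) - mean ** 2
```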
- The technical advantage of this procedure is a segmentation of points in the ground plane on the basis of features in the image (e.g. intensity, color, texture). Compared to a segmentation of the full frame, computing time is saved by a suitable selection of pixels (the stereo-estimated ground plane), and additional information for the segmentation (e.g. the variance of the height of the space points) becomes available.
- Decisive here is the selection of the pixels to be segmented from the image with the aid of an estimate of the (relevant) ground plane, which is carried out using a stereo method.
- The different regions 14, 16 and 18 of the ground plane 12 (see Figs. 2 and 3) obtained by the segmentation in step S18 are output in a subsequent step S20 for further processing by a driver assistance system.
- The output areas are then analyzed in the subsequent steps S22 and S24 with regard to their trafficability.
- In step S22, obstacles 20 and 22 in the right and left side regions 16 and 18 of the road 14 (Fig. 2), or obstacles 30 and 32 in the right side region 16 of the road 14 (Fig. 3), are recognized, for example by a texture, color or intensity analysis. If obstacles are detected in an area, the corresponding area (area 16 or 18 with obstacle 20 or 22 in Fig. 2, or area 16 with obstacles 30 and 32 in Fig. 3) is marked as "not passable"; areas without obstacles are marked as "passable".
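- The marking logic of step S22 could look like the following sketch; the data structures are assumptions.

```python
import numpy as np

def mark_passability(region_labels, obstacle_mask):
    """region_labels: int array of segment ids per pixel (-1 = not on the
    ground plane); obstacle_mask: bool array, True where a raised object or
    obstacle was detected. Returns {segment id: "passable"/"not passable"}."""
    result = {}
    for rid in np.unique(region_labels):
        if rid < 0:                          # skip non-ground pixels
            continue
        in_region = region_labels == rid
        result[int(rid)] = ("not passable" if obstacle_mask[in_region].any()
                            else "passable")
    return result
```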
- In step S24, the trafficability assessment of the areas around the vehicle includes information as to whether a driving activity has already been perceived, or is currently being perceived, in the areas investigated.
- A driving activity can be determined by the vehicle's own sensors, e.g. by a camera-based object recognition, or else by fusion with other sensors, e.g. with the object list of a radar-based sensor.
- If a driving activity 28 has been detected in a region 18 of the image, then this region can with high probability be considered drivable.
- Observing the area over a longer period of time can contribute to a correspondingly higher confidence in this assessment, for example as in the sketch below.
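- One simple way to realize such long-term observation is a per-area confidence value that rises with observed driving activity and decays without it; the class below and its rates are illustrative assumptions.

```python
class ActivityObserver:
    """Accumulate confidence in an area's trafficability over many frames."""

    def __init__(self, gain=0.2, decay=0.02, threshold=0.8):
        self.gain, self.decay, self.threshold = gain, decay, threshold
        self.confidence = {}                 # area id -> confidence in [0, 1]

    def update(self, area_id, activity_seen: bool) -> bool:
        c = self.confidence.get(area_id, 0.0)
        c = min(1.0, c + self.gain) if activity_seen else max(0.0, c - self.decay)
        self.confidence[area_id] = c
        return c >= self.threshold           # True: area considered drivable
```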
- Step S24 is not limited to camera-based systems; the analysis and fused consideration of driving activity can be used in all trafficability assessment systems.
- The areas marked as "not passable" and "passable" can be further processed by the driver assistance system; in particular, they can be used to determine a possible alternative route in the event of an obstacle on the road 14. If an evasive maneuver is necessary, a determined alternative route can either be passively signaled to the driver, for example by a visual display or by a voice output similar to a navigation system, or it can be used for an active intervention in the vehicle control, for example to generate autonomous steering interventions for initiating and, if appropriate, performing an evasive maneuver.
- Fig. 4 shows a block diagram of a trafficability analysis device 100 according to the invention, which processes data from a stereo-vision camera with a first and a second camera 102 and 104. The two cameras 102 and 104 supply image data of the surroundings in front of a vehicle.
- These image data are supplied to a stereo-vision processing unit 106, which calculates the positions of pixels in space, i.e. executes the above-explained method step S12.
- The calculated pixel positions in space are transmitted to a ground plane estimation unit 108, which estimates a ground plane in the environment images from the obtained space points in accordance with method step S14 explained above.
- A relevant pixel selection unit 110 determines, on the basis of the ground plane estimated by the unit 108 and the image data from the two cameras 102 and 104, the relevant pixels for a segmentation of the ground plane (corresponding to method step S16 explained above).
- Based on the relevant pixels determined by the unit 110, an image segmentation unit 112 performs a segmentation of the ground plane (method step S18).
- The different areas of the ground plane determined by the unit 112 are output by a ground plane area output unit 114, in a form suitable for further processing, to a trafficability analysis unit 116, which analyzes each of the output areas with regard to trafficability (corresponding to method steps S22 and S24 explained above) and outputs the result of the analysis, for example in the form of a list of areas each marked as "passable" or "not passable".
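- The chain of units 106-116 can be read as a simple processing pipeline; the sketch below only illustrates this data flow, with every stage passed in as an assumed callable rather than a real interface of the device.

```python
from typing import Callable, Dict

def trafficability_pipeline(left_img, right_img,
                            stages: Dict[str, Callable]) -> Dict[int, str]:
    """Mirror of Fig. 4 with units 106-116 as injected callables."""
    points_3d, pixel_coords = stages["stereo"](left_img, right_img)   # unit 106, S12
    plane = stages["estimate_plane"](points_3d)                       # unit 108, S14
    mask = stages["select_pixels"](points_3d, pixel_coords, plane)    # unit 110, S16
    regions = stages["segment"](left_img, mask)                       # unit 112, S18
    return stages["analyze"](regions)                                 # units 114/116, S20-S24
```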
- These lists can be further processed by a driver assistance system as described above.
- The device shown in Fig. 4 may be implemented in hardware and/or software.
- It may be implemented in the form of an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a microprocessor or controller that executes firmware implementing the method shown in Fig. 1.
- The present invention thus enables a computationally efficient trafficability analysis, especially for use in driver assistance systems.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Traffic Control Systems (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP13826921.2A EP2932434A1 (de) | 2012-12-11 | 2013-12-06 | Verfahren und vorrichtung zur befahrbarkeitsanalyse |
| DE112013005909.6T DE112013005909A5 (de) | 2012-12-11 | 2013-12-06 | Verfahren und Vorrichtung zur Befahrbarkeitsanalyse |
| US14/647,958 US9690993B2 (en) | 2012-12-11 | 2013-12-06 | Method and device for analyzing trafficability |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102012112104.4A DE102012112104A1 (de) | 2012-12-11 | 2012-12-11 | Verfahren und vorrichtung zur befahrbarkeitsanalyse |
| DE102012112104.4 | 2012-12-11 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014090245A1 true WO2014090245A1 (de) | 2014-06-19 |
Family
ID=50064320
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/DE2013/200336 Ceased WO2014090245A1 (de) | Verfahren und vorrichtung zur befahrbarkeitsanalyse | 2012-12-11 | 2013-12-06 |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US9690993B2 (de) |
| EP (1) | EP2932434A1 (de) |
| DE (2) | DE102012112104A1 (de) |
| WO (1) | WO2014090245A1 (de) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3054400A1 (de) * | 2015-02-09 | 2016-08-10 | Toyota Jidosha Kabushiki Kaisha | Fahrstrassenoberflächenerkennungsvorrichtung und fahrstrassenoberflächenerkennungsverfahren |
Families Citing this family (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10222932B2 (en) | 2015-07-15 | 2019-03-05 | Fyusion, Inc. | Virtual reality environment based manipulation of multilayered multi-view interactive digital media representations |
| US11006095B2 (en) | 2015-07-15 | 2021-05-11 | Fyusion, Inc. | Drone based capture of a multi-view interactive digital media |
| US10147211B2 (en) | 2015-07-15 | 2018-12-04 | Fyusion, Inc. | Artificially rendering images using viewpoint interpolation and extrapolation |
| US11095869B2 (en) | 2015-09-22 | 2021-08-17 | Fyusion, Inc. | System and method for generating combined embedded multi-view interactive digital media representations |
| US12495134B2 (en) | 2015-07-15 | 2025-12-09 | Fyusion, Inc. | Drone based capture of multi-view interactive digital media |
| US10242474B2 (en) | 2015-07-15 | 2019-03-26 | Fyusion, Inc. | Artificially rendering images using viewpoint interpolation and extrapolation |
| US12261990B2 (en) | 2015-07-15 | 2025-03-25 | Fyusion, Inc. | System and method for generating combined embedded multi-view interactive digital media representations |
| US11783864B2 (en) * | 2015-09-22 | 2023-10-10 | Fyusion, Inc. | Integration of audio into a multi-view interactive digital media representation |
| US11202017B2 (en) | 2016-10-06 | 2021-12-14 | Fyusion, Inc. | Live style transfer on a mobile device |
| US10437879B2 (en) | 2017-01-18 | 2019-10-08 | Fyusion, Inc. | Visual search using multi-view interactive digital media representations |
| US20180227482A1 (en) | 2017-02-07 | 2018-08-09 | Fyusion, Inc. | Scene-aware selection of filters and effects for visual digital media content |
| US10313651B2 (en) | 2017-05-22 | 2019-06-04 | Fyusion, Inc. | Snapshots at predefined intervals or angles |
| US11069147B2 (en) | 2017-06-26 | 2021-07-20 | Fyusion, Inc. | Modification of multi-view interactive digital media representation |
| US10592747B2 (en) | 2018-04-26 | 2020-03-17 | Fyusion, Inc. | Method and apparatus for 3-D auto tagging |
| US12216195B2 (en) * | 2019-01-07 | 2025-02-04 | Ainstein Ai, Inc. | Radar-camera detection system and methods |
| EP3991089A1 (de) * | 2019-06-27 | 2022-05-04 | Zenuity AB | Verfahren und system zur schätzung einer antreibbaren oberfläche |
| CN112396051B (zh) * | 2019-08-15 | 2024-05-03 | 纳恩博(北京)科技有限公司 | 可通行区域的确定方法及装置、存储介质、电子装置 |
| EP4357944A1 (de) * | 2022-10-20 | 2024-04-24 | Zenseact AB | Identifizierung unbekannter verkehrsobjekte |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP1684142A1 (de) * | 2005-01-20 | 2006-07-26 | Robert Bosch Gmbh | Verfahren zur Kursprädiktion in Fahrerassistenzsystemen für Kraftfahrzeuge |
| DE102005045017A1 (de) * | 2005-09-21 | 2007-03-22 | Robert Bosch Gmbh | Verfahren und Fahrerassistenzsystem zur sensorbasierten Anfahrtsteuerung eines Kraftfahrzeugs |
| US20100013615A1 (en) * | 2004-03-31 | 2010-01-21 | Carnegie Mellon University | Obstacle detection having enhanced classification |
| US20100104199A1 (en) * | 2008-04-24 | 2010-04-29 | Gm Global Technology Operations, Inc. | Method for detecting a clear path of travel for a vehicle enhanced by object detection |
Family Cites Families (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7783403B2 (en) * | 1994-05-23 | 2010-08-24 | Automotive Technologies International, Inc. | System and method for preventing vehicular accidents |
| US7630806B2 (en) * | 1994-05-23 | 2009-12-08 | Automotive Technologies International, Inc. | System and method for detecting and protecting pedestrians |
| US8140358B1 (en) * | 1996-01-29 | 2012-03-20 | Progressive Casualty Insurance Company | Vehicle monitoring system |
| US6104812A (en) * | 1998-01-12 | 2000-08-15 | Juratrade, Limited | Anti-counterfeiting method and apparatus using digital screening |
| US20040247157A1 (en) * | 2001-06-15 | 2004-12-09 | Ulrich Lages | Method for preparing image information |
| JP3960092B2 (ja) * | 2001-07-12 | 2007-08-15 | 日産自動車株式会社 | 車両用画像処理装置 |
| EP1671216B1 (de) * | 2003-10-09 | 2016-12-07 | Honda Motor Co., Ltd. | Detektion beweglicher objekte durch verwendung von computer-vision mit der fähigkeit für niedrige beleuchtungstiefe |
| FR2898986B1 (fr) * | 2006-03-24 | 2008-05-23 | Inrets | Detection d'obstacle |
| US8340421B2 (en) * | 2008-02-04 | 2012-12-25 | Eyep Inc. | Three-dimensional system and method for connection component labeling |
| JP5216010B2 (ja) * | 2009-01-20 | 2013-06-19 | 本田技研工業株式会社 | ウインドシールド上の雨滴を同定するための方法及び装置 |
| JP4788798B2 (ja) * | 2009-04-23 | 2011-10-05 | トヨタ自動車株式会社 | 物体検出装置 |
| WO2012037528A2 (en) * | 2010-09-16 | 2012-03-22 | California Institute Of Technology | Systems and methods for automated water detection using visible sensors |
| JP2012253690A (ja) * | 2011-06-06 | 2012-12-20 | Namco Bandai Games Inc | プログラム、情報記憶媒体及び画像生成システム |
| CN103177236B (zh) * | 2011-12-22 | 2016-06-01 | 株式会社理光 | 道路区域检测方法和装置、分道线检测方法和装置 |
| CN104335264A (zh) * | 2012-06-14 | 2015-02-04 | 丰田自动车株式会社 | 车道划分标示检测装置、驾驶辅助系统 |
| JP5829980B2 (ja) * | 2012-06-19 | 2015-12-09 | トヨタ自動車株式会社 | 路側物検出装置 |
| US9488483B2 (en) * | 2013-05-17 | 2016-11-08 | Honda Motor Co., Ltd. | Localization using road markings |
| WO2015024257A1 (en) * | 2013-08-23 | 2015-02-26 | Harman International Industries, Incorporated | Unstructured road boundary detection |
- 2012-12-11 DE DE102012112104.4A patent/DE102012112104A1/de not_active Withdrawn
- 2013-12-06 DE DE112013005909.6T patent/DE112013005909A5/de active Pending
- 2013-12-06 WO PCT/DE2013/200336 patent/WO2014090245A1/de not_active Ceased
- 2013-12-06 EP EP13826921.2A patent/EP2932434A1/de not_active Ceased
- 2013-12-06 US US14/647,958 patent/US9690993B2/en active Active
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20100013615A1 (en) * | 2004-03-31 | 2010-01-21 | Carnegie Mellon University | Obstacle detection having enhanced classification |
| EP1684142A1 (de) * | 2005-01-20 | 2006-07-26 | Robert Bosch Gmbh | Verfahren zur Kursprädiktion in Fahrerassistenzsystemen für Kraftfahrzeuge |
| DE102005045017A1 (de) * | 2005-09-21 | 2007-03-22 | Robert Bosch Gmbh | Verfahren und Fahrerassistenzsystem zur sensorbasierten Anfahrtsteuerung eines Kraftfahrzeugs |
| US20100104199A1 (en) * | 2008-04-24 | 2010-04-29 | Gm Global Technology Operations, Inc. | Method for detecting a clear path of travel for a vehicle enhanced by object detection |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP2932434A1 * |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3054400A1 (de) * | 2015-02-09 | 2016-08-10 | Toyota Jidosha Kabushiki Kaisha | Fahrstrassenoberflächenerkennungsvorrichtung und fahrstrassenoberflächenerkennungsverfahren |
| US9971946B2 (en) | 2015-02-09 | 2018-05-15 | Toyota Jidosha Kabushiki Kaisha | Traveling road surface detection device and traveling road surface detection method |
Also Published As
| Publication number | Publication date |
|---|---|
| US9690993B2 (en) | 2017-06-27 |
| US20150324649A1 (en) | 2015-11-12 |
| DE102012112104A1 (de) | 2014-06-12 |
| DE112013005909A5 (de) | 2015-09-10 |
| EP2932434A1 (de) | 2015-10-21 |
Similar Documents
| Publication | Title | Publication Date |
|---|---|---|
| WO2014090245A1 (de) | Verfahren und vorrichtung zur befahrbarkeitsanalyse | |
| DE102013205950B4 (de) | Verfahren zum Detektieren von Straßenrändern | |
| DE102015105248B4 (de) | Verfahren und System zum Erzeugen eines Bildes der Umgebung eines Gelenkfahrzeugs | |
| DE102012221563B4 (de) | Funktionsdiagnose und validierung eines fahrzeugbasierten bildgebungssystems | |
| WO2014032904A1 (de) | Verfahren und vorrichtung zum erkennen einer position eines fahrzeugs auf einer fahrspur | |
| DE102016200828B4 (de) | Objekterfassungsvorrichtung und Objekterfassungsverfahren | |
| DE102016118502A1 (de) | Verfahren, Einrichtung und Vorrichtung zum Ermitteln einer Fahrbahngrenze | |
| DE112018007485T5 (de) | Straßenoberflächen-Detektionsvorrichtung, Bildanzeige-Vorrichtung unter Verwendung einer Straßenoberflächen-Detektionsvorrichtung, Hindernis-Detektionsvorrichtung unter Nutzung einer Straßenoberflächen-Detektionsvorrichtung, Straßenoberflächen-Detektionsverfahren, Bildanzeige-Verfahren unter Verwendung eines Straßenoberflächen-Detektionsverfahrens, und Hindernis-Detektionsverfahren unter Nutzung eines Straßenoberflächen-Detektionsverfahrens | |
| DE102015203016A1 (de) | Verfahren und Vorrichtung zur optischen Selbstlokalisation eines Kraftfahrzeugs in einem Umfeld | |
| EP3520023B1 (de) | Detektion und validierung von objekten aus sequentiellen bildern einer kamera | |
| WO2013029722A2 (de) | Verfahren zur umgebungsrepräsentation | |
| DE102015114403A1 (de) | Annäherungsobjekterfassungsvorrichtung für ein Fahrzeug und Annäherungsobjekterfassungsverfahren dafür | |
| DE102015115012A1 (de) | Verfahren zum Erzeugen einer Umgebungskarte einer Umgebung eines Kraftfahrzeugs anhand eines Bilds einer Kamera, Fahrerassistenzsystem sowie Kraftfahrzeug | |
| DE102018108751B4 (de) | Verfahren, System und Vorrichtung zum Erhalten von 3D-Information von Objekten | |
| EP2023265A1 (de) | Verfahren für eine Erkennung eines Gegenstandes | |
| DE102017117593A1 (de) | Fahrzeugfahrassistenzvorrichtung | |
| DE102013012930A1 (de) | Verfahren zum Bestimmen eines aktuellen Abstands und/oder einer aktuellen Geschwindigkeit eines Zielobjekts anhand eines Referenzpunkts in einem Kamerabild, Kamerasystem und Kraftfahrzeug | |
| DE102017103540A1 (de) | Ermitteln einer Winkelstellung eines Anhängers ohne Zielmarkierung | |
| WO2019057252A1 (de) | Verfahren und vorrichtung zum erkennen von fahrspuren, fahrerassistenzsystem und fahrzeug | |
| DE102018121008A1 (de) | Kreuzverkehrserfassung unter verwendung von kameras | |
| EP1944212B1 (de) | Verfahren und Vorrichtung zum Ermitteln potentiell gefährdender Objekte für ein Fahrzeug | |
| DE102019132012B4 (de) | Verfahren und System zur Detektion von kleinen unklassifizierten Hindernissen auf einer Straßenoberfläche | |
| DE102015211871A1 (de) | Objekterkennungsvorrichtung | |
| WO2019162327A2 (de) | Verfahren zur ermittlung einer entfernung zwischen einem kraftfahrzeug und einem objekt | |
| DE102021133089A1 (de) | Vorrichtung zur Ermittlung einer Topographie einer Fahrzeugumgebung, Fahrzeug und Verfahren |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13826921 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2013826921 Country of ref document: EP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 14647958 Country of ref document: US |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 112013005909 Country of ref document: DE Ref document number: 1120130059096 Country of ref document: DE |
|
| REG | Reference to national code |
Ref country code: DE Ref legal event code: R225 Ref document number: 112013005909 Country of ref document: DE |