
WO2018166747A1 - Improvements in vehicle control (Perfectionnements apportés à une commande de véhicule) - Google Patents


Info

Publication number
WO2018166747A1
WO2018166747A1 · PCT/EP2018/053853 · EP2018053853W
Authority
WO
WIPO (PCT)
Prior art keywords
region
vehicle
terrain
empty region
water
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2018/053853
Other languages
English (en)
Inventor
Jithesh KOTTERI
Bineesh RAVI
Jithin JAYARAJ
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to DE112018001378.2T priority Critical patent/DE112018001378T5/de
Publication of WO2018166747A1 publication Critical patent/WO2018166747A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • B60W30/14 Adaptive cruise control; B60W30/143 Speed control
    • G06T7/00 Image analysis
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle; G06V20/588 Recognition of the road, e.g. of lane markings
    • B60W2420/403 Image sensing, e.g. optical camera
    • B60W2552/00 Input parameters relating to infrastructure; B60W2552/15 Road slope, i.e. the inclination of a road segment in the longitudinal direction
    • B60W2554/00 Input parameters relating to objects
    • B60W40/06 Road conditions
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • H04N13/106 Processing image signals (stereoscopic and multi-view video)
    • H04N2013/0074 Stereoscopic image analysis

Definitions

  • the invention relates to a system for controlling a vehicle.
  • the invention relates to a system for controlling steering of a land-based vehicle which is capable of driving in a variety of different and extreme terrains and conditions.
  • a forward-looking camera system detects lane markings on the road ahead of the vehicle.
  • feedback in the form of an audible alarm or a haptic response, such as vibration of a steering wheel, is provided in the case that the vehicle deviates excessively from a notional lane centreline or crosses a lane marking.
  • Some steering control systems automatically control steerable road wheel steering angle in order to maintain a vehicle in-lane when driving on a highway by reference to lane markings.
  • known steering control systems are unusable in off-road driving environments where such systems may be of particular value in reducing driver fatigue.
  • Embodiments of the invention may provide an apparatus, a method or a vehicle which addresses this problem.
  • Other aims and advantages of the invention will become apparent from the following description, claims and drawings.
  • a computing system for a vehicle comprising a processing means arranged to receive, from terrain data capture means comprising a stereoscopic camera system, terrain information indicative of the topography of the terrain extending ahead of the vehicle, the terrain information including at least a stereo image pair of terrain ahead of the vehicle, wherein the processing means is configured to: compute a 3D point cloud dataset in respect of the terrain ahead of the vehicle based on the terrain information; identify a substantially continuous region of terrain ahead of the vehicle and within a predicted path of the vehicle, the region being considered to be an empty region if the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value; and determine whether the empty region corresponds to a body of water, as a region that satisfies the following criteria: the number of pixels in an image of that region captured by the camera system that correspond to specular reflection of a feature of the environment, including the terrain, exceeds a specular reflection density threshold value; and
  • a difference in height between at least two locations of the region is less than a location height difference threshold value
  • the system being configured to output a signal in dependence on whether terrain corresponding to a body of water has been identified ahead of the vehicle.
  • Embodiments of the present invention have the advantage that the presence of a body of water ahead of the vehicle may be reliably detected and an output signal generated in response. It is to be understood that, where a body of water is present, the body of water will typically specularly reflect terrain features above the water such as sky, trees, bushes or the like. The apparent distance of these terrain features from the terrain data capture means, based on disparity of the stereo image pair, will be greater than the distance of the body of water from the terrain data capture means. Accordingly, the region of the 3D point cloud corresponding to the location of the body of water may appear to have a lower density of datapoints than would be the case if the body of water were a non-specularly reflecting body such as a body of earth.
  • Embodiments of the present invention attempt to detect the presence of a body of water based on typical characteristics of such a body, being the presence of specular reflections by the body of water in a captured image of the body of water and the characteristic of flatness typically associated with a body of water.
  • the system may be configured wherein, in order for the system to determine that an empty region corresponds to a body of water, the further condition must be met that the difference in height between at least two locations on substantially opposite sides of the region is less than a location height difference threshold value.
  • 'opposite sides' may be taken to be points along a given column or row of pixels, that includes a portion of the body of water, at which the body of water meets ground.
  • the pixels whose height is considered may be pixels closest to the empty region, on opposite sides of the empty region, along the column or row.
  • the system may be configured to determine the slope of terrain to be experienced by the vehicle immediately before the empty region within the predicted path, wherein if the slope exceeds a predetermined slope value the system determines that the region does not correspond to a body of water.
  • the system may be further configured to determine the slope of terrain to be experienced by the vehicle immediately before the empty region within the predicted path, wherein if the slope exceeds a predetermined slope value the system determines that the region does not correspond to a body of water and may correspond to a crest of a slope.
  • the system may be further configured to identify, in at least one of the stereoscopic image pairs corresponding to the 3D point cloud, the empty region, and determine whether the empty region corresponds to a body of water in further dependence at least in part on RGB (red, green, blue) colour values of at least one image pixel corresponding to the empty region.
  • the system may be configured to determine whether the empty region corresponds to a body of water in dependence at least in part on an average RGB (red, green, blue) colour value over a plurality of image pixels corresponding to the empty region.
  • the system may be configured to determine that the empty region does not correspond to a body of water unless the average RGB (red, green, blue) colour value over a plurality of image pixels corresponding to the empty region is above a predetermined minimum average RGB colour value and less than a predetermined maximum average RGB colour value.
  • the system being configured to determine whether the number of pixels in an image of the empty region captured by the camera system that correspond to specular reflection of a feature of the environment, including the terrain, exceeds a specular reflection density threshold value, comprises the system being configured to determine the disparity between corresponding pixels of the left and right stereo image pair corresponding to the empty region, the number of pixels corresponding to specular reflection being determined in dependence on the disparity between the corresponding pixels of the left and right image pair.
  • the system is further configured to require that a predetermined width condition must be met in respect of a width of the region of at least one of the stereoscopic image pairs corresponding to the empty region in order for the empty region to be determined to correspond to a body of water.
  • this requirement is a necessary condition, but not a sufficient one, for the empty region to correspond to a body of water.
  • the predetermined width condition comprises the condition that a width or average width of at least a portion of the image that includes the empty region exceeds a predetermined threshold width value.
  • the system may be configured to require that at least a portion of the empty region has a width exceeding the threshold width value.
  • the portion may be a polygonal portion defined by the system by extraction of pixels of the left or right image of the stereoscopic image pair corresponding to the empty region.
  • the system may comprise a speed controller, the speed controller being configured to control vehicle speed based at least in part on whether a body of water has been identified in the path of the vehicle.
  • the speed controller may receive the signal that is output by the computing system in dependence on whether terrain corresponding to a body of water has been identified in the path of the vehicle, and control vehicle speed based on the signal.
  • the system may be configured to provide an output to the speed controller indicative of a maximum recommended speed in dependence at least in part on whether a body of water has been identified in the path of the vehicle.
  • the system may be configured to provide an alert to a driver in dependence on whether a body of water has been identified in the path of the vehicle.
  • the terrain data capture means comprises a stereoscopic camera system.
  • the system may further comprise the terrain data capture means.
  • the system comprises an electronic processor having an electrical input for receiving the terrain information indicative of the topography of terrain ahead of the vehicle, and an electronic memory device coupled to the processor and having instructions stored therein;
  • the processor is configured to access the memory device and execute the instructions stored therein such that it is operable to:
  • a difference in height between at least two locations of the region is less than a location height difference threshold value
  • a vehicle comprising a system according to another aspect.
  • a method of identifying terrain corresponding to a body of water implemented by means of a computing system of a vehicle,
  • the method comprising receiving, from terrain data capture means (185C) comprising a stereoscopic camera system, terrain information indicative of the topography of the terrain extending ahead of the vehicle (100), the terrain information including at least a stereo image pair of terrain ahead of the vehicle, the method comprising:
  • a difference in height between at least two locations of the region is less than a location height difference threshold value
  • the method comprising outputting a signal in dependence on whether terrain corresponding to a body of water has been identified ahead of the vehicle.
  • a non-transitory computer readable medium loaded with the computer program product of another aspect.
  • a processor arranged to implement the method of another aspect, or the computer program product of another aspect.
  • FIGURE 1 is a schematic illustration of a vehicle according to an embodiment of the invention in plan view
  • FIGURE 3 is a high level schematic diagram of an embodiment of the vehicle speed control system of the present invention, including a cruise control system and a low-speed progress control system;
  • FIGURE 9 is a diagram mapping a domain of values of speckleCount and average intensity of R, G, B colour values determined in accordance with an embodiment of the present invention that are consistent with the presence of a ditch or body of water in a stereoscopic image pair of terrain ahead of a vehicle; and
  • FIGURE 10 is a flow diagram illustrating operation of a computing system of the vehicle of the embodiment of FIG. 1.
  • references herein to a block such as a function block are to be understood to include reference to software code for performing the function or action specified which may be an output that is provided responsive to one or more inputs.
  • the code may be in the form of a software routine or function called by a main computer program, or may be code forming part of a flow of code not being a separate routine or function.
  • Reference to function block is made for ease of explanation of the manner of operation of embodiments of the present invention.
  • the nature of the terrain over which the vehicle is travelling may also be utilised in the LSP control system 12 to determine an appropriate increase or decrease in vehicle speed. For example, if the user selects a value of user set-speed that is not suitable for the nature of the terrain over which the vehicle is travelling, the system 12 is operable to automatically adjust the value of LSP_set-speed to a value lower than user set-speed. In some cases, for example, the user selected speed may not be achievable or appropriate over certain terrain types, particularly in the case of uneven or rough surfaces.
  • Y is an axis oriented in an upward direction with respect to the vehicle 100, corresponding to a substantially vertically upward direction when the vehicle 100 is parked on level ground, and Z is parallel to or coincident with a longitudinal axis of the vehicle 100, along the direction of travel of the vehicle 100.
  • the processing portion 19 determines a predicted path of the vehicle 100 based on the instant value of steering angle of the vehicle 100.
  • the steering angle is considered to be directly related to the rotational angle of the steering wheel 171 of the vehicle 100.
  • the predicted path is considered to be a path that will be followed by the vehicle 100 if the vehicle continues moving with the instant steering angle and with substantially no slip or skid of the wheels of the vehicle 100. It is to be understood that the predicted path swept by the wheelbase of the vehicle 100 may be considered to be an area of width substantially equal to a maximum track of the vehicle (maximum lateral distance between left and right front or rear wheels) and centred on a centreline of the vehicle 100.
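The swept-corridor construction described above can be sketched as follows. The simple bicycle-model geometry, the step and length values, and the function name are illustrative assumptions for this sketch, not details taken from the patent.

```python
import math

def predicted_path(steering_angle_rad, wheelbase_m, track_m,
                   step_m=0.5, length_m=20.0):
    """Sketch of the predicted path: centreline points (x lateral,
    y forward, in metres) swept by the vehicle if it holds the instant
    steering angle with no wheel slip (simple bicycle model), plus the
    half-width of the swept corridor (half the maximum track)."""
    half_width = track_m / 2.0
    n = int(length_m / step_m)
    if abs(steering_angle_rad) < 1e-6:
        # Straight ahead: points spaced along the longitudinal (Z) axis.
        return [(0.0, i * step_m) for i in range(n + 1)], half_width
    # Turning: centreline follows an arc of radius wheelbase / tan(angle).
    radius = wheelbase_m / math.tan(steering_angle_rad)
    points = []
    for i in range(n + 1):
        theta = (i * step_m) / radius         # arc angle travelled
        x = radius * (1.0 - math.cos(theta))  # lateral offset
        y = radius * math.sin(theta)          # forward distance
        points.append((x, y))
    return points, half_width
```

A cell of the elevation map then lies within the predicted path if its lateral distance from the nearest centreline point is less than the returned half-width.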
  • the processing portion 19 identifies substantially continuous regions or areas of terrain that are defined by empty cells not being shadow cells and which have at least a portion that lies within the predicted path swept by the vehicle 100. These regions or areas are referred to herein as 'empty regions'. Thus, each empty region is defined by a substantially continuous region of empty cells not being shadow cells and which have at least a portion that lies within the predicted path swept by the vehicle 100.
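One plausible way to extract such empty regions from a grid of cell labels is a flood fill over connected empty cells, keeping only regions with at least one cell inside the predicted path. The cell labels ('E' empty, 'S' shadow, 'G' ground), the 4-connectivity, and the function name are illustrative choices; the patent does not specify the extraction algorithm.

```python
from collections import deque

def find_empty_regions(grid, in_path):
    """Label substantially continuous regions of empty cells ('E'),
    ignoring shadow cells ('S'), and keep only regions that have at
    least one cell within the predicted path.  `grid` is a 2D list of
    cell labels; `in_path` is a same-shaped 2D list of booleans."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 'E' or seen[r][c]:
                continue
            # Breadth-first flood fill over 4-connected empty cells.
            queue, cells, touches_path = deque([(r, c)]), [], False
            seen[r][c] = True
            while queue:
                y, x = queue.popleft()
                cells.append((y, x))
                touches_path |= in_path[y][x]
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and grid[ny][nx] == 'E' and not seen[ny][nx]):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if touches_path:
                regions.append(cells)
    return regions
```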
  • a measurement of the slope of the terrain immediately ahead of the empty region may provide a useful indication whether the empty region corresponds to a body of water or to a region behind the crest of a slope.
  • FIG. 6(a) is an example of an image captured by a left camera of the stereoscopic camera system 185C of the vehicle 100.
  • FIG. 6(b) is a representation of the corresponding elevation map, generated according to the MLS methodology, i.e. the map corresponding to that shown in FIG. 5 in respect of the image of FIG. 6(a).
  • regions identified by the processing portion 19 as corresponding to regions of different type are shown with different contrast or grayscale.
  • Regions corresponding to obstacles OB are shown hatched and labelled OB, regions corresponding to obstacle shadows are labelled SH, and regions corresponding to empty cells not being obstacle shadow cells are labelled ER.
  • the left and right wheel tracks are shown superimposed on the map.
  • the corresponding features are indicated in the camera image of FIG. 6(a) for ease of correlation by the reader.
  • the terrain immediately preceding the region is considered to be a crest region, corresponding to the crest of a slope and not a body of water.
  • the predetermined crest gradient value is set to substantially 10% and the predetermined crest width value to substantially 2 m.
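The crest test of this embodiment reduces to two threshold comparisons, sketched below with the values stated above (substantially 10% gradient, substantially 2 m width). The function name and parameter names are illustrative.

```python
def is_crest(approach_gradient, empty_region_width_m,
             crest_gradient=0.10, crest_width_m=2.0):
    """Treat the terrain immediately before the empty region as the
    crest of a slope (and therefore not a body of water) when the
    approach slope exceeds the predetermined crest gradient and the
    empty region is wider than the predetermined crest width."""
    return (approach_gradient > crest_gradient
            and empty_region_width_m > crest_width_m)
```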
  • the processing portion 19 attempts to determine whether the region may correspond to a body of water or a ditch. In order to do this, the processing portion 19 identifies, in one of the pairs of 2D images captured by the stereoscopic camera system 185C, a region that corresponds to the empty region ER identified in the MLS map. The processing portion 19 extracts from this 2D image the region corresponding to the empty region ER and, in addition, an area of the image immediately surrounding region ER. This is so as to attempt to include more reflections from objects in the body of water, if indeed the empty region ER does correspond to a body of water.
  • the processing portion 19 attempts to determine the number of pixels in the extended cropped polygonal region that correspond to the reflection of an image of the environment, such as an image of a tree (being a near object) or sky (being a far object).
  • Such pixels are referred to herein as reflection pixels, and the number of pixels is referred to as the 'speckle count' for the extended cropped polygonal area.
  • the presence of reflection pixels is determined by considering the corresponding region of a disparity image, being a dataset of similar dimensions to each of the image pairs captured by the stereoscopic camera system, where the datapoints of the dataset correspond to the distance (disparity) between corresponding pixels of the respective images. It is to be understood that the disparity between pixels corresponding to a given point on an object will be higher the closer the object is to the camera system 185C. Thus, the disparity map will be expected to show relatively low disparity values in respect of reflection pixels in the empty region.
  • the processing portion 19 is configured to identify reflection pixels in respect of far objects by identifying pixels having a disparity corresponding to a distance of at least 40m from the camera system 185C.
  • the following equation may be employed:
  • disparity ≈ (focal length × baseline) / depth, where disparity and focal length are in units of pixels, whilst baseline and depth are in units of metres.
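The relation above, together with the 40 m far-object distance described earlier, gives a simple test for far-reflection pixels: any pixel whose disparity implies a depth of at least 40 m is counted as a specular reflection of a far object such as sky. The focal length and baseline values used in the sketch are placeholders, not parameters of the actual camera system 185C.

```python
def expected_disparity_px(focal_length_px, baseline_m, depth_m):
    """disparity ≈ focal_length * baseline / depth, with disparity and
    focal length in pixels and baseline and depth in metres."""
    return focal_length_px * baseline_m / depth_m

def is_far_reflection(disparity_px, focal_length_px, baseline_m,
                      far_distance_m=40.0):
    """A pixel whose disparity corresponds to a depth of at least the
    far-object distance (40 m in this embodiment) is treated as a
    specular reflection of a far object."""
    return disparity_px <= expected_disparity_px(
        focal_length_px, baseline_m, far_distance_m)
```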
  • the processing portion 19 identifies reflection pixels in respect of near objects. The processing portion does this by considering the disparity values for each pixel along each substantially vertical column of the image, i.e. from the top of the image to the bottom of the image (or vice versa). The processing portion 19 checks the variation in the disparity values from top to bottom (or from bottom to top). If no reflections are present in the image, the disparity values will gradually increase from top to bottom (or decrease from bottom to top), i.e. regions of the image corresponding to objects closer to the camera will have higher disparity than those corresponding to objects further away from the camera. However, if reflections are present, the disparity values may decrease rather than increase over at least some pixels moving from top to bottom.
  • the processing portion 19 considers the disparity values at the top and bottom of each column of the image, which are considered to correspond to regions that are not part of a body of water, such as ground or objects bordering a puddle. The processing portion 19 then determines the number of pixels in a given column for which the disparity is less than the disparity of the pixels at the top and bottom of the column. That is, pixels that represent an image of an object that is further away than objects corresponding to the pixels at the top and bottom of the 2D image. These pixels are considered to correspond to reflections from near objects.
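The per-column near-reflection count described above can be sketched as follows: interior pixels whose disparity is lower than both end pixels of the column appear further away than the ground bordering the candidate region, so they are counted as reflections of near objects. The function name is illustrative.

```python
def near_reflection_count(column_disparities):
    """Count pixels in one image column whose disparity is less than
    the disparity at both the top and bottom of the column (the end
    pixels are taken to border the candidate water region, e.g. ground
    or objects around a puddle)."""
    if len(column_disparities) < 3:
        return 0
    top, bottom = column_disparities[0], column_disparities[-1]
    limit = min(top, bottom)
    # Interior pixels apparently further away than both column ends
    # are treated as reflections of near objects (e.g. trees).
    return sum(1 for d in column_disparities[1:-1] if d < limit)
```

Summing this count over all columns of the empty region, together with the far-reflection pixel count, yields the speckleCount value.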
  • the processing portion 19 then sums the number of near and far reflection pixels identified in the empty region ER to obtain a 'speckleCount' value.
  • the processing portion 19 is configured to then determine whether the empty region ER corresponds to a body of water. The processing portion 19 does this by first eliminating 'dark' regions and 'over bright' regions from the 3D point cloud dataset as regions that might correspond to a body of water.
  • the average width, along the direction of travel of the vehicle, of the extended cropped polygonal region of the image is greater than a width threshold value, and the height difference (with respect to the MLS map) between cells at opposite ends of a column of cells spanning the extended polygonal region is less than a height difference threshold value.
  • the predetermined upper speckleCount threshold value is substantially 50 points per polygon region although other values may be useful.
  • the width threshold value is set to 20 pixels and the height difference threshold value is set to 0.25m.
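Putting the criteria of this embodiment together (speckleCount above substantially 50 points per polygon region, width above 20 pixels, height difference below 0.25 m, and prior elimination of dark and over-bright regions), a schematic classifier might look like the sketch below. The RGB limits are invented placeholders, since the patent does not disclose the brightness thresholds.

```python
def classify_empty_region(speckle_count, avg_rgb, width_px, height_diff_m,
                          speckle_threshold=50,     # points per polygon region
                          width_threshold_px=20,
                          height_threshold_m=0.25,
                          min_rgb=40.0, max_rgb=220.0):
    """Schematic combination of the embodiment's criteria: a flat,
    sufficiently wide empty region of acceptable brightness is a body
    of water when the speckleCount exceeds the threshold, and a ditch
    otherwise."""
    if not (min_rgb < avg_rgb < max_rgb):
        return 'not water'            # 'dark' or 'over bright' region
    if width_px <= width_threshold_px or height_diff_m >= height_threshold_m:
        return 'not water'            # too narrow or not flat enough
    return 'water' if speckle_count > speckle_threshold else 'ditch'
```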
  • the LSP control system 12 is configured to cause the speed of the vehicle 100 to reduce to a predetermined water crossing speed in the case that a body of water has been identified, or a predetermined ditch crossing speed in the case that a ditch has been identified.
  • the predetermined water crossing speed and the predetermined ditch crossing speed are each substantially 5 kph, although other values may be useful in some embodiments.
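The speed reduction by the LSP control system 12 amounts to capping the set-speed at the relevant crossing speed, sketched below with the substantially 5 kph value of this embodiment; the function and parameter names are illustrative.

```python
def limited_set_speed(lsp_set_speed_kph, detected_feature,
                      water_speed_kph=5.0, ditch_speed_kph=5.0):
    """When a body of water or a ditch has been identified in the
    vehicle's path, cap the LSP set-speed at the corresponding
    predetermined crossing speed; otherwise leave it unchanged."""
    if detected_feature == 'water':
        return min(lsp_set_speed_kph, water_speed_kph)
    if detected_feature == 'ditch':
        return min(lsp_set_speed_kph, ditch_speed_kph)
    return lsp_set_speed_kph
```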
  • FIG. 10 is a flow diagram illustrating operation of the stereoscopic camera system 185C and processing portion 19 of the vehicle 100 of FIG. 1.
  • the images are output to the processing portion 19 for step S103 although in some embodiments step S103 is performed by the camera system 185C.
  • the camera system 185C is configured repeatedly to capture stereoscopic image pairs and to output them to the processing portion 19.
  • image pairs are output to the processing portion 19 at a frame rate of 25 pairs per second. Other values may be useful such as 10 pairs per second, 12 pairs per second or any other suitable value.
  • in step S115 the processing portion determines whether the slope exceeds a predetermined crest gradient value and a width of the empty region exceeds a predetermined crest width value.
  • if the slope does exceed the predetermined crest slope value and the width of the empty region exceeds the predetermined crest width value, the processing portion 19 outputs a signal indicating that a crest has been detected ahead of the vehicle 100. The processing portion 19 also outputs a signal indicative of the distance of the crest from the vehicle 100. The processing portion 19 then continues at step S101.
  • the processing portion 19 achieves this by defining an extended polygonal region in the manner described above, and identifying the corresponding region in a disparity image generated based on the stereoscopic image pair of which the left image is currently being analysed.
  • the processing portion 19 identifies pixels of the left image pair corresponding to the empty region which correspond to a pixel of the disparity image having a disparity corresponding to a distance of more than a predetermined disparity image distance from the camera system 185C, in the present embodiment a distance of 40m. Other values may be useful in some embodiments. Such pixels are considered to correspond to specular reflections from far objects.
  • the processing portion 19 identifies pixels corresponding to specular reflections from near objects by considering the disparity values in columns of pixels of the left image corresponding to the empty region.
  • the processing portion 19 compares the disparity values of pixels in each column within the empty region with the values of disparity of the 'end' pixels of each column, i.e. the pixel at the top and bottom of each column, at the boundary of the empty region with respect to that column.
  • the processing portion then counts the number of pixels within the column that have disparity values that are less than the values of disparity of the end pixels. These pixels are considered to correspond to specular reflections from near objects.
  • the processing portion 19 then sums the number of pixels corresponding to specular reflections from near and far objects to generate the speckleCount value.
  • at step S123, if the processing portion 19 has determined that the empty region does correspond to a body of water, the vehicle 100 outputs a signal indicative of the presence of a body of water ahead of the vehicle 100 and a signal indicative of the distance of the empty region from the vehicle 100. The processing portion 19 then continues at step S101.
  • if at step S123 the processing portion 19 has determined that the empty region does not correspond to a body of water, the processing portion 19 continues at step S127.
  • at step S127, the processing portion outputs a signal indicating that a ditch is present ahead of the vehicle, and a signal indicative of the distance of the empty region from the vehicle 100. The processing portion 19 then continues at step S101.
  • Some embodiments of the present invention enable vehicle operation with enhanced composure when traversing terrain. This is at least in part due to a reduction in driver workload when operating with the LSP control system 12 active. This is because a driver is not required manually to reduce vehicle speed when approaching a terrain feature in the form of a body of water or ditch. Rather, the LSP control system 12 automatically causes a reduction in speed in response to the detection of the terrain feature.


Abstract

A computing system (10, 19, 185C) for a vehicle (100), the system comprising processing means (10, 19) configured to receive, from terrain data capture means (185C) comprising a stereoscopic camera system, terrain information indicative of the topography of the terrain extending ahead of the vehicle (100), the terrain information including at least a stereo image pair of terrain ahead of the vehicle, the processing means (10, 19) being configured to: compute a 3D point cloud dataset in respect of the terrain ahead of the vehicle based on the terrain information; identify a substantially continuous region of terrain ahead of the vehicle and within a predicted path of the vehicle, the region being considered to be an empty region if the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value; and determine whether the empty region corresponds to a body of water, as a region that satisfies the following criteria: the number of pixels in an image of that region captured by the stereoscopic camera system that correspond to specular reflection of a feature of the environment, including the terrain, exceeds a specular reflection density threshold value; and a difference in height between at least two locations of the region is less than a location height difference threshold value; the system being configured to output a signal in dependence on whether terrain corresponding to a body of water has been identified ahead of the vehicle.
PCT/EP2018/053853 2017-03-15 2018-02-16 Perfectionnements apportés à une commande de véhicule Ceased WO2018166747A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112018001378.2T DE112018001378T5 (de) 2017-03-15 2018-02-16 Verbesserungen bei der fahrzeugsteuerung

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201711008976 2017-03-15
IN201711008976 2017-03-15

Publications (1)

Publication Number Publication Date
WO2018166747A1 true WO2018166747A1 (fr) 2018-09-20

Family

ID=59065586

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/053853 Ceased WO2018166747A1 (fr) 2017-03-15 2018-02-16 Perfectionnements apportés à une commande de véhicule

Country Status (3)

Country Link
DE (1) DE112018001378T5 (fr)
GB (1) GB2563198B (fr)
WO (1) WO2018166747A1 (fr)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109657698A (zh) * 2018-11-20 2019-04-19 同济大学 一种基于点云的磁悬浮轨道障碍物检测方法
CN110009741A (zh) * 2019-06-04 2019-07-12 奥特酷智能科技(南京)有限公司 一种在Unity中生成环境点云地图的方法
CN110032193A (zh) * 2019-04-30 2019-07-19 盐城工业职业技术学院 一种智能拖拉机田间避障控制系统及方法
CN110111413A (zh) * 2019-04-08 2019-08-09 西安电子科技大学 一种基于水陆共存场景的稀疏点云三维模型建立方法
CN111084711A (zh) * 2019-12-25 2020-05-01 清华大学 一种基于主动视觉引导的导盲杖地形探测方法
WO2020107151A1 (fr) * 2018-11-26 2020-06-04 Beijing Didi Infinity Technology And Development Co., Ltd. Systèmes et procédés de gestion d'une carte en haute définition
CN113012187A (zh) * 2019-12-19 2021-06-22 动态Ad有限责任公司 使用曲面拟合的前景提取
CN113257022A (zh) * 2020-02-13 2021-08-13 奥迪股份公司 辅助驾驶装置、相应的方法、车辆、计算机设备和介质
CN113378647A (zh) * 2021-05-18 2021-09-10 浙江工业大学 基于三维点云的实时轨道障碍物检测方法
CN113884068A (zh) * 2020-07-02 2022-01-04 阿尔卑斯阿尔派株式会社 障碍物检测装置、障碍物检测方法以及存储介质
JP2022013536A (ja) * 2020-07-02 2022-01-18 アルプスアルパイン株式会社 障害物検出装置、障害物検出方法および障害物検出プログラム
US20230030503A1 (en) * 2021-07-30 2023-02-02 Hyundai Motor Company Apparatus for controlling stop of vehicle and method thereof
US20230221410A1 (en) * 2020-04-03 2023-07-13 Hitachi Astemo, Ltd. Object sensing device and object sensing method
CN116639105A (zh) * 2023-06-25 2023-08-25 长城汽车股份有限公司 一种车辆及其车辆的控制方法和控制装置
US11792644B2 (en) 2021-06-21 2023-10-17 Motional Ad Llc Session key generation for autonomous vehicle operation
CN117565870A (zh) * 2024-01-16 2024-02-20 江苏智能无人装备产业创新中心有限公司 一种越野无人车辆坡道路段超低车速预测控制方法

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2584383B (en) * 2019-02-08 2022-06-15 Jaguar Land Rover Ltd Vehicle control system and method
EP4094096A1 (fr) * 2020-01-24 2022-11-30 Outsight Procédé et système de génération d'une carte tridimensionnelle colorée

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7349776B2 (en) 2002-04-18 2008-03-25 Ford Global Technologies, Llc Vehicle control
GB2492655A (en) 2011-07-04 2013-01-09 Land Rover Uk Ltd Vehicle control system that evaluates driving condition indicator(s) to determine the most appropriate control mode
GB2499279A (en) 2012-02-13 2013-08-14 Jaguar Cars A driver advice system for a vehicle
GB2499461A (en) 2012-02-20 2013-08-21 Jaguar Cars Cruise control which sets a speed ceiling when off-road driving is detected
GB2507622A (en) 2012-08-16 2014-05-07 Jaguar Land Rover Ltd Vehicle speed control system which maintains target speed independently of slip
GB2508464A (en) 2012-08-16 2014-06-04 Jaguar Land Rover Ltd Vehicle speed control system that adjusts the target speed based on comfort level defined by movement of a portion of the vehicle body or of an occupant
DE102012112164A1 (de) * 2012-12-12 2014-06-12 Continental Teves Ag & Co. Ohg Videobasierte erkennung von hindernissen auf einer fahrbahn
WO2014139875A1 (fr) 2013-03-15 2014-09-18 Jaguar Land Rover Limited Système et procédé de régulation de vitesse de véhicule
US20150356357A1 (en) * 2013-01-24 2015-12-10 Isis Innovation Limited A method of detecting structural parts of a scene

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012037528A2 (fr) * 2010-09-16 2012-03-22 California Institute Of Technology Systèmes et procédés de détection d'eau automatisée utilisant des capteurs visibles

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7349776B2 (en) 2002-04-18 2008-03-25 Ford Global Technologies, Llc Vehicle control
GB2492655A (en) 2011-07-04 2013-01-09 Land Rover Uk Ltd Vehicle control system that evaluates driving condition indicator(s) to determine the most appropriate control mode
GB2492748A (en) 2011-07-04 2013-01-16 Land Rover Uk Ltd Vehicle control system that evaluates driving condition indicators
GB2499279A (en) 2012-02-13 2013-08-14 Jaguar Cars A driver advice system for a vehicle
GB2499461A (en) 2012-02-20 2013-08-21 Jaguar Cars Cruise control which sets a speed ceiling when off-road driving is detected
WO2013124321A1 (fr) 2012-02-20 2013-08-29 Jaguar Land Rover Limited Procédé de commande de la vitesse destiné à un véhicule
GB2507622A (en) 2012-08-16 2014-05-07 Jaguar Land Rover Ltd Vehicle speed control system which maintains target speed independently of slip
GB2508464A (en) 2012-08-16 2014-06-04 Jaguar Land Rover Ltd Vehicle speed control system that adjusts the target speed based on comfort level defined by movement of a portion of the vehicle body or of an occupant
DE102012112164A1 (de) * 2012-12-12 2014-06-12 Continental Teves Ag & Co. Ohg Videobasierte erkennung von hindernissen auf einer fahrbahn
US20150356357A1 (en) * 2013-01-24 2015-12-10 Isis Innovation Limited A method of detecting structural parts of a scene
WO2014139875A1 (fr) 2013-03-15 2014-09-18 Jaguar Land Rover Limited Système et procédé de régulation de vitesse de véhicule

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS, SPIE, PO BOX 10 BELLINGHAM WA 98227-0010 USA, 1 January 2009 (2009-01-01), XP040496947 *
RALPH AESCHIMANN ET AL: "Ground or obstacles? Detecting clear paths in vehicle navigation", 2015 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 1 May 2015 (2015-05-01), pages 3927 - 3934, XP055249069, ISBN: 978-1-4799-6923-4, DOI: 10.1109/ICRA.2015.7139747 *

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109657698B (zh) * 2018-11-20 2021-09-03 同济大学 一种基于点云的磁悬浮轨道障碍物检测方法
CN109657698A (zh) * 2018-11-20 2019-04-19 同济大学 一种基于点云的磁悬浮轨道障碍物检测方法
WO2020107151A1 (fr) * 2018-11-26 2020-06-04 Beijing Didi Infinity Technology And Development Co., Ltd. Systèmes et procédés de gestion d'une carte en haute définition
CN110111413A (zh) * 2019-04-08 2019-08-09 西安电子科技大学 一种基于水陆共存场景的稀疏点云三维模型建立方法
CN110032193A (zh) * 2019-04-30 2019-07-19 盐城工业职业技术学院 一种智能拖拉机田间避障控制系统及方法
CN110032193B (zh) * 2019-04-30 2020-07-03 盐城工业职业技术学院 一种智能拖拉机田间避障控制系统及方法
CN110009741A (zh) * 2019-06-04 2019-07-12 奥特酷智能科技(南京)有限公司 一种在Unity中生成环境点云地图的方法
CN110009741B (zh) * 2019-06-04 2019-08-16 奥特酷智能科技(南京)有限公司 一种在Unity中生成环境点云地图的方法
GB2591332B (en) * 2019-12-19 2024-02-14 Motional Ad Llc Foreground extraction using surface fitting
CN113012187A (zh) * 2019-12-19 2021-06-22 动态Ad有限责任公司 使用曲面拟合的前景提取
US11161525B2 (en) 2019-12-19 2021-11-02 Motional Ad Llc Foreground extraction using surface fitting
US11976936B2 (en) 2019-12-19 2024-05-07 Motional Ad Llc Foreground extraction using surface fitting
CN113012187B (zh) * 2019-12-19 2024-02-06 动态Ad有限责任公司 用于运载工具的方法和前景提取系统以及存储介质
GB2591332A (en) * 2019-12-19 2021-07-28 Motional Ad Llc Foreground extraction using surface fitting
CN111084711A (zh) * 2019-12-25 2020-05-01 清华大学 一种基于主动视觉引导的导盲杖地形探测方法
CN113257022A (zh) * 2020-02-13 2021-08-13 奥迪股份公司 辅助驾驶装置、相应的方法、车辆、计算机设备和介质
CN113257022B (zh) * 2020-02-13 2023-03-03 奥迪股份公司 辅助驾驶装置、相应的方法、车辆、计算机设备和介质
US20230221410A1 (en) * 2020-04-03 2023-07-13 Hitachi Astemo, Ltd. Object sensing device and object sensing method
US12481025B2 (en) * 2020-04-03 2025-11-25 Hitachi Astemo, Ltd. Object sensing device and object sensing method
US11763570B2 (en) * 2020-07-02 2023-09-19 Alps Alpine Co., Ltd. Obstacle detection device, obstacle detection method, and storage medium storing obstacle detection program
CN113884068A (zh) * 2020-07-02 2022-01-04 阿尔卑斯阿尔派株式会社 障碍物检测装置、障碍物检测方法以及存储介质
CN113884068B (zh) * 2020-07-02 2025-12-23 阿尔卑斯阿尔派株式会社 障碍物检测装置、障碍物检测方法以及存储介质
JP7460282B2 (ja) 2020-07-02 2024-04-02 アルプスアルパイン株式会社 障害物検出装置、障害物検出方法および障害物検出プログラム
JP2022013536A (ja) * 2020-07-02 2022-01-18 アルプスアルパイン株式会社 障害物検出装置、障害物検出方法および障害物検出プログラム
US20220004783A1 (en) * 2020-07-02 2022-01-06 Alps Alpine Co., Ltd. Obstacle detection device, obstacle detection method, and storage medium storing obstacle detection program
CN113378647B (zh) * 2021-05-18 2024-03-29 浙江工业大学 基于三维点云的实时轨道障碍物检测方法
CN113378647A (zh) * 2021-05-18 2021-09-10 浙江工业大学 基于三维点云的实时轨道障碍物检测方法
US11792644B2 (en) 2021-06-21 2023-10-17 Motional Ad Llc Session key generation for autonomous vehicle operation
US12054142B2 (en) * 2021-07-30 2024-08-06 Hyundai Motor Company Apparatus for controlling stop of vehicle and method thereof
US20230030503A1 (en) * 2021-07-30 2023-02-02 Hyundai Motor Company Apparatus for controlling stop of vehicle and method thereof
CN116639105A (zh) * 2023-06-25 2023-08-25 长城汽车股份有限公司 一种车辆及其车辆的控制方法和控制装置
CN117565870A (zh) * 2024-01-16 2024-02-20 江苏智能无人装备产业创新中心有限公司 一种越野无人车辆坡道路段超低车速预测控制方法
CN117565870B (zh) * 2024-01-16 2024-03-22 江苏智能无人装备产业创新中心有限公司 一种越野无人车辆坡道路段超低车速预测控制方法

Also Published As

Publication number Publication date
DE112018001378T5 (de) 2019-11-28
GB2563198A (en) 2018-12-12
GB2563198B (en) 2021-05-26
GB201707477D0 (en) 2017-06-21

Similar Documents

Publication Publication Date Title
WO2018166747A1 (fr) Perfectionnements apportés à une commande de véhicule
US11772647B2 (en) Control system for a vehicle
US11554778B2 (en) Vehicle speed control
US11603103B2 (en) Vehicle speed control
US11021160B2 (en) Slope detection system for a vehicle
US10611375B2 (en) Vehicle speed control
US11999378B2 (en) Control system for a vehicle
GB2577485A (en) Control system for a vehicle
WO2018007079A1 (fr) Perfectionnements apportés à la régulation de vitesse d'un véhicule
US10967846B2 (en) Vehicle speed control
GB2576265A (en) Improvements in vehicle speed control
GB2551711A (en) Improvements in vehicle speed control
GB2576450A (en) Improvements in vehicle speed control
GB2584587A (en) Control system for a vehicle
GB2552940A (en) Improvements in vehicle control
GB2577676A (en) Control system for a vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18707656

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 18707656

Country of ref document: EP

Kind code of ref document: A1