WO2018016394A1 - Road boundary estimation device and driving support system using the same - Google Patents
Road boundary estimation device and driving support system using the same
- Publication number
- WO2018016394A1 (PCT/JP2017/025356)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- boundary
- road
- feature
- value
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
Definitions
- The present invention relates to a road boundary estimation device and a driving support system using the same.
- The road boundary represents a boundary that separates the travelable area from the non-travelable area.
- The travelable area indicates an area where the host vehicle can travel safely without colliding with an obstacle or departing from the road.
- The road boundary, as viewed from the host vehicle, can be regarded as the edge of the road (typically formed by structures such as side walls, guardrails, or curbs, though sometimes unstructured, e.g., gravel) or the boundary with other obstacles on the road (vehicles, bicycles, pedestrians, etc.).
- LDP: Lane Departure Prevention
- LKS: Lane Keeping Support
- Prior Art Document 1 and Patent Document 1 show methods for estimating a road boundary by fitting a straight line at positions where there is a height difference between the inside and the outside of the boundary.
- Prior Art Document 2 shows a method for estimating a road boundary as an optimization problem of searching for the route that minimizes the cost in a certain cost space.
- Patent Document 2 shows a means of solving for the left and right road boundary positions, which extend in the vertical direction of the image, as an optimization problem based on an edge-like image feature called a monogenic signal, in order to detect the travelable region appropriately even when the boundary is unclear.
- A route is formed by connecting successive points; a route whose direction of continuation deviates from the tilt restriction is not considered a boundary of the travelable region, and its cumulative cost is not calculated.
- It is also described that "a boundary corresponding to a linear element in the image can be adopted more easily by adding a cost according to the angle formed by a local orientation vector and each route when searching for the optimum route during accumulation."
- The road boundary at a road edge tends to have a shape that simply extends continuously and monotonically in the depth direction, but where obstacles on the road lie within the sensor viewing angle, the boundary shape becomes discontinuous and complicated. Such a case arises, for example, when vehicles, bicycles, or pedestrians are present on the road.
- The road edge tends to be an object with weak features compared to the objects targeted by obstacle detection or lane detection, so the boundary position tends to be ambiguous. From the viewpoint of distance features, a low road edge such as a curb is weak compared to on-road obstacles such as vehicles and pedestrians, and is therefore hard to distinguish from the road surface. Likewise, from the viewpoint of image shading features, there is no clear shading edge at the boundary position, which is weak compared to a lane marker, so the edge is also hard to distinguish from the road surface.
- The road boundary must therefore be estimated at road edge positions where, as described above, the boundary easily becomes discontinuous and complicated in shape, and where the boundary feature amounts tend to be poor from the viewpoints of both distance and image.
- When the boundary is ambiguous in this way, the reliability of the result decreases, so it is important to output the estimated position only for sections in which a certain level of reliability can be ensured. Based on these assumptions, the problems of the prior art are described further below.
- In Prior Art Document 2, the positions and feature amounts of obstacles are expressed by an occupancy grid map generated from the distance image, and the road boundary is expressed as a path connecting grid positions of that map, so complex boundary shapes can also be expressed.
- However, the boundary is not estimated at road edges with low height, such as curbs, where the distance features of the obstacle are small, and tends instead to be estimated at the position of a large obstacle on the far side.
- Moreover, since a boundary line continuous across the sensor's lateral angular range is estimated, there is the problem that, when the boundary is ambiguous and reliability decreases, it is impossible to estimate only the boundary of highly reliable sections.
- In Patent Document 2, the estimation of the road boundary extending in the depth direction is presumably stabilized by imposing a tilt limit on the route extending in the longitudinal direction of the captured image, or by assigning a cost according to the tilt. However, since a shape extending over the full vertical extent of the image is estimated, the same problem remains: when the boundary is ambiguous and reliability decreases, it is impossible to estimate only the boundary of highly reliable sections.
- The present invention has been made in view of the above problems, and its object is to provide a road boundary estimation device that can estimate the road boundary position and shape with high reliability even when the road edge has weak distance or image-shading features or a discontinuous shape, and a driving support system using the same.
- To this end, a boundary feature map storing a road-boundary feature amount for each grid of an image is generated; road boundary candidates continuous in the depth direction are set from a detection start position, where the boundary first comes into view within a predetermined sensor viewing angle, to an arbitrary detection end position; and the candidate with the maximum boundary likelihood evaluation value, calculated by adding the feature amounts of the boundary feature map corresponding to each candidate's boundary shape and a continuity evaluation value in the depth direction, is determined as the road boundary.
- As a result, the road boundary position and shape can be estimated with high reliability even when the road edge has weak distance or image-shading features or a discontinuous shape.
- FIG. 11 is a configuration diagram of the second embodiment.
- FIG. 12 is a configuration diagram of the third embodiment.
- FIG. 2 schematically shows the system configuration of a driving support system to which Embodiment 1 of the road boundary estimation apparatus according to the present invention is applied.
- The illustrated driving support system 1 is mounted on a vehicle V such as an automobile and mainly includes a stereo camera device 11 composed of a plurality of cameras (two in this embodiment); the road boundary estimation device 10, which recognizes the road boundary around the vehicle V from the plurality of images captured in synchronization by the cameras of the stereo camera device 11; and a driving support control device 12, which supports the driving of the vehicle V via various devices mounted on the vehicle V (for example, the accelerator 13, brake 14, speaker 15, and steering 16) based on the estimation results of the road boundary estimation device 10.
- The stereo camera device 11 is installed facing the front of the vehicle V, for example at the upper front part of the windshield, and captures images ahead of the vehicle V to acquire image information.
- Each of the left and right cameras has an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) sensor, and the cameras are installed so as to image the area ahead of the vehicle V from positions separated from each other in the vehicle width direction (left-right direction).
- The road boundary estimation device 10 estimates the road boundary based on image information of the imaging target area ahead of the vehicle V, acquired in time series by the stereo camera device 11 at a predetermined period.
- Specifically, it estimates the road boundary at the position of a road edge having height, such as a guardrail, side wall, curb, bank, or grass, extending in the depth direction, or at the lateral edge of an obstacle on the road located to the side of the vehicle V.
- The road boundary estimation device 10 controls the stereo camera device 11 (for example, the imaging timing and exposure amounts of the right and left cameras) and includes a RAM (Random Access Memory) as a temporary storage area.
- FIG. 1 specifically shows the internal configuration of the road boundary estimation device shown in FIG. 2. Below, the means by which the road boundary estimation device 10 estimates the road boundary ahead of the vehicle V is described concretely.
- The road boundary estimation device includes a distance image acquisition unit 101, a boundary feature map generation unit 102, an optimum route search unit 106, and a road boundary determination unit 105.
- The optimum route search unit 106 consists of a boundary candidate setting unit 103 and a boundary likelihood evaluation unit 104.
- The distance image acquisition unit 101 uses the right and left camera images acquired from the stereo camera device 11 to generate a distance image (parallax image) in which the distance to the object captured at each pixel is stored.
- To do so, it divides one image (for example, the left image) into first blocks of a predetermined shape and, by block matching, scans the other image (for example, the right image) with a second block of the same size and shape, shifting the second block one pixel at a time in the horizontal direction; at each position it computes a correlation value between the luminance patterns of the first and second blocks and searches for the position with the lowest correlation value, that is, the highest correlation (corresponding point search).
- As the correlation value, for example, SAD (Sum of Absolute Differences), SSD (Sum of Squared Differences), NCC (Normalized Cross-Correlation), or a gradient method can be used.
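As an illustration of the corresponding point search described above, the following is a minimal sketch of SAD-based block matching in Python. It is not the patent's implementation; the block size, search range, and the left/right image roles are assumptions for illustration only:

```python
import numpy as np

def sad_disparity(left, right, block=7, max_disp=64):
    """Minimal SAD block-matching sketch: for each pixel of the left image,
    scan the right image along the same row and keep the horizontal shift
    (disparity) with the lowest SAD, i.e. the highest correlation."""
    h, w = left.shape
    r = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(r, h - r):
        for x in range(r, w - r):
            ref = left[y - r:y + r + 1, x - r:x + r + 1].astype(np.int32)
            best, best_d = None, 0
            for d in range(0, min(max_disp, x - r) + 1):
                cand = right[y - r:y + r + 1, x - d - r:x - d + r + 1].astype(np.int32)
                sad = np.abs(ref - cand).sum()  # lower SAD = better match
                if best is None or sad < best:
                    best, best_d = sad, d
            disp[y, x] = best_d
    return disp
```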
- The boundary feature map generation unit 102 generates a boundary feature map storing a boundary feature amount for each grid of a space overlooking the area in front of the stereo camera device 11.
- FIG. 3 shows a flowchart of this processing.
- The UD histogram generation process 21 generates a UD histogram that stores obstacle feature amounts at each distance.
- The UD histogram is an obstacle feature commonly used for detecting obstacles with a stereo camera; by processing this feature further, it is used as a boundary feature.
- The road surface feature suppression process 22 subtracts the obstacle feature amount corresponding to the road surface position, so that the boundary feature amount at the road surface position generally takes a negative value.
- The feature amount normalization process 23 processes the feature amounts output by the road surface feature suppression process 22 so that they fall within a predetermined value, normalizing them so that any obstacle above a certain height has a feature amount equivalent to that of an obstacle of that height.
- The background feature subtraction process 24 further subtracts the feature amount at positions that form the background of an obstacle, based on the feature amounts processed by the feature amount normalization process 23, so that the feature amount there is suppressed. As a result, at grid positions that are the background of a near-side obstacle and have smaller feature amounts than that obstacle, the feature amount is processed so as to generally take a negative value.
- The reason the road surface feature suppression process 22 and the background feature subtraction process 24 convert the feature amounts so that the obstacle background and the road surface readily take negative values is that the subsequent boundary likelihood evaluation unit 104 calculates the boundary likelihood evaluation value from these feature amounts, and the road boundary determination unit 105 determines the boundary shape at the position where that evaluation value is highest; negative feature amounts are therefore preferable at positions, such as the obstacle background and the road surface, that would constitute misestimations if estimated as the road boundary.
- Various orders of performing the road surface feature suppression process, the feature amount normalization process, and the background feature subtraction process are conceivable, and they may be performed in a different order. Each process is described in detail below.
- In the UD histogram generation process 21, each pixel of the distance image is scanned, and the number of distance measurements is accumulated in the grid of an image space (hereinafter, UD space) whose horizontal axis is the image horizontal position u and whose vertical axis is the parallax d, corresponding to the three-dimensional position measured at each pixel. This is equivalent to the occupancy grid map shown in Prior Art Document 2.
- A high measurement count in a grid suggests that a large obstacle exists at the position indicated by the grid coordinates, because on an obstacle surface perpendicular to the road surface the number of measurements at the same distance increases.
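A minimal sketch of the UD histogram accumulation in Python; the disparity quantization and map size are assumptions, since the patent gives no concrete implementation:

```python
import numpy as np

def ud_histogram(disparity, max_disp=64):
    """Accumulate, for each image column u and integer parallax d, the number
    of pixels measured at that (u, d) -- the UD histogram."""
    h, w = disparity.shape
    ud = np.zeros((max_disp, w), dtype=np.float32)  # rows: parallax d, cols: u
    for y in range(h):
        for x in range(w):
            d = int(round(disparity[y, x]))
            if 0 < d < max_disp:          # d == 0 means no valid measurement here
                ud[d, x] += 1.0
    return ud
```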
- In the road surface feature suppression process 22, the feature amounts are converted so that the boundary feature generally takes a negative value at the distance (parallax) corresponding to the road surface.
- At the distance corresponding to the road surface, there is no vertical obstacle surface, or the height of any step surface there is very small, so the observed obstacle feature amount is smaller than elsewhere: only a feature amount corresponding to the height of the small step's vertical surface is observed.
- For this purpose, for example, the expected value of the feature amount observed at the road surface position is subtracted from the feature value of each grid, setting the result to zero where the subtraction would make it negative; then a predetermined value is simply subtracted from each grid value.
- In the feature amount normalization process 23, the feature amounts are converted so that obstacles above a certain height have the same boundary-like feature amount. For this, for example, when the value of a grid exceeds a predetermined value, it is thresholded to that predetermined value.
- In the background feature subtraction process 24, the feature amounts are converted so that positions farther than the frontmost obstacle as seen from the sensor (hereinafter, the frontmost obstacle) readily take negative values. For this, for example, when the value of a grid exceeds a predetermined value, a value corresponding to that grid's feature amount is subtracted from the grids behind it.
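The three conversions can be sketched as array filters over the UD histogram. All constants here (the road-surface expectation, bias, clip level, and background scale) are illustrative assumptions, since the patent leaves them as design parameters:

```python
import numpy as np

def boundary_feature_map(ud, road_expect=2.0, bias=1.0, clip=8.0, bg_scale=0.5):
    """Road surface suppression, normalization, and background subtraction
    applied to a UD histogram ud[d, u] (d = parallax row, u = image column;
    larger d is nearer to the sensor)."""
    f = ud.copy()
    # 1) road surface feature suppression: subtract the expected road-surface
    #    response (clipping at zero), then a bias so road-surface grids go negative
    f = np.maximum(f - road_expect, 0.0) - bias
    # 2) normalization: obstacles above a certain height all get the same score
    f = np.minimum(f, clip)
    # 3) background feature subtraction: behind a strong near-side grid,
    #    push the far-side grids (smaller d) toward negative values
    for u in range(f.shape[1]):
        for d in range(f.shape[0] - 1, 0, -1):   # near (large d) -> far (small d)
            if f[d, u] > 0.0:
                f[:d, u] -= bg_scale * f[d, u]   # rows 0..d-1 are farther away
    return f
```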
- FIG. 4(c) shows an example of the boundary feature map obtained as described above.
- In the figure, a black grid indicates a feature amount of zero, and the closer to white, the larger the feature amount.
- The original image is shown in (a) of the figure, and the UD histogram (obstacle feature amounts) obtained by the UD histogram generation process 21 in (b). Comparing (b) and (c) shows that, in the boundary feature map, both the road surface features and the background features are suppressed.
- (d) shows a map visualizing the magnitude of the negative values of the boundary feature map. Black indicates zero or a positive value, and the closer to white, the larger the negative value. The figure shows that, in the boundary feature map, the far-side feature amounts take negative values according to the feature amounts of near-side obstacles.
- The optimum route search unit 106 comprehensively sets local routes, each connecting grids of the boundary feature map between lines adjacent in the parallax direction, and further comprehensively sets cumulative routes, each a combination of local routes.
- The processing of the boundary candidate setting unit 103 and that of the boundary likelihood evaluation unit 104 are in practice performed simultaneously, and are therefore described together below as the processing of the optimum route search unit 106.
- Local routes are set at each position in order from the lower end d_start (near side) toward the upper end d_end (far side) of the parallax direction; at each position, the cumulative route up to that position is set and its cumulative score is calculated. The optimum route is found by repeating this process line by line.
- In FIG. 5, the grids of the adjacent line that can be connected to an arbitrary grid (d_n, u_n) are limited to the range indicated by s_max, and one of those candidates is illustrated as (d_{n-1}, u_{n-1}).
- The lateral search range may be set to the regions where the left and right road boundaries are likely to exist; simply, as shown in the figure, it may be set to the left and right of the image center, respectively.
- The left and right road boundaries are estimated independently; apart from the lateral search range and the way the coordinate u is taken, there is no computational difference, so the formulas below do not distinguish between them.
- FIG. 6 shows an example of setting the detection start positions of the road boundary.
- The detection start positions of the left and right road boundary candidates are indicated by '○' (71 and 72) in the figure, and grids marked '×' are not detection start positions.
- The local score at the lowermost end of each boundary candidate is calculated for the local route passing through a grid indicated by detection start position 71 or 72.
- For local routes connected to grids marked '×', the local score is set to p∞, which denotes an impossible route; consequently, no road boundary starting from such a grid is detected.
- The range of detection start positions can be determined from specifications for the near-side and far-side limits of the road edge positions to be detected and from sensor parameters concerning the sensor viewing angle.
- FIG. 7 is described using a sensor-viewpoint image (a), the corresponding boundary feature map in UD space (b), and the corresponding overhead image (c).
- FIG. 7 shows a scene in which road boundaries 50 and 51 and the edge 52 of an oncoming vehicle VF are visible as road edges. Road edges 50, 51, and 52 are drawn as black lines in the boundary feature map (b), indicating large boundary feature amounts.
- The hatched area 53 is the background of the frontmost obstacle and therefore an area where the boundary feature amount takes negative values.
- In the overhead image (c), the hatched area 55 indicates the region of the road surface visible to the stereo camera device 11 mounted on the vehicle V.
- The local score of an arbitrary local route, when the route is not judged to be a boundary with a blind spot area, is calculated by equation (1) below.
- The first term of equation (1) is a score (positive or negative) according to the boundary features on the local route; the second term is a negative score based on the deviation from the expected value of the gradient of the local route; and the third term is a negative score according to the gap length in the cumulative route that includes the local route.
- Here, the gap length denotes the distance over which the road boundary is considered to have been interrupted in the depth direction because sufficient boundary features do not continue in the depth direction.
- When the route is judged to be a boundary with a blind spot area, the local score is instead calculated by equation (2).
- The first term of equation (2) represents a negative score according to the length of the blind spot area in the cumulative route that includes the local route.
- A route is judged to be a boundary with a blind spot area when the following two conditions are satisfied.
- UD(d, u) denotes the boundary feature amount at the corresponding position of the boundary feature map.
- c_b1 and c_b2 are thresholds indicating the ranges of boundary feature amounts for the background of an obstacle and for the road surface, respectively.
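The bodies of equations (1) and (2) appear in the patent only as images; from the term-by-term description above, their structure can be sketched as follows. The exact thresholding with t_s1, t_s2, t_g1, t_g2, t_b1, and t_b2 in the original is not recoverable here, so the penalty functions are left abstract:

```latex
% Sketch of the local score dJ, assuming abstract penalty functions
% f_s, f_g, f_b that realize the thresholded penalties of the patent.
\[
dJ_{(1)} \;=\; \sum_{(d,u)\,\in\,\text{local route}} \mathrm{UD}(d,u)
\;-\; f_s(\Delta s;\, p_s)
\;-\; f_g(\Delta d_g;\, p_g, p_\infty)
\]
\[
dJ_{(2)} \;=\; -\, f_b(\Delta d_b;\, p_b, p_\infty)
\]
```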
- The cumulative score J[d_n, u_n] is calculated from the local score dJ given by equations (1) and (2) and the cumulative score J[d_{n-1}, u_{n-1}] calculated in the previous line, based on the recurrence shown in equation (3): J[d_n, u_n] = max over u_{n-1} of ( J[d_{n-1}, u_{n-1}] + dJ ).
- The cumulative score J[d_i, u_i] calculated at each position in UD space means the score of the optimum route that passes through one of the preset detection start positions and has (d_i, u_i) as its detection end position.
- The cumulative score J[d_i, u_i] at each position is stored as an array so that the road boundary determination unit 105 in the subsequent stage can determine the optimum route having an arbitrary position as its end position. The position u_{n-1} of the previous line is likewise held in an array u_{n-1}*[d_n, u_n].
- The optimum route search unit 106 calculates the local score of each local route by equations (1) and (2) and the cumulative score of each cumulative route by equation (3), advancing the search line by line while retaining, at each grid, only the cumulative route with the maximum score.
- In equation (3), a route that passes through the same grid position but does not have the maximum cumulative score there is rejected as a boundary candidate, since by the principle of optimality it cannot be part of the optimal route; the search can therefore proceed comprehensively without the combinations exploding.
- u_{n-1}* denotes the value of u_{n-1} for which J[d_{n-1}, u_{n-1}] + dJ is maximum in equation (3).
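A minimal sketch of this dynamic-programming search in Python. The local score here keeps only the boundary-feature and gradient-deviation terms (the gap and blind-spot bookkeeping of equations (4)-(6) is omitted), and s_max, p_s, and the start-row mask are assumptions for illustration:

```python
import numpy as np

def optimum_route_search(F, start_mask, s_max=2, p_s=1.0):
    """F[d, u]: boundary feature map, rows ordered near (d=0) -> far.
    start_mask[u]: True where u may be a detection start position.
    Returns cumulative scores J and backpointers U for back tracing."""
    D, W = F.shape
    NEG = -1e9                                   # plays the role of -p_inf
    J = np.full((D, W), NEG, dtype=np.float64)
    U = np.zeros((D, W), dtype=np.int32)         # u_{n-1}* backpointer array
    J[0, start_mask] = F[0, start_mask]          # routes must begin at a start grid
    for d in range(1, D):                        # near side -> far side, line by line
        for u in range(W):
            for du in range(-s_max, s_max + 1):  # lateral connectivity limit s_max
                up = u + du
                if not (0 <= up < W) or J[d - 1, up] <= NEG / 2:
                    continue
                dJ = F[d, u] - p_s * abs(du)     # feature term minus gradient penalty
                cand = J[d - 1, up] + dJ         # recurrence of equation (3)
                if cand > J[d, u]:
                    J[d, u], U[d, u] = cand, up
    return J, U
```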
- d_g[d, u] in equation (4) is an array storing, for the optimum route ending at grid (d, u), the most recent end distance (parallax) not judged to be a gap, and g_acc[d, u] is an array storing the boundary feature amounts accumulated along the route from that position to grid (d, u).
- Equation (4) is set up so that, when the accumulated value at grid (d_n, u_n) becomes zero or less, the grid is regarded as a gap and the gap length is increased by one unit; otherwise, it can later be seen that the gap length was reset to zero.
- d_b[d, u] in equation (5) is an array storing, for the optimum route ending at grid (d, u), the end distance (parallax) not judged to be a blind spot area.
- Equation (5) is set up so that, when the local route is judged to be a boundary with a blind spot area, the length of the blind spot area is increased by one unit; otherwise, it can later be seen that the length was reset to zero.
- The first and second terms of equation (6) are the boundary feature amounts at the grid positions regarded as lying on the local route connecting grid (d_n, u_n) and grid (d_{n-1}, u_{n-1}); they represent the value obtained by accumulating the boundary feature amounts at the grid positions drawn as the hatched '○' and dotted '○' in FIG. 8.
- t_s1 and t_s2 denote predetermined thresholds.
- Δs denotes the deviation between the gradient of the local route and its expected value s_exp.
- p_s denotes a predetermined penalty corresponding to the deviation from the expected gradient.
- The expected value s_exp is the gradient estimated at grid position (u_n, d_n) when a straight road extending ahead of the vehicle is assumed.
- u_c and f denote the lateral principal point (image center) position and the focal length of the camera.
- θ denotes the yaw angle of the vehicle relative to the straight road in 3D space.
- Parts (a) and (b) of the figure show the corresponding original image and boundary feature map. The figure shows that the direction of the road edges (the guardrail and the lateral edge of the vehicle) in (b) is similar to the direction of the expected gradient in (c).
- t_g1 and t_g2 denote predetermined thresholds.
- Δd_g denotes the gap length.
- p_g denotes a parameter giving a predetermined penalty according to the gap length.
- p∞ denotes an arbitrarily large value that removes a route from the boundary candidates as an impossible route when the gap length exceeds a predetermined value.
- Conditional expression (A) is the same condition as in equation (4): it determines whether grid (d_n, u_n) is a gap.
- t_b1 and t_b2 denote predetermined thresholds.
- Δd_b denotes the length of the blind spot area.
- p_b denotes a parameter giving a predetermined penalty according to the length of the blind spot area.
- p∞ likewise denotes an arbitrarily large value that removes a route from the boundary candidates as an impossible route when the blind spot length exceeds a predetermined value.
- The road boundary determination unit 105 first detects, as the end position of the optimum route, the grid position (d_goal, u_goal) at which the cumulative score J[d, u] stored in the optimum route search unit 106 takes its maximum value J_max, and then obtains each position of the optimum route by back tracing.
- The back trace refers to u_{n-1}*[d, u] stored in the optimum route search unit 106 and traces backward from the end position (d_goal, u_goal) toward d_start, restoring the lateral position u_{n-1}* of the optimum route at each distance (parallax).
- The route detected in this way is determined to be the road boundary, and the result is output.
- When J_max is lower than a predetermined value, or when the length of the optimum route in the depth direction (end position minus start position) is shorter than a predetermined value, the reliability of the estimated road boundary is judged to be low, the road boundary is treated as not estimated, and no result is output.
- The lateral position is obtained as a lateral position in UD space, but it is converted to, and output as, a lateral position in 3D space using sensor parameters such as the focal length.
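A sketch of the back trace and of the UD-to-3D conversion, continuing the J and U arrays from the search sketch above. The conversion uses the standard stereo relations Z = f·B/d and X = (u - u_c)·B/d, which the patent implies but does not spell out; the baseline B and the reliability thresholds are assumptions:

```python
import numpy as np

def backtrace_boundary(J, U, j_min=0.0, min_len=5):
    """Find the end grid with maximum cumulative score, then follow the
    u_{n-1}* backpointers from the end row back toward the start row."""
    d_goal, u_goal = np.unravel_index(np.argmax(J), J.shape)
    if J[d_goal, u_goal] < j_min or d_goal + 1 < min_len:
        return None                       # reliability too low: output nothing
    path, u = [], int(u_goal)
    for d in range(int(d_goal), -1, -1):  # far end back toward d_start
        path.append((d, u))
        u = int(U[d, u])
    return list(reversed(path))           # near -> far order

def ud_to_lateral(u, d, u_c, focal, baseline):
    """Convert a UD-space position to a 3D lateral offset X and depth Z
    (assumes d is a true, nonzero parallax value)."""
    Z = focal * baseline / d              # depth from parallax
    X = (u - u_c) * baseline / d          # lateral position in metres
    return X, Z
```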
- The driving support control device 12 implements driving support for the vehicle based on the road boundary output by the road boundary determination unit 105. Specifically, it determines, based on the position of the road boundary and the state of the vehicle (position, traveling direction, speed, acceleration, etc.), whether there is a risk that the vehicle will depart from the road or collide with a lateral obstacle. When it determines that the distance to the road edge or a lateral obstacle is small and the risk of departure or collision is high, it activates the speaker 15 of the vehicle V to warn (alert) the driver, or calculates the accelerator, brake, and steering control amounts for supporting the travel of the vehicle V and controls the accelerator 13, brake 14, steering 16, and so on.
- As described above, based on the boundary feature map generated by the boundary feature map generation unit 102 and the geometric conditions predetermined by the boundary candidate setting unit 103, a plurality of road boundary candidates are set comprehensively, each continuous in the depth direction from a detection start position, where the boundary first comes into view within the set sensor viewing angle, to an arbitrary detection end position; the candidate that maximizes the boundary likelihood evaluation value, calculated by adding the feature amounts of the boundary feature map corresponding to each candidate's boundary shape and a continuity evaluation value in the depth direction based on the gradient, gap length, and blind spot length of the boundary shape, is determined as the road boundary.
- The road boundary can thus be estimated while tolerating discontinuity to some extent. That is, when certain conditions are satisfied, the boundary consisting of the road edge at an obstacle edge, the background-side road edge, and the blind spot area connecting them can be appropriately estimated as the road boundary.
- FIGS. 10A, 10B, and 10C show examples of road boundaries estimated by the road boundary estimation device 10.
- From (a) it can be seen that the road boundary is properly estimated not only at the high side wall at the left road edge but also at the curb position at the right road edge, which is low and has small feature amounts.
- The curb at the right road edge is in a scene where, because of a sharp curve, the area in which the obstacle surface is visible to the stereo camera is limited to the vicinity, and no corresponding boundary feature amounts exist beyond a certain distance.
- Nevertheless, the road boundary is appropriately estimated in the section from the vicinity to the end position where the obstacle surface of the curb is visible, and the boundary of the highly reliable section is output.
- In the present embodiment, the boundary feature map generation unit 102 generates the boundary features based on the distance image calculated by the stereo camera, but the boundary features may instead be based on distance data measured by a distance sensor such as a radar.
- They may also be generated based on edge information of a grayscale image acquired by a camera instead of the distance image, or, using both a grayscale image and a distance image, by weighting and adding boundary features based on image edges and boundary features based on the distance image.
- In the present embodiment, the boundary likelihood evaluation unit 104 takes as the expected gradient the value obtained by assuming that the yaw angle of the vehicle relative to the road is zero and that the road is straight, but a more precise expected gradient may be calculated based on the results of estimating the vehicle's yaw angle relative to the road and the curve shape of the road.
- The yaw angle of the vehicle relative to the road and the curve shape of the road can be estimated based on, for example, a white line detection result.
- The expected gradient may also be set based on edge feature amounts in each part of the image, based on map information and host vehicle position information, or using the position and gradient of the road boundary detected at the previous time together with host vehicle position information.
- In the present embodiment, the boundary likelihood evaluation unit 104 adds a negative evaluation value according to the deviation from the expected gradient, but a negative evaluation value according to the deviation from an expected boundary position may also be added.
- The expected boundary position is set, for example, based on a white line detection result, taking an area within a certain range outward of the road from the white line; it may also be set based on map information and host vehicle position information, or using the position and gradient of the road boundary detected at the previous time together with host vehicle position information.
- In this way, by setting an expected value for the geometric shape of the road boundary and adding a negative evaluation value according to the deviation from it, an appropriate road boundary can be estimated stably. Other geometric attributes, such as the curvature of the road, can also be considered.
- FIG. 11 shows the configuration of the second embodiment.
- Components similar to those of the road boundary estimation device 10 according to Embodiment 1 are given the same reference numerals, and their description is omitted.
- The road boundary estimation device 10A of Embodiment 2 differs from the configuration of Embodiment 1 mainly in that a plurality of boundary feature maps are generated and a plurality of road boundaries are output based on them; the other components are the same.
- The boundary feature map generation unit 102A generates feature maps by the same means as the boundary feature map generation unit 102 in Embodiment 1.
- However, a plurality of boundary feature maps are generated based on a plurality of preset combinations of the parameters of the road surface feature suppression processing unit, the feature amount normalization processing unit, and the background feature subtraction processing unit.
- The plurality of predetermined combinations are set, for example, as two parameter sets: a parameter set (A) chosen so that the boundary feature amount takes a relatively large value even for low obstacles such as curbs, and a parameter set (B) chosen so that, for low obstacles such as curbs, the boundary feature amount is small or takes a value of zero or less. Based on these settings, a boundary feature map (A) generated with parameter set (A) and a boundary feature map (B) generated with parameter set (B) are produced.
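A sketch of how the two parameter sets might drive the map generation, reusing the boundary_feature_map filter sketched earlier. The concrete values are hypothetical, chosen only so that set (A) keeps curb-height responses positive while set (B) pushes them to zero or below:

```python
# Hypothetical parameter sets for the two boundary feature maps.
PARAMS_A = dict(road_expect=2.0, bias=0.5, clip=8.0, bg_scale=0.5)  # curbs stay positive
PARAMS_B = dict(road_expect=2.0, bias=3.0, clip=8.0, bg_scale=0.5)  # curbs go <= 0

def generate_maps(ud):
    """Generate boundary feature maps (A) and (B) from one UD histogram."""
    map_a = boundary_feature_map(ud, **PARAMS_A)
    map_b = boundary_feature_map(ud, **PARAMS_B)
    return map_a, map_b
```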
- The boundary likelihood evaluation unit 104A performs the same processing as the boundary likelihood evaluation unit 104, except that it calculates two boundary likelihood evaluation values: value (A), computed with reference to boundary feature map (A), and value (B), computed with reference to boundary feature map (B). The road boundary determination unit 105A then determines and outputs road boundary (A) and road boundary (B), which take the optimum routes with respect to evaluation values (A) and (B), respectively.
- The driving support control device of this embodiment alerts the driver and controls the vehicle V as necessary, like the driving support control device 12, but it determines the driving support based on the traveling state together with road boundary (A) and road boundary (B).
- As traveling states, for example, a normal traveling state (A), in which there is no risk of collision with a forward obstacle, and a traveling state (B), in which there is a risk of collision with a forward obstacle and collision avoidance is required, are defined; which state applies is determined according to, for example, the distance to the forward obstacle detected by the stereo camera device 11.
- In the normal traveling state (A), driving support is provided based on road boundary (A) so that the vehicle does not depart from the boundary that can be regarded as the road edge, including small obstacles such as curbs; in the traveling state (B), the vehicle V is controlled based on road boundary (B), in which small obstacles such as curbs are treated as traversable, so that an emergency evacuation can be made.
- As described above, in Embodiment 2, a plurality of road boundaries are estimated from the viewpoint of a plurality of evaluation scales based on a plurality of boundary feature definitions, and the boundary position information suited to the traveling state is referenced, so appropriate driving support according to the situation can be performed.
- The present embodiment is characterized by generating a plurality of boundary feature maps, but the same effect may be obtained by dynamically setting the parameters and generating a single boundary feature map.
- In that case, the parameters of the boundary feature map generation unit are dynamically set to values suited to the road boundary determined for the current traveling state.
- FIG. 12 shows the configuration of the third embodiment.
- The road boundary estimation device 10B of Embodiment 3 differs from Embodiment 1 mainly in that the boundary feature map generation unit 102B generates a plurality of boundary feature maps, and in that an obstacle boundary estimation unit 107B and a travelable area setting unit 108B are added; the other components are the same.
- The boundary feature map generation unit 102B generates two boundary feature maps based on two types of parameter settings, like the boundary feature map generation unit 102A in Embodiment 2.
- The two types of parameter settings are a parameter set (A) chosen so that the boundary feature amount takes a relatively large value even for low obstacles such as curbs, and a parameter set (B) chosen so that the boundary feature amount is small or takes a value of zero or less at boundaries, such as those of curb height, that are not considered untraversable obstacles on the road.
- The boundary feature map generated with parameter set (A) is input to the boundary candidate setting unit 103; the subsequent processing is the same as in Embodiment 1, and the road boundary determination unit 105 outputs the road boundary. The boundary feature map generated with parameter set (B) is input to the obstacle boundary estimation unit 107B.
- The obstacle boundary estimation unit 107B estimates a boundary line with obstacles (the obstacle boundary) that is continuous in the sensor's lateral angular direction, with the left end of the sensor viewing angle as the detection start position and the right end as the detection end position. As a method for estimating such an obstacle boundary, for example, the method disclosed in Prior Art Document 2 can be used.
- The travelable area setting unit 108B sets a travelable area based on the road boundary estimated by the road boundary determination unit 105 and the obstacle boundary estimated by the obstacle boundary estimation unit 107B.
- As a setting method, for example, for each line-of-sight direction of the sensor, the nearer of the estimated road boundary and obstacle boundary positions is taken as the end of the travelable area, and the region within the sensor viewing angle and in front of this end is set as the travelable area.
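A sketch of this per-direction fusion, assuming both boundaries are given as a near-side distance for each of N lateral viewing directions. The representation is an assumption; the patent describes the rule, not the data layout:

```python
import numpy as np

def travelable_area_limits(road_boundary, obstacle_boundary):
    """For each lateral viewing direction of the sensor, take the nearer of
    the road boundary and the obstacle boundary as the end of the travelable
    area; everything in front of that end is travelable.

    Both inputs: 1-D arrays of boundary distances per direction, with np.inf
    where that estimator produced no boundary."""
    return np.minimum(road_boundary, obstacle_boundary)

# Usage: directions where only one estimator fired still get a finite limit.
road = np.array([5.0, np.inf, 7.5, np.inf])
obst = np.array([6.0, 9.0, np.inf, np.inf])
print(travelable_area_limits(road, obst))   # [5.  9.  7.5 inf]
```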
- Here, the obstacle boundary yields a boundary line continuous in the sensor's lateral angular direction, but for small obstacles such as curbs the boundary position is not estimated properly and tends to be placed at the position of a large obstacle behind them.
- The road boundary, by contrast, estimates the boundary position appropriately even for small obstacles such as curbs, but does not estimate a boundary at the positions of obstacles in the frontal direction on the road.
- A conceptual diagram of the travelable area set in this way is shown in FIG. 13.
- (a) shows a conceptual diagram of the estimated road boundary, (b) of the estimated obstacle boundary, and (c) of the set travelable area and its ends.
- The boundaries are indicated by white lines, and the travelable area by a hatched area.
- The driving support control device of this embodiment alerts the driver and controls the vehicle V as necessary, like the driving support control device 12, but based on the travelable area set by the travelable area setting unit 108B.
- For example, the vehicle V is controlled by planning a route with low risk of road departure and obstacle collision, taking into account the travelable area and ride comfort.
- Ride comfort is evaluated based on the speed and acceleration of the vehicle V, and a large number of route candidates are set within a range that does not depart from the ends of the travelable area.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Traffic Control Systems (AREA)
- Image Analysis (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Automation & Control Theory (AREA)
Abstract
Description
However, there are often cases where the white line is not clearly visible because of road or weather conditions, or where no white line has been painted in the first place, and in such cases the above functions do not operate. If the road boundary, i.e., the boundary with the edge of the road and with obstacles, can be estimated appropriately, a function that prevents road departure can be provided even in such cases. It is also important as an elemental technology for autonomous driving, for which technical development toward practical use has recently been advancing. Based on information about the travelable area, it becomes easy to realize a function that autonomously plans a safe route and drives automatically without driver intervention.
FIG. 2 schematically shows the system configuration of a driving support system to which Embodiment 1 of the road boundary estimation device according to the present invention is applied.
The driving support control device 12, based on the road boundary estimation result received from the road boundary estimation device 10, activates the speaker 15 of the vehicle V, or calculates the accelerator control amount, brake control amount, and steering control amount for supporting the travel of the vehicle V and adjusts the accelerator 13, brake 14, steering 16, and so on.
FIG. 1 specifically shows the internal configuration of the road boundary estimation device shown in FIG. 2. Below, the means by which the road boundary estimation device 10 estimates the road boundary ahead of the vehicle V is described concretely.
The optimum route search unit 106 is composed of the boundary candidate setting unit 103 and the boundary likelihood evaluation unit 104.
As the method for calculating this correlation value, for example, SAD (Sum of Absolute Differences), SSD (Sum of Squared Differences), NCC (Normalized Cross-Correlation), or a gradient method can be used. When the search identifies the position with the highest correlation, the distance between a specific pixel in the first block and the corresponding pixel in the second block at the identified position is calculated as the parallax; by performing this as one step and repeating the same step for all pixels, the distance image can be generated.
The UD histogram generation process 21 generates a UD histogram that stores obstacle feature amounts at each distance. The UD histogram is an obstacle feature commonly used for detecting obstacles with a stereo camera; by processing this feature further, it is used as a boundary feature.
The road surface feature suppression process 22 subtracts the obstacle feature amount corresponding to the road surface position so that the boundary feature amount at the road surface position generally takes a negative value. The feature amount normalization process 23 processes the feature amounts output by the road surface feature suppression process 22 so that they fall at or below a predetermined value, normalizing them so that an obstacle above a certain height has a feature amount equivalent to that of an obstacle of that height.
The background feature subtraction process 24 further subtracts the feature amount at positions forming the background of an obstacle, based on the feature amounts processed by the feature amount normalization process 23, so that the feature amount there is suppressed. As a result, at grid positions that are the background of a near-side obstacle and have smaller feature amounts than that obstacle, the feature amount is processed so as to generally take a negative value.
The reason the road surface feature suppression process 22 and the background feature subtraction process 24 convert the feature amounts so that the obstacle background and road surface positions readily take negative values is as follows: in the subsequent boundary likelihood evaluation unit 104, the boundary likelihood evaluation value is calculated from these feature amounts, and the road boundary determination unit 105 determines the boundary shape at the position where this evaluation value is highest, so negative feature amounts are preferable at the obstacle background and road surface positions, which would constitute misestimations if estimated as the road boundary. The road surface feature suppression process, feature amount normalization process, and background feature subtraction process may be performed in various conceivable orders, including orders different from the above. Each process is described in detail below.
Converting the feature amounts in this way has the effect that, when road edges with equivalent feature amounts exist on both the near and far sides, the boundary is more readily estimated at the position of the near-side road edge.
(a) of the figure shows the original image, and (b) the UD histogram (obstacle feature amounts) obtained by the UD histogram generation process 21. Comparing (b) and (c) shows that, in the boundary feature map, the road surface features and the background features are suppressed. (d) shows a map visualizing the magnitude of the negative values of the boundary feature map. Black indicates zero or a positive value, and the closer to white, the larger the negative value. The figure shows that, in the boundary feature map, the far-side feature amounts take negative values according to the feature amounts of near-side obstacles.
In FIG. 5, the grids of the adjacent line that can be connected to an arbitrary grid (d_n, u_n) are limited to the range indicated by s_max, and one of those candidates is illustrated as (d_{n-1}, u_{n-1}). FIGS. 5(a) and 5(b) also show that the optimum route searches for the left and right road boundaries differ in the lateral search range and in how the coordinate u is taken.
The lateral search range may be set to the regions where the left and right road boundaries are likely to exist; simply, as shown in the figure, it may be set to the left and to the right of the image center position, respectively.
The right and left road boundaries are estimated independently, and since there is no computational difference other than the lateral search range and the way the coordinate u is taken as described above, the calculation formulas shown below do not distinguish between the left and right road boundaries.
The local score at the lowermost end of each boundary candidate is calculated for the local route passing through a grid indicated by detection start position 71 or 72. For local routes connected to grids marked '×', the local score is set to p∞, which denotes an impossible route. By also setting every local route connected to a local route judged impossible as an impossible route, no road boundary starting from a grid marked '×' is detected. The range of detection start positions can be determined from specifications for the near-side and far-side limits of the road edge positions to be detected and from sensor parameters concerning the sensor viewing angle.
FIG. 7 is described using conceptual diagrams of the sensor-viewpoint image (a), the corresponding boundary feature map in UD space (b), and the corresponding overhead image (c). FIG. 7 shows a scene in which road boundaries 50 and 51 and the edge 52 of an oncoming vehicle VF are visible as road edges. Road edges 50, 51, and 52 are drawn as black lines in the boundary feature map (b), indicating large boundary feature amounts. The hatched area 53 is the background of the frontmost obstacle and therefore indicates an area where the boundary feature amount takes negative values. In the overhead image (c), hatched area 55 indicates the region of the road surface visible to the stereo camera device 11 mounted on the vehicle V. The dotted boundary 54 indicates the part of the road boundary where the gradient is s = 0 in the boundary feature map (b); in the corresponding overhead image (c) it is the boundary of a blind spot area. The boundary 54 with the blind spot area is not a boundary with a road edge, but it connects the foreground-side road edge 52 and the background-side road edge 51, which are made discontinuous by the presence of the obstacle, and it is considered preferable that the boundaries indicated by 51, 52, and 54 be estimated as the right road boundary. For this reason, when the gradient is s = 0 and certain conditions are satisfied, the route is judged to pass through a boundary with a blind spot area, and the local score is calculated with a formula different from the ordinary case.
This represents a negative score according to the gap length in the cumulative route that includes the local route. Here, the gap length denotes the distance over which the road boundary is considered interrupted in the depth direction because sufficient boundary features do not continue in the depth direction.
2) The case of having boundary features representing the background of an obstacle or the road surface; that is,
1) Updating the information related to the gap length:
Based on the state of the vehicle (position, traveling direction, speed, acceleration, etc.), it is determined whether there is a risk that the host vehicle will depart from the road or collide with a lateral obstacle. When it is determined that the distance to the road edge or to a lateral obstacle is small and the risk of road departure or collision is high, the speaker 15 of the vehicle V is activated to warn (alert) the driver, or the accelerator control amount, brake control amount, and steering control amount for supporting the travel of the vehicle V are calculated to control the accelerator 13, brake 14, steering 16, and so on.
Furthermore, no restriction is placed on the detection end position, and the boundary likelihood evaluation value is designed to decrease according to the gap length from the end position of the boundary shape, so an optimal boundary shape with an appropriate detection end position can be estimated. This suppresses misestimation of the road boundary in regions where the road edge is interrupted over a certain length or where sufficient road edge features are not obtained, while allowing the road boundary of the near-side section that can be regarded as sufficiently reliable to be estimated and output.
When a local route is judged to be a boundary with a blind spot area, a negative evaluation value according to the length of the blind spot area is added, while the negative evaluation value according to the deviation from the expected gradient is designed not to be added. Therefore, even when the boundary between the road edge at the edge of a foreground obstacle and the background-side road edge becomes discontinuous because of an obstacle that creates a blind spot, the road boundary can be estimated while tolerating that discontinuity to a certain extent. That is, when certain conditions are satisfied, the boundary consisting of the road edge at the obstacle's edge, the background-side road edge, and the blind spot area connecting them can be appropriately estimated as the road boundary.
In the estimation result of (a), the road boundary is appropriately estimated in the section from the vicinity to the end position where the obstacle surface of the curb is visible, and the boundary of the highly reliable section is output.
In (b), where a vehicle occludes the road edge and the near-side road edge is not visible, the edge of the vehicle and the road edge beyond it are appropriately estimated as the road boundary.
In (c), where a vehicle occludes the road edge but the road edge is sufficiently visible over a certain section from the detection start position, the road edge is estimated as the road boundary, and the position where it becomes discontinuous owing to occlusion by the obstacle is appropriately estimated as the detection end position.
Furthermore, the expected value may be set based on map information and host vehicle position information, or using the position and gradient of the road boundary detected at the previous time together with host vehicle position information. In this way, when a certain degree of validity can be ensured for the preconditions as prior knowledge about the road shape, or when a certain accuracy of prior measurement by other means can be ensured, an appropriate road boundary can be estimated stably by setting an expected value for the geometric shape of the road boundary and adding a negative evaluation value according to the deviation from that expected value. Other geometric attributes, such as the curvature of the road, are also conceivable.
FIG. 11 shows the configuration of the second embodiment. Components similar to those of the road boundary estimation device 10 according to Embodiment 1 are given the same reference numerals and their description is omitted. The road boundary estimation device 10A of Embodiment 2 differs from the configuration of Embodiment 1 mainly in that a plurality of boundary feature maps are generated and a plurality of road boundaries are output based on these boundary feature maps; the other components are the same.
FIG. 12 shows the configuration of the third embodiment. Components similar to those of the road boundary estimation device 10 according to Embodiment 1 are given the same reference numerals and their description is omitted. The road boundary estimation device 10B of Embodiment 3 differs from Embodiment 1 mainly in that the boundary feature map generation unit 102B generates a plurality of boundary feature maps and in that an obstacle boundary estimation unit 107B and a travelable area setting unit 108B are added; the other components are the same.
Here, the obstacle boundary provides a boundary line that is continuous without interruption in the sensor's lateral angular direction, but for small obstacles such as curbs the boundary position is not estimated properly and tends to be estimated at the position of a large obstacle behind them; the road boundary, on the other hand, estimates the boundary position appropriately even for small obstacles such as curbs, but does not estimate a boundary at the positions of obstacles in the frontal direction on the road. By integrating boundary information with these different properties, a boundary line without interruption in the sensor's lateral angular direction is obtained, and the lateral boundary is obtained at an appropriate position even at low road edges such as curbs, so a travelable area can be set in a more convenient form and more accurately than when only one of the two is used as boundary information.
FIG. 13 shows a conceptual diagram of the travelable area set in this way. (a) shows a conceptual diagram of the estimated road boundary, (b) of the estimated obstacle boundary, and (c) of the set travelable area and its ends. The boundaries are drawn as white lines, and the travelable area as a hatched area.
Claims (17)
- A road boundary estimation device comprising: a boundary feature map generation unit that, based on an image acquired from an external sensor, generates a boundary feature map storing, for each grid of the image or of an image obtained by converting the image to another viewpoint, a feature amount indicating the presence of a road boundary; a boundary candidate setting unit that sets, as boundary candidates, a plurality of routes continuous in the depth direction from a detection start position, where the boundary first comes into view within a predetermined sensor viewing angle, to an arbitrary detection end position; a boundary likelihood evaluation unit that calculates, for each boundary candidate, a boundary likelihood evaluation value obtained by adding the corresponding feature amounts of the boundary feature map and a continuity evaluation value in the depth direction; and a road boundary determination unit that compares the boundary likelihood evaluation values of the boundary candidates calculated by the boundary likelihood evaluation unit and determines the road boundary.
- The road boundary estimation device according to claim 1, wherein the boundary candidate setting unit uses a geometric constraint condition in which the depth-direction gradient at each part of the road boundary shape lies within a fixed range.
- The road boundary estimation device according to claim 1, wherein, in the boundary likelihood evaluation unit, the continuity evaluation value in the depth direction is a negative evaluation value according to the deviation from an expected value of the depth-direction gradient of the road boundary at each part of the boundary shape, and the expected gradient is a gradient based on edge feature amounts at each part of the boundary shape, a gradient estimated by assuming a straight road relative to the host vehicle, the gradient of a white line detected by an external sensor, a gradient estimated using map information and host vehicle position information, or a gradient estimated using the position and gradient of the road boundary detected at the previous time together with host vehicle behavior information.
- The road boundary estimation device according to claim 1, wherein, in the boundary likelihood evaluation unit, the boundary likelihood evaluation value further has added to it, in addition to the feature amounts of the boundary feature map and the continuity evaluation value in the depth direction, a negative evaluation value according to the deviation from an expected value of the road boundary position, and the expected road boundary position is a position estimated based on a white line position detected by an external sensor, a position estimated using map information and host vehicle position information, or a position estimated using the position and gradient of the road boundary detected at the previous time together with host vehicle behavior information.
- The road boundary estimation device according to claim 3, wherein, in the boundary likelihood evaluation unit, the continuity evaluation value in the depth direction further has added to it a negative value according to a gap length indicating the length over which the road boundary is considered interrupted based on the boundary feature map.
- The road boundary estimation device according to claim 3, wherein, in the boundary likelihood evaluation unit, when a route is judged to be a boundary with a blind spot area based on the depth-direction extension direction in the boundary candidate setting unit and on the boundary feature map, a predetermined negative value according to the length of the blind spot area is added to the continuity evaluation value in the depth direction, and the negative evaluation value according to the deviation from the expected gradient at each part of the boundary shape is not added.
- The road boundary estimation device according to claim 6, wherein the boundary determination unit outputs, in addition to the road boundary, the sections judged to be boundaries with blind spot areas present within the road boundary.
- The road boundary estimation device according to claim 1, wherein the boundary candidate setting unit expresses the route of a road boundary as a cumulative route combining local routes connecting grids between adjacent distances of the boundary feature map, and comprehensively sets the combinations that the cumulative route can take within the constraint range of predetermined geometric attributes of the road boundary.
- The road boundary estimation device according to claim 8, wherein the boundary candidate setting unit and the boundary likelihood evaluation unit set the cumulative routes sequentially from the near side of the boundary feature map toward the far side in the distance direction while calculating the boundary likelihood evaluation values of the cumulative routes, and, when a plurality of routes exist that reach the same position at an arbitrary distance via different intermediate routes, reject from the boundary candidates all routes other than the one with the highest cumulative boundary likelihood evaluation value.
- The road boundary estimation device according to claim 1, wherein the boundary feature map stores, as boundary features, values corresponding to the probability that a road boundary exists at the spatial position corresponding to each grid, based on a distance image measured by a distance sensor or on a grayscale image captured by a camera.
- The road boundary estimation device according to claim 1, wherein the boundary feature map generation unit comprises: an occupancy grid map generation processing unit that, based on a distance image measured by a distance sensor, generates an occupancy grid map storing feature amounts indicating the presence of an obstacle at each grid position; a road surface feature suppression effect setting unit that sets feature amount difference values for suppressing the feature amounts corresponding to road surface positions in the occupancy grid map; a feature amount normalization effect setting unit that sets feature amount difference values for normalizing the feature amounts at each grid position of the occupancy grid map so that they fall at or below a predetermined value; a background feature suppression effect setting unit that sets feature amount difference values for suppressing the feature amounts of grids regarded as lying behind the obstacle nearest to the sensor in the occupancy grid map; and a filter processing unit that adds the feature amount difference values set by the road surface feature suppression effect setting unit, the feature amount normalization effect setting unit, and the background feature suppression effect setting unit to the feature amounts of the occupancy grid map.
- The road boundary estimation device according to claim 1, wherein the images acquired from the external sensor are both a grayscale image and a distance image, and the boundary feature map used is a weighted sum of a boundary feature map based on the grayscale information and a boundary feature map based on the distance information, or one of these boundary feature maps is used selectively.
- A driving support system comprising the road boundary estimation device according to claim 1 and a driving support control device that supports the travel of a vehicle based on the road boundary output by the road boundary estimation device.
- The road boundary estimation device according to claim 11, wherein the boundary feature map generation unit dynamically changes the parameters of the road surface feature suppression processing unit, the feature amount normalization processing unit, or the background feature suppression processing according to the traveling state of the vehicle.
- The road boundary estimation device according to claim 11, wherein the boundary feature map generation unit generates a plurality of boundary feature maps based on settings of a plurality of combinations of the parameters of the road surface feature suppression processing unit, the feature amount normalization processing unit, or the background feature subtraction processing unit; the boundary likelihood evaluation unit calculates boundary likelihood evaluation values on a plurality of scales based on the plurality of boundary feature maps; and the road boundary determination unit determines and outputs, based on the boundary likelihood evaluation values of the respective scales, a plurality of road boundaries each of which is optimal with respect to the boundary likelihood evaluation value of the corresponding scale.
- The driving support system according to claim 15, which supports the travel of the vehicle based on the plurality of road boundaries output by the road boundary estimation device and on the traveling state of the vehicle.
- The road boundary estimation device according to claim 1, further comprising: an obstacle boundary estimation unit that, using information on obstacle shapes prepared in advance, estimates an obstacle boundary, which is a boundary line with obstacles continuous without interruption in the sensor's lateral angular direction from a detection start position at the left end of the sensor viewing angle to a detection end position at the right end of the sensor viewing angle; and a travelable area setting unit that sets a travelable area based on the road boundary and the obstacle boundary.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP17830913.4A EP3489928B1 (en) | 2016-07-22 | 2017-07-12 | Traveling road boundary estimation apparatus and traveling assistance system using same |
| US16/301,782 US10832061B2 (en) | 2016-07-22 | 2017-07-12 | Traveling road boundary estimation apparatus and traveling assistance system using same |
| JP2018528508A JP6606610B2 (ja) | 2016-07-22 | 2017-07-12 | 走路境界推定装置及びそれを用いた走行支援システム |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2016143893 | 2016-07-22 | ||
| JP2016-143893 | 2016-07-22 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018016394A1 true WO2018016394A1 (ja) | 2018-01-25 |
Family
ID=60993079
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2017/025356 Ceased WO2018016394A1 (ja) | 2016-07-22 | 2017-07-12 | 走路境界推定装置及びそれを用いた走行支援システム |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US10832061B2 (ja) |
| EP (1) | EP3489928B1 (ja) |
| JP (1) | JP6606610B2 (ja) |
| WO (1) | WO2018016394A1 (ja) |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112257652A (zh) * | 2020-11-06 | 2021-01-22 | 北京航迹科技有限公司 | 确定可行驶区域的方法、装置、设备和存储介质 |
| CN112393737A (zh) * | 2019-08-16 | 2021-02-23 | 苏州科瓴精密机械科技有限公司 | 障碍地图的创建方法、系统,机器人及可读存储介质 |
| KR20220084790A (ko) * | 2020-12-14 | 2022-06-21 | 건국대학교 산학협력단 | 3차원 공간정보 생성을 위한 병렬처리 장치 및 방법 |
| US11790668B2 (en) * | 2019-01-31 | 2023-10-17 | Uatc, Llc | Automated road edge boundary detection |
| WO2024042607A1 (ja) * | 2022-08-23 | 2024-02-29 | 日立Astemo株式会社 | 外界認識装置及び外界認識方法 |
| JP2025508060A (ja) * | 2022-03-24 | 2025-03-21 | センスタイム グループ リミテッド | 道路障害物の検出方法、装置、機器及び記憶媒体 |
| US12351167B2 (en) | 2020-12-02 | 2025-07-08 | Panasonic Automotive Systems Co., Ltd. | Vehicle and control device |
Families Citing this family (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3529742A1 (en) * | 2016-10-24 | 2019-08-28 | Starship Technologies OÜ | Sidewalk edge finder system and method |
| JP6548312B2 (ja) * | 2017-09-19 | 2019-07-24 | 株式会社Subaru | 画像処理装置 |
| JP6815963B2 (ja) * | 2017-09-29 | 2021-01-20 | クラリオン株式会社 | 車両用外界認識装置 |
| US10996679B2 (en) * | 2018-04-17 | 2021-05-04 | Baidu Usa Llc | Method to evaluate trajectory candidates for autonomous driving vehicles (ADVs) |
| CN110162040B (zh) * | 2019-05-10 | 2022-06-17 | 重庆大学 | 一种基于深度学习的低速自动驾驶小车控制方法及系统 |
| US11132772B2 (en) | 2019-06-11 | 2021-09-28 | Samsung Electronics Co., Ltd. | Asymmetric normalized correlation layer for deep neural network feature matching |
| WO2021096152A1 (en) * | 2019-11-15 | 2021-05-20 | Samsung Electronics Co., Ltd. | Asymmetric normalized correlation layer for deep neural network feature matching |
| FR3106659B1 (fr) * | 2020-01-23 | 2023-07-28 | Second Bridge Inc | Carte numérique pour la navigation |
| CN111256712B (zh) * | 2020-02-24 | 2021-10-29 | 深圳市优必选科技股份有限公司 | 地图优化方法、装置及机器人 |
| WO2021171049A1 (ja) * | 2020-02-24 | 2021-09-02 | 日産自動車株式会社 | 車両制御方法及び車両制御装置 |
| JP6837626B1 (ja) * | 2020-08-03 | 2021-03-03 | 株式会社空間技術総合研究所 | 地物データの生成システム、地物データベース更新システム及び地物データの生成方法 |
| CN112415995B (zh) * | 2020-09-22 | 2023-08-01 | 北京智行者科技股份有限公司 | 基于实时安全边界的规划控制方法 |
| JP2022059958A (ja) * | 2020-10-02 | 2022-04-14 | フォルシアクラリオン・エレクトロニクス株式会社 | ナビゲーション装置 |
| CN114858119B (zh) * | 2021-02-04 | 2024-04-02 | 长沙智能驾驶研究院有限公司 | 边距测量方法、装置、设备及计算机存储介质 |
| JP2022123239A (ja) * | 2021-02-12 | 2022-08-24 | 本田技研工業株式会社 | 区画線認識装置 |
| DE102021107904A1 (de) * | 2021-03-29 | 2022-09-29 | Conti Temic Microelectronic Gmbh | Verfahren und System zur Bestimmung der Bodenebene mit einem künstlichen neuronalen Netz |
| CN113479191B (zh) * | 2021-06-30 | 2023-04-07 | 重庆长安汽车股份有限公司 | 用于泊车的无车道线的车道边界检测系统、方法及车辆 |
| CN114077249B (zh) * | 2021-10-22 | 2024-03-15 | 陕西欧卡电子智能科技有限公司 | 一种作业方法、作业设备、装置、存储介质 |
| CN115235490A (zh) * | 2022-07-08 | 2022-10-25 | 东风柳州汽车有限公司 | 行驶路线生成方法、装置、设备及存储介质 |
| CN116543000B (zh) * | 2023-04-29 | 2025-11-25 | 武汉中海庭数据技术有限公司 | 一种遥感影像道路边缘检测方法及系统 |
| CN116901085B (zh) * | 2023-09-01 | 2023-12-22 | 苏州立构机器人有限公司 | 智能机器人避障方法、装置、智能机器人及可读存储介质 |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2009298356A (ja) * | 2008-06-17 | 2009-12-24 | Nissan Motor Co Ltd | 車両用障害物回避支援装置及び車両用障害物回避支援方法 |
| JP2013003800A (ja) | 2011-06-15 | 2013-01-07 | Toyota Central R&D Labs Inc | 走行可能領域検出装置及びプログラム |
| JP2014175007A (ja) * | 2013-03-11 | 2014-09-22 | Ricoh Co Ltd | 道路エッジ検出方法及び道路エッジ検出装置 |
| JP2015011619A (ja) * | 2013-07-01 | 2015-01-19 | 株式会社リコー | 情報検出装置、移動体機器制御システム、移動体及び情報検出用プログラム |
| JP2016009333A (ja) | 2014-06-24 | 2016-01-18 | トヨタ自動車株式会社 | 走路境界推定装置及び走路境界推定方法 |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5074365B2 (ja) * | 2008-11-28 | 2012-11-14 | 日立オートモティブシステムズ株式会社 | カメラ装置 |
| JP5468426B2 (ja) * | 2010-03-12 | 2014-04-09 | 日立オートモティブシステムズ株式会社 | ステレオカメラ装置 |
-
2017
- 2017-07-12 WO PCT/JP2017/025356 patent/WO2018016394A1/ja not_active Ceased
- 2017-07-12 JP JP2018528508A patent/JP6606610B2/ja active Active
- 2017-07-12 EP EP17830913.4A patent/EP3489928B1/en active Active
- 2017-07-12 US US16/301,782 patent/US10832061B2/en active Active
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2009298356A (ja) * | 2008-06-17 | 2009-12-24 | Nissan Motor Co Ltd | 車両用障害物回避支援装置及び車両用障害物回避支援方法 |
| JP2013003800A (ja) | 2011-06-15 | 2013-01-07 | Toyota Central R&D Labs Inc | 走行可能領域検出装置及びプログラム |
| JP2014175007A (ja) * | 2013-03-11 | 2014-09-22 | Ricoh Co Ltd | 道路エッジ検出方法及び道路エッジ検出装置 |
| JP2015011619A (ja) * | 2013-07-01 | 2015-01-19 | 株式会社リコー | 情報検出装置、移動体機器制御システム、移動体及び情報検出用プログラム |
| JP2016009333A (ja) | 2014-06-24 | 2016-01-18 | トヨタ自動車株式会社 | 走路境界推定装置及び走路境界推定方法 |
Non-Patent Citations (3)
| Title |
|---|
| F. ONIGA ET AL.: "Curb Detection Based on Elevation Maps from Dense Stereo", IEEE INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTER COMMUNICATION AND PROCESSING, 2007 |
| H. BADINO ET AL.: "Free Space Computation Using Stochastic Occupancy Grids and Dynamic Programming", WORKSHOP ON DYNAMIC VISION, ICCV, 2007 |
| See also references of EP3489928A4 |
Cited By (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11790668B2 (en) * | 2019-01-31 | 2023-10-17 | Uatc, Llc | Automated road edge boundary detection |
| CN112393737A (zh) * | 2019-08-16 | 2021-02-23 | 苏州科瓴精密机械科技有限公司 | 障碍地图的创建方法、系统,机器人及可读存储介质 |
| CN112393737B (zh) * | 2019-08-16 | 2024-03-08 | 苏州科瓴精密机械科技有限公司 | 障碍地图的创建方法、系统,机器人及可读存储介质 |
| CN112257652A (zh) * | 2020-11-06 | 2021-01-22 | 北京航迹科技有限公司 | 确定可行驶区域的方法、装置、设备和存储介质 |
| US12351167B2 (en) | 2020-12-02 | 2025-07-08 | Panasonic Automotive Systems Co., Ltd. | Vehicle and control device |
| KR20220084790A (ko) * | 2020-12-14 | 2022-06-21 | 건국대학교 산학협력단 | 3차원 공간정보 생성을 위한 병렬처리 장치 및 방법 |
| KR102520375B1 (ko) | 2020-12-14 | 2023-04-11 | 건국대학교 산학협력단 | 3차원 공간정보 생성을 위한 병렬처리 장치 및 방법 |
| JP2025508060A (ja) * | 2022-03-24 | 2025-03-21 | センスタイム グループ リミテッド | 道路障害物の検出方法、装置、機器及び記憶媒体 |
| JP7785192B2 (ja) | 2022-03-24 | 2025-12-12 | 本田技研工業株式会社 | 道路障害物の検出方法、道路障害物の検出装置、コンピュータ記憶媒体、コンピュータ機器及びコンピュータプログラム |
| WO2024042607A1 (ja) * | 2022-08-23 | 2024-02-29 | 日立Astemo株式会社 | 外界認識装置及び外界認識方法 |
| JPWO2024042607A1 (ja) * | 2022-08-23 | 2024-02-29 | ||
| JP7788005B2 (ja) | 2022-08-23 | 2025-12-17 | Astemo株式会社 | 外界認識装置及び外界認識方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| US20190156129A1 (en) | 2019-05-23 |
| JPWO2018016394A1 (ja) | 2019-03-07 |
| US10832061B2 (en) | 2020-11-10 |
| EP3489928B1 (en) | 2025-04-30 |
| EP3489928A1 (en) | 2019-05-29 |
| JP6606610B2 (ja) | 2019-11-13 |
| EP3489928A4 (en) | 2020-03-25 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6606610B2 (ja) | 走路境界推定装置及びそれを用いた走行支援システム | |
| US11200433B2 (en) | Detection and classification systems and methods for autonomous vehicle navigation | |
| US10580155B2 (en) | Image processing apparatus, imaging device, device control system, frequency distribution image generation method, and recording medium | |
| US9330320B2 (en) | Object detection apparatus, object detection method, object detection program and device control system for moveable apparatus | |
| JP5820774B2 (ja) | 路面境界推定装置及びプログラム | |
| US20150367781A1 (en) | Lane boundary estimation device and lane boundary estimation method | |
| CN105849585B (zh) | 物体识别装置 | |
| JP4940177B2 (ja) | 交通流計測装置 | |
| Liu et al. | Development of a vision-based driver assistance system with lane departure warning and forward collision warning functions | |
| JP4296287B2 (ja) | 車両認識装置 | |
| JP2018092602A (ja) | 情報処理装置、撮像装置、機器制御システム、移動体、情報処理方法、及び、情報処理プログラム | |
| CN109241855B (zh) | 基于立体视觉的智能车辆可行驶区域探测方法 | |
| JP5888275B2 (ja) | 道路端検出システム、方法およびプログラム | |
| JP2017117105A (ja) | 見通し判定装置 | |
| US20200193184A1 (en) | Image processing device and image processing method | |
| JP6038422B1 (ja) | 車両判定装置、車両判定方法及び車両判定プログラム | |
| JP7344744B2 (ja) | 路側端検出方法、及び、路側端検出装置 | |
| JP2021047099A (ja) | 車両の自己位置推定装置、および、自己位置推定方法 | |
| EP3287948B1 (en) | Image processing apparatus, moving body apparatus control system, image processing method, and program | |
| JP5746996B2 (ja) | 道路環境認識装置 | |
| JP4956099B2 (ja) | 壁検出装置 | |
| KR102119678B1 (ko) | 차선 검출 방법 및 그 방법을 수행하는 전자 장치 | |
| JP2022065044A (ja) | 情報処理装置 | |
| CN113994663A (zh) | 立体图像处理装置及立体图像处理方法 | |
| EP4558973A1 (en) | A computer-implemented method of generating training data for training a machine learning model |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| ENP | Entry into the national phase |
Ref document number: 2018528508 Country of ref document: JP Kind code of ref document: A |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17830913 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2017830913 Country of ref document: EP Effective date: 20190222 |
|
| WWG | Wipo information: grant in national office |
Ref document number: 2017830913 Country of ref document: EP |