
WO2018123641A1 - Travelable area detection device and travel assistance system - Google Patents


Info

Publication number
WO2018123641A1
Authority
WO
WIPO (PCT)
Prior art keywords
travelable
road
region
area
travel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/045032
Other languages
French (fr)
Japanese (ja)
Inventor
優貴 造田
寛人 三苫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Astemo Ltd
Original Assignee
Hitachi Automotive Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Automotive Systems Ltd filed Critical Hitachi Automotive Systems Ltd
Priority to JP2018559043A priority Critical patent/JP6837262B2/en
Priority to CN201780067417.1A priority patent/CN110088801B/en
Publication of WO2018123641A1 publication Critical patent/WO2018123641A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • the present invention relates to a travelable area detection device and a travel support system.
  • Patent Document 1 discloses an in-vehicle system in which a road end break is detected by an in-vehicle camera that captures the periphery of a vehicle, and the target section is set as a travelable area based on edge information of the break section.
  • a travelable region detection device capable of detecting a road edge break in front of a vehicle (a moving body) and determining whether there is a travelable region into which the vehicle can travel from the break, and a driving support system using such a device, have therefore been desired.
  • the present invention has been made in view of the above-described problems of the prior art.
  • the object of the present invention is to realize a travelable area detection device capable of detecting a road edge break in front of a moving body and determining whether there is a travelable area into which the moving body can travel from the detected break, and a travel support system using the travelable area detection device.
  • the present invention is configured as follows.
  • a road edge detection unit that detects a road edge existing on the road on which the mobile body travels and outputs road edge information
  • a search range determination unit that determines a search range based on road edge information detected by the road edge detection unit
  • a road surface gradient estimation unit that estimates the road surface gradient of the road-side travelable determination region as viewed from the mobile body
  • a travelable region determination unit that determines whether or not the travelable determination region is a region in which the moving body can travel from at least the road surface gradient estimated by the road surface gradient estimation unit.
  • an imaging device that images the area in front of the moving body; a road edge detection unit that, based on the image captured by the imaging device, detects a road edge existing on the road on which the moving body travels and outputs road edge information; a search range determination unit that determines a search range based on the road edge information detected by the road edge detection unit; a road surface gradient estimation unit that, based on the search range determined by the search range determination unit, estimates the road surface gradient of the travelable determination region on the road edge side as viewed from the moving body; and a travelable region determination unit that determines, from at least the road surface gradient estimated by the road surface gradient estimation unit, whether the travelable determination region is a region in which the moving body can travel.
  • a travel support control device that assists travel by controlling the operation of the moving body according to the determination result of the travelable region detection device.
  • according to the present invention, a travelable region detection device and a travel support system using the travelable region detection device can be realized.
  • FIG. 1 is a schematic configuration diagram of a travel support system to which a travelable area detection device according to an embodiment of the present invention is applied; FIG. 2 is a flowchart of the travelable area determination process; FIG. 3 is an overhead view showing the road edge point group detected by the travelable area detection device; FIG. 4 is a diagram showing the road edge point group and the search range detected by the travelable area detection device.
  • An embodiment of the present invention is an example when applied to a vehicle as a moving body.
  • FIG. 1 is a schematic configuration diagram of a travel support system to which a travelable area detection device according to an embodiment of the present invention is applied.
  • the driving support system is mounted on a vehicle V (shown in FIG. 3) such as an automobile and mainly includes: a stereo camera device (imaging device) 2 consisting of a plurality of cameras (two in this embodiment) that capture the area in front of the vehicle V; a travelable region detection device 1 that detects a travelable region around the vehicle V from a plurality of images captured in synchronization by the cameras of the stereo camera device 2; and a travel support control device 3 that assists travel by controlling various control devices mounted on the vehicle V (for example, an accelerator control unit 31, a brake control unit 32, a speaker control unit 33, and a steering control unit 34) based on the detection result of the travelable region detection device 1, thereby controlling the operation of the vehicle V.
  • the stereo camera device 2 is installed facing the front of the vehicle V, for example near the upper part of the windshield, and includes a left camera (left image acquisition unit) 10 and a right camera (right image acquisition unit) 11 as a pair of imaging units that capture images of the area in front of the vehicle V and acquire image information.
  • each of the left camera 10 and the right camera 11 has an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor), and the two cameras are installed at positions separated from each other in the left-right direction of the vehicle (the direction perpendicular to the longitudinal direction of the vehicle) so as to image the area in front of the vehicle V.
  • the travelable region detection device 1 is a device that detects a travelable region based on the image information of the imaging target region in front of the vehicle V acquired in time series by the stereo camera device 2 in a predetermined cycle.
  • the travelable area detection device 1, in particular, detects a travelable area from a break in the road edge, which is the boundary with a sidewalk, roadside strip, curb, or the like at the side of the travel road, and outputs the detection result to the travel support control device 3.
  • the travelable area detection device 1 comprises a camera control unit that controls the stereo camera device 2 (for example, the imaging timing and exposure amount of each camera), a RAM (Random Access Memory) serving as a temporary storage area, a ROM (Read Only Memory) that stores programs and various initial values, a CPU (Central Processing Unit) that controls the entire system, an external IF (Interface) that outputs recognition information and the like outside the system, an image processing LSI (Large Scale Integration), and so on, and these components are communicably connected via a bus.
  • as shown in FIG. 1, the travelable area detection device 1 includes a parallax image generation unit 12, a road edge detection unit 13, a search range determination unit 14, a road surface gradient estimation unit 15, and a travelable region determination unit 16.
  • the left image acquisition unit 10 and the right image acquisition unit 11 of the stereo camera device 2 acquire the captured left image and right image, respectively, and transmit the acquired left image and right image to the parallax image generation unit 12.
  • based on the detection result (determination result) received from the travelable region detection device 1, the travel support control device 3 calculates the control amounts of the accelerator control unit 31, the brake control unit 32, and the steering control unit 34, as well as the operation of the speaker control unit 33, to assist the travel of the vehicle V by adjusting the accelerator, brake, steering, and the like.
  • since a break in a road edge (a sidewalk, curb, median strip, or the like) involves a change in height, it is considered possible to determine whether there is a travelable region into which the vehicle can travel from the break section by using the height change of the break section and the gradient change of the road surface in the region on the opposite side of the break from the traveling lane.
  • FIG. 2 is a flowchart of the travelable area determination process, FIG. 3 is an overhead view showing the road edge point group detected by the travelable area detection device 1, and FIG. 4 is a diagram showing the road edge point group and the search range detected by the travelable area detection device 1.
  • the travelable area determination process will be described with reference to the flowchart shown in FIG. 2. As an example, this process is started when the vehicle starts running.
  • step S1 shown in FIG. 2 is processing of the parallax image generation unit 12; steps S2, S3, and S4 are processing of the road edge detection unit 13; steps S5 and S6 are processing of the search range determination unit 14; step S7 is processing of the road surface gradient estimation unit 15; and steps S8 and S9 are processing of the travelable area determination unit 16.
  • in step S1, a parallax image is generated using the left image and the right image acquired by the left image acquisition unit 10 and the right image acquisition unit 11, and the generated parallax image is passed to step S2. The parallax (the positional deviation between the two images) is calculated for each pixel, and from the calculated parallax a parallax image including three-dimensional distance information (spatial information) is generated.
  • in the block matching processing of step S1, one image (for example, the left image) is divided by first blocks of a predetermined shape, each containing a plurality of pixels, and a second block of the same size, shape, and position as the first block is placed in the other image (for example, the right image). The second block is then shifted one pixel at a time in the horizontal direction, the correlation value of the luminance patterns of the first block and the second block is calculated at each position, and the position where the correlation value is smallest, that is, where the correlation is highest, is searched for (corresponding point search).
  • as the correlation value, SAD (Sum of Absolute Differences), SSD (Sum of Squared Differences), NCC (Normalized Cross-Correlation), a gradient method, or the like can be used.
  • a parallax image can be generated by executing this corresponding point search for all pixels.
  • in step S2, using the parallax image acquired in step S1, roadside parts that can be candidates for road edges (for example, sidewalks, curbs, guardrails, continuous poles, walls, and the like) are extracted, and the extracted road edge information is passed to step S3.
  • describing step S2 in detail, feature points are detected in the parallax image acquired in step S1 by using the fact that the distance change near an object that is a road edge candidate is smaller than the distance change of a flat road surface; the image data is scanned in the vertical direction to detect steps that can be regarded as roadside objects.
  • in step S3, the detected road edges are assigned to the left road edge and the right road edge of the vehicle V. Furthermore, as shown in FIG. 3, road end points 100 belonging to objects other than roadside objects, such as other vehicles and pedestrians (the illustrated example is another vehicle), are excluded, only the road end point group 101 used for the travelable area search is extracted, and only the extracted road edge information (the output of the road edge detection unit 13) is passed to step S4.
  • the road edge used for the travelable area search is the boundary between the travel lane and the roadside, excluding obstacles (for example, other vehicles) on the travel lane of the vehicle V.
  • in step S4, when there is a road edge break in the extracted road edge and it is determined that the break is a travelable area candidate for the vehicle V, the extracted road edge position information is passed to step S5. In the determination of a travelable area candidate for the vehicle V, the section is judged to be a candidate when the distance between the break start point 102 and the break end point 103 in FIG. 4 is at least a predetermined value.
  • in step S5, the processing region used for road surface gradient estimation, that is, the search range (a region on the road edge side as viewed from the vehicle), is determined from the road edge break section (between point 102 and point 103) detected in step S4, and the coordinates of the determined search range are passed to step S6.
  • specifically, in addition to the straight line 400 from the break start point 102 to the break end point 103, straight lines 200 and 300 are drawn horizontally from point 102 and point 103 toward the side opposite to the travel road, and the rectangular region 600 formed by the straight lines 200 and 300 and the straight line 500 at the screen edge is set as the search range. The setting of the road surface gradient estimation filter size described in step S5 of FIG. 2 means the setting of this rectangular region 600.
  • in step S6, when a shield (a tree, a guardrail, or the like) is detected within the rectangular region 600 determined in step S5 and the road surface gradient cannot be estimated, it is determined in step S17 that road surface gradient estimation is impossible; only when the road surface gradient can be estimated is the information from step S5 passed to step S7.
  • in step S7, the gradient of the road surface in the rectangular region 600, which is the search range determined in step S5, is estimated, and the estimated gradient information is passed to step S8. Specifically, the rectangular region 600 (the travelable determination region) is scanned in the horizontal direction to acquire road surface height information; the height information is acquired starting from the near side (the vehicle V side of the road edge break), and the road surface gradient in the rectangular region 600 is estimated.
  • in step S8, it is determined from the change in the road surface gradient estimated in step S7 whether the gradient is within a range in which the vehicle V can travel. If it is determined that the vehicle cannot travel, a non-travelable determination is made in step S18; only when it is determined that the rectangular region 600 is travelable is the information from step S7 passed to step S9.
  • in step S9, it is determined whether, when the vehicle V enters the road edge break section from the roadway, there is a region in which the vehicle V can stop without protruding into the traveling lane; that is, the presence or absence of obstacles (such as walls or steps), the presence of holes, and whether the vehicle can travel are determined. If it is determined that there is an obstacle, a hole, or a region in which the host vehicle must not travel, it is determined in step S18 that the region is not travelable. If it is determined that there is no obstacle and no hole and that the vehicle can travel, the region is determined in step S19 to be a travelable region, and the determination result is transmitted to the travel support control device 3.
  • the travel support control device 3 implements travel support for the vehicle V based on the road-surface-gradient-estimation-impossible determination of step S17 (following the shielding determination in step S6), the non-travelable determination of step S18 (following the determinations in steps S8 and S9), and the travelable region determination of step S19.
  • in this way, the position of the road edge of the travel region is detected from the parallax image generated based on the plurality of images acquired by the stereo camera device 2. Then, based on the detected road edge position information, the change in height of the break between the road edge points 102 and 103 and the change in gradient of the region extending from the road edge break toward the side opposite the traveling lane are estimated, so that it can be determined whether the road edge break leads to a region in which the host vehicle V can travel.
  • as described above, according to the present embodiment, a travelable area detection device that detects a road edge break in front of a moving body and can determine whether there is a travelable area into which the moving body can travel from the detected break, and a driving support system using the device, can be realized.
  • the example described above is an example in which the driving support system is mounted on a vehicle, but the present invention is applicable not only to vehicles but also to other moving bodies.
  • for example, the present invention can be applied to a traveling robot (rescue robot) used at disaster sites.
  • in the embodiment, a stereo camera device having imaging elements such as CCD or CMOS sensors is taken as an example of the imaging device, but a laser radar can also be used as the imaging device.
  • in the embodiment, the parallax image generation unit 12 generates a parallax image and the roadside object is detected from it, but the roadside object can also be detected by image recognition processing that does not use a parallax image.
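The block-matching corresponding-point search outlined above (step S1) can be sketched as follows. This is a minimal illustration assuming tiny grayscale images stored as 2D lists, with SAD as the correlation value; the window size and disparity search range are illustrative choices, not values from the document.

```python
# Minimal sketch of SAD block matching for stereo correspondence (step S1).
# Images are 2D lists of luminance values; window/search sizes are illustrative.

def sad(block_a, block_b):
    """Sum of absolute differences between two equally sized blocks."""
    return sum(abs(a - b) for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def block(img, top, left, size):
    """Extract a size x size block whose top-left corner is at (top, left)."""
    return [row[left:left + size] for row in img[top:top + size]]

def disparity_at(left_img, right_img, top, left, size=3, max_disp=4):
    """Shift the right-image block one pixel at a time in the horizontal
    direction and return the shift with the smallest SAD cost, i.e. the
    highest correlation (corresponding point search)."""
    ref = block(left_img, top, left, size)
    best_d, best_cost = 0, float("inf")
    for d in range(0, max_disp + 1):
        if left - d < 0:           # stop at the image border
            break
        cost = sad(ref, block(right_img, top, left - d, size))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```

Running this search for every pixel yields a parallax (disparity) image, as the text describes.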
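The vertical scan for roadside steps (step S2) might look like the following sketch, assuming road-surface heights have already been recovered from the parallax image for one image column. The curb-height threshold is a hypothetical value, not one given in the document.

```python
# Hypothetical sketch of the step-S2 idea: scan a column of road-surface
# heights and flag positions where the height jumps by more than a
# curb-like threshold, marking road-edge feature-point candidates.

CURB_STEP_M = 0.10  # assumed minimum height jump for a roadside object

def find_step_indices(column_heights_m, step_threshold=CURB_STEP_M):
    """Return indices where consecutive height samples jump by at least
    the threshold (candidate road-edge feature points)."""
    return [i for i in range(1, len(column_heights_m))
            if abs(column_heights_m[i] - column_heights_m[i - 1]) >= step_threshold]
```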
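Steps S4 and S5 — judging whether a road-edge break is a travelable-area candidate and building the rectangular search range 600 — can be sketched in bird's-eye coordinates (x lateral, y forward) as follows. The vehicle width, margin, and the exact gap criterion are assumptions for illustration; the document does not give concrete values.

```python
# Hypothetical sketch of steps S4-S5: gap test between break start
# (point 102) and break end (point 103), then the rectangular search
# area 600 bounded by lines 200/300 and the screen-edge line 500.

VEHICLE_WIDTH = 1.8   # metres, assumed
MARGIN = 0.4          # metres of clearance per side, assumed

def is_break_candidate(start_pt, end_pt):
    """A break is a travelable-area candidate when the gap between
    points 102 and 103 exceeds the vehicle width plus margins."""
    gap = ((end_pt[0] - start_pt[0]) ** 2 + (end_pt[1] - start_pt[1]) ** 2) ** 0.5
    return gap >= VEHICLE_WIDTH + 2 * MARGIN

def search_rectangle(start_pt, end_pt, map_edge_x):
    """Extend lines laterally from points 102 and 103 away from the
    travel lane up to the map edge, returning (x_min, y_min, x_max, y_max)."""
    y_lo, y_hi = sorted((start_pt[1], end_pt[1]))
    x_lo, x_hi = sorted((start_pt[0], map_edge_x))
    return (x_lo, y_lo, x_hi, y_hi)
```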

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The present invention achieves a travelable area detection device that can detect breaks in road edges ahead of a moving body and determine whether there are travelable areas into which the moving body can travel from the detected breaks, and a travel assistance system that uses the travelable area detection device. The travelable area detection device 1 detects the position of a road edge of a travel area from a parallax image generated on the basis of a plurality of images acquired by a stereo camera device 2, and uses the position information of the detected road edges to estimate the change in height of breaks between the road edges and the change in gradient of areas that extend from the breaks to the opposite side of the travel lane. The present invention can thereby determine whether the breaks in the road edges are areas in which a vehicle V can travel.

Description

TRAVELABLE AREA DETECTION DEVICE AND TRAVEL SUPPORT SYSTEM

The present invention relates to a travelable area detection device and a travel support system.

Conventionally, systems have been developed that, based on information obtained from a camera or radar mounted on a vehicle such as an automobile (a moving body), detect white lines and road edges (boundaries with sidewalks, roadside strips, curbs, and the like) on the road on which the vehicle travels, and use the detection results for driver assistance.

Patent Document 1 discloses an in-vehicle system in which a break in a road edge is detected by an in-vehicle camera that captures the surroundings of a vehicle, and the corresponding section is set as a travelable area based on edge information of the break section.

JP 2016-18371 A

However, although the technique disclosed in Patent Document 1 detects road edge breaks, it neither calculates the distance of the break section nor determines whether the region on the opposite side of the break from the roadway is travelable, so it is difficult to detect a travelable region with high reliability.

For this reason, a travelable region detection device capable of detecting a break in the road edge in front of a vehicle (a moving body) and determining whether there is a travelable region into which the vehicle can travel from the break, and a driving support system using such a device, have been desired.

The present invention has been made in view of the above problems of the prior art, and its object is to realize a travelable area detection device capable of detecting a break in the road edge in front of a moving body and determining whether there is a travelable area into which the moving body can travel from the detected break, and a travel support system using the travelable area detection device.

In order to solve the above problems, the present invention is configured as follows.

A travelable area detection device comprises:
a road edge detection unit that, based on a captured image, detects a road edge existing on the road on which a moving body travels and outputs road edge information;
a search range determination unit that determines a search range based on the road edge information detected by the road edge detection unit;
a road surface gradient estimation unit that, based on the search range determined by the search range determination unit, estimates the road surface gradient of a travelable determination region on the road edge side as viewed from the moving body; and
a travelable region determination unit that determines, from at least the road surface gradient estimated by the road surface gradient estimation unit, whether the travelable determination region is a region in which the moving body can travel.

A driving support system comprises:
an imaging device that images the area in front of a moving body;
a travelable area detection device having a road edge detection unit that, based on the image captured by the imaging device, detects a road edge existing on the road on which the moving body travels and outputs road edge information, a search range determination unit that determines a search range based on the road edge information detected by the road edge detection unit, a road surface gradient estimation unit that, based on the search range determined by the search range determination unit, estimates the road surface gradient of a travelable determination region on the road edge side as viewed from the moving body, and a travelable region determination unit that determines, from at least the road surface gradient estimated by the road surface gradient estimation unit, whether the travelable determination region is a region in which the moving body can travel; and
a travel support control device that assists travel by controlling the operation of the moving body according to the determination result of the travelable area detection device.

According to the present invention, it is possible to realize a travelable area detection device that detects a break in the road edge in front of a moving body and determines whether there is a travelable region into which the moving body can travel from the detected break, and a travel support system using the travelable area detection device.

FIG. 1 is a schematic configuration diagram of a travel support system to which a travelable area detection device according to an embodiment of the present invention is applied. FIG. 2 is a flowchart of the travelable area determination process. FIG. 3 is an overhead view showing the road edge point group detected by the travelable area detection device. FIG. 4 is a diagram showing the road edge point group and the search range detected by the travelable area detection device.

Hereinafter, embodiments for carrying out the present invention will be described in detail with reference to the drawings.

One embodiment of the present invention is an example applied to a vehicle as the moving body.

FIG. 1 is a schematic configuration diagram of a travel support system to which a travelable area detection device according to an embodiment of the present invention is applied.

In FIG. 1, the driving support system is mounted on a vehicle V (shown in FIG. 3) such as an automobile and mainly includes: a stereo camera device (imaging device) 2 consisting of a plurality of cameras (two in this embodiment) that capture the area in front of the vehicle V; a travelable region detection device 1 that detects a travelable region around the vehicle V from a plurality of images captured in synchronization by the cameras of the stereo camera device 2; and a travel support control device 3 that assists travel by controlling various control devices mounted on the vehicle V (for example, an accelerator control unit 31, a brake control unit 32, a speaker control unit 33, and a steering control unit 34) based on the detection result of the travelable region detection device 1, thereby controlling the operation of the vehicle V.

The stereo camera device 2 is installed facing the front of the vehicle V, for example near the upper part of the windshield, and includes a left camera (left image acquisition unit) 10 and a right camera (right image acquisition unit) 11 as a pair of imaging units that capture images of the area in front of the vehicle V and acquire image information.

Each of the left camera 10 and the right camera 11 has an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor), and the two cameras are installed at positions separated from each other in the left-right direction of the vehicle (the direction perpendicular to the longitudinal direction of the vehicle) so as to image the area in front of the vehicle V.
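Because the two cameras are separated laterally, distance can be recovered from the disparity between corresponding points by the standard stereo triangulation relation Z = f·B/d (f: focal length in pixels, B: baseline, d: disparity). This relation is implied rather than stated in the text; the focal length and baseline below are purely illustrative values.

```python
# Standard stereo depth-from-disparity relation (illustrative parameters,
# not values from the document).

def depth_from_disparity(disparity_px, focal_px=1400.0, baseline_m=0.35):
    """Return depth in metres for a disparity in pixels: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

Note that nearer objects produce larger disparities, which is why the parallax image carries three-dimensional distance information.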

The travelable region detection device 1 detects a travelable region based on image information of the imaging target region in front of the vehicle V acquired in time series by the stereo camera device 2 at a predetermined cycle.

In particular, the travelable area detection device 1 detects a travelable area from a break in the road edge, which is the boundary with a sidewalk, roadside strip, curb, or the like at the side of the travel road, and outputs the detection result to the travel support control device 3.

The travelable area detection device 1 comprises a camera control unit that controls the stereo camera device 2 (for example, the imaging timing and exposure amount of each camera), a RAM (Random Access Memory) serving as a temporary storage area, a ROM (Read Only Memory) that stores programs and various initial values, a CPU (Central Processing Unit) that controls the entire system, an external IF (Interface) that outputs recognition information and the like outside the system, an image processing LSI (Large Scale Integration), and so on, and these components are communicably connected via a bus.

As shown in FIG. 1, the travelable area detection device 1 includes a parallax image generation unit 12, a road edge detection unit 13, a search range determination unit 14, a road surface gradient estimation unit 15, and a travelable region determination unit 16.

The left image acquisition unit 10 and the right image acquisition unit 11 of the stereo camera device 2 acquire the captured left and right images, respectively, and transmit them to the parallax image generation unit 12.

Based on the detection result (determination result) received from the travelable region detection device 1, the travel support control device 3 calculates the control amounts of the accelerator control unit 31, the brake control unit 32, and the steering control unit 34, as well as the operation of the speaker control unit 33, to assist the travel of the vehicle V by adjusting the accelerator, brake, steering, and the like.

 In the following, the case where the travelable area detection device 1 detects a travelable area existing ahead of the vehicle V is described in detail.

 A break in the road edge (a sidewalk, curb, median strip, or the like) along the travel path involves a change in height (for example, a change in the height or gradient of the road edge). Therefore, by using the height change in the break section together with the gradient change of the road surface in the region extending from the break section away from the travel lane, it is possible to determine whether a travelable area into which the vehicle can travel exists beyond the break.

 Next, the travelable area determination process executed by the travelable area detection device 1 will be described with reference to FIGS. 2, 3, and 4. FIG. 2 is a flowchart of the travelable area determination process, FIG. 3 is an overhead view showing the road edge point groups detected by the travelable area detection device 1, and FIG. 4 is a diagram showing the detected road edge point groups and the search range. The travelable area determination process is described along the flowchart shown in FIG. 2. As an example, this process is started when the vehicle starts traveling.

 Note that step S1 shown in FIG. 2 is performed by the parallax image generation unit 12, steps S2, S3, and S4 by the road edge detection unit 13, and steps S5 and S6 by the search range determination unit 14.

 Step S7 is performed by the road surface gradient estimation unit 15, and steps S8 and S9 by the travelable area determination unit 16.

 In step S1 of FIG. 2, a parallax image is generated using the left and right images acquired by the left image acquisition unit 10 and the right image acquisition unit 11, and the generated parallax image is processed in step S2.

 Specifically, the parallax (positional displacement between the images) is computed by block matching on the left and right images acquired by the left image acquisition unit 10 and the right image acquisition unit 11, and a parallax image containing three-dimensional distance information (spatial information) is generated from the computed parallax.

 More specifically, in the block matching of step S1, one image (for example, the left image) is partitioned into first blocks of a predetermined shape, each containing a plurality of pixels, and the other image (for example, the right image) is partitioned into second blocks of the same size, shape, and position as the first blocks.

 Then, while shifting the second block horizontally one pixel at a time, a correlation value between the two luminance patterns in the first and second blocks is computed at each position, and the position where the correlation value is smallest, that is, where the correlation is highest, is searched for (corresponding point search).

 As the correlation measure, for example, SAD (Sum of Absolute Differences), SSD (Sum of Squared Differences), NCC (Normalized Cross-Correlation), or a gradient method can be used.

 When the position of highest correlation has been identified, the distance between a given pixel in the first block and the corresponding pixel in the second block at that position is computed as the parallax. Executing this step for all pixels yields the parallax image.
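As an illustration of the corresponding point search described above, the following minimal Python sketch computes the disparity of a single pixel by shifting a block window and minimizing SAD. This is not the patent's implementation; the block size, search range, and the choice of SAD over SSD/NCC are illustrative.

```python
import numpy as np

def sad(a, b):
    # Sum of Absolute Differences: a smaller value means a higher correlation
    return int(np.abs(a.astype(np.int32) - b.astype(np.int32)).sum())

def disparity_at(left, right, row, col, block=3, max_disp=16):
    """Disparity of pixel (row, col): match a block x block window of the
    left image (first block) against horizontally shifted windows of the
    right image (second block), one pixel at a time."""
    h = block // 2
    ref = left[row - h:row + h + 1, col - h:col + h + 1]  # first block
    best_d, best_cost = 0, float("inf")
    for d in range(0, min(max_disp, col - h) + 1):
        cand = right[row - h:row + h + 1, col - d - h:col - d + h + 1]
        cost = sad(ref, cand)
        if cost < best_cost:            # keep the smallest-cost shift
            best_cost, best_d = cost, d
    return best_d
```

Repeating `disparity_at` over every pixel produces the dense parallax image; real-time systems would instead run this search in the image processing LSI mentioned above.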

 Step S2 uses the parallax image obtained in step S1 to extract road edge portions that can be candidates for the road edge of the travel path (for example, sidewalks, curbs, guardrails, rows of poles, walls, and the like), and the extracted road edge position information is processed in step S3.

 To describe step S2 in more detail, in the parallax image obtained in step S1, feature points of road edge candidates are detected by exploiting the fact that the distance change near an object forming a road edge candidate is smaller than the distance change over flat portions. For example, the image data is scanned in the vertical direction to detect steps that can be regarded as roadside objects.
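The vertical scan can be pictured as walking down one column of reconstructed road surface heights and flagging the first abrupt jump as a curb/roadside-object candidate. The sketch below is hypothetical: the 0.10 m step threshold and the function name are assumptions, not values from the patent.

```python
def find_step(heights, min_step=0.10):
    """Scan a vertical column of road-surface heights (metres, near to far)
    and return the index of the first jump >= min_step, or None.

    A jump of this size is treated as a roadside-object (e.g. curb) candidate;
    min_step is an illustrative threshold.
    """
    for i in range(1, len(heights)):
        if abs(heights[i] - heights[i - 1]) >= min_step:
            return i
    return None
```

Running this over every image column yields the road edge point group used in step S3.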

 Next, in step S3, the detected road edges are assigned to the left and right road edges of the vehicle V.
 Furthermore, as shown in FIG. 3, the road edge point group 100 belonging to objects other than roadside structures, such as other vehicles and pedestrians (the illustrated example is a vehicle), is excluded; only the road edge point group 101 used for the travelable area search is extracted, and only the extracted road edge information (output from the road edge detection unit 13) is processed in step S4.

 Here, the road edge used for the travelable area search is the boundary between the travel lane and the edge of the road, excluding obstacles (for example, other vehicles) on the travel lane of the vehicle V.

 Next, in step S4, when a break exists in the extracted road edge and the break is judged to be a travelable area candidate for the vehicle V, the extracted road edge position information is processed in step S5. In judging a travelable area candidate for the vehicle V, the break is taken as a candidate when the distance between the road edge end point 103 and the road edge start point 102 in FIG. 3 is equal to or greater than the width of the vehicle V.
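The candidate test of step S4 reduces to comparing the gap between the two break points against the vehicle width. A minimal sketch, assuming overhead-view coordinates in metres and an illustrative 1.8 m vehicle width:

```python
import math

def is_travelable_candidate(p_end, p_start, vehicle_width=1.8):
    """p_end: road edge end point (103), p_start: road edge start point (102),
    both (x, y) in metres in the overhead view of FIG. 3.

    The break is a travelable-area candidate when the gap is at least as
    wide as the vehicle; vehicle_width is an assumed example value.
    """
    gap = math.dist(p_end, p_start)
    return gap >= vehicle_width
```
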

 In step S5, from the road edge break section (between point 102 and point 103) detected in step S4, the processing region used for road surface gradient estimation, that is, the search range (the region on the road edge side as viewed from the vehicle), is determined, and the coordinates of the determined search range are processed in step S6.

 Specifically, as shown in FIG. 4, the search range is the rectangular region 600 formed by the straight line 400 from the break start point 102 to the break end point 103, the straight lines 200 and 300 drawn horizontally from points 102 and 103 toward the side opposite the travel path, and the straight line 500 at the edge of the image. Setting the filter size for road surface gradient estimation, described in step S5 of FIG. 2, means setting the rectangular region 600.
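Under the simplifying assumption that line 400 is roughly axis-aligned in image coordinates, region 600 can be sketched as an axis-aligned rectangle spanning from the break points out to the image edge. The coordinate convention (x growing away from the travel lane) is an assumption made for this illustration, not taken from the patent.

```python
def search_rectangle(p102, p103, image_edge_x):
    """Axis-aligned sketch of region 600 in image coordinates (x, y).

    Bounded by the break points 102/103 (line 400), the horizontal lines
    through each point (lines 200 and 300), and the image edge (line 500).
    Returns (x_min, y_min, x_max, y_max).
    """
    x_min = min(p102[0], p103[0])   # line 400 side (break points)
    x_max = image_edge_x            # line 500 (image edge)
    y_min = min(p102[1], p103[1])   # lines 200 / 300
    y_max = max(p102[1], p103[1])
    return (x_min, y_min, x_max, y_max)
```
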

 Next, in step S6, if an occluding object (a tree, guardrail, or the like) is detected within the rectangular region 600 determined in step S5 and the road surface gradient therefore cannot be estimated, step S17 determines that road surface gradient estimation is impossible; only when gradient estimation is possible is the information of step S5 processed in step S7.

 In step S7, the road surface gradient of the rectangular region 600, the search region determined in step S5, is estimated, and the estimated gradient information is processed in step S8.

 The specific estimation procedure of step S7 scans the rectangular region 600, which serves as the search region (travelability determination region), in the horizontal direction to acquire road surface height information. Taking this as one step, height information is acquired starting from the near side (the vehicle V side with respect to the road edge break), and the road surface gradient in the rectangular region 600 is estimated.
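Once the row-by-row scan has produced (distance, height) samples, the gradient can be estimated, for example, as the least-squares slope of height versus longitudinal distance. The closed-form fit below is one possible realization, not necessarily the estimator used by the patent.

```python
def estimate_gradient(samples):
    """Least-squares slope dh/dx of road height vs. longitudinal distance.

    samples: list of (distance_m, height_m) pairs gathered by scanning
    region 600 row by row from the near side.
    """
    n = len(samples)
    sx = sum(x for x, _ in samples)
    sy = sum(h for _, h in samples)
    sxx = sum(x * x for x, _ in samples)
    sxy = sum(x * h for x, h in samples)
    # standard simple-linear-regression slope
    return (n * sxy - sx * sy) / (n * sxx - sx * sx)
```
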

 In step S8, it is determined from the gradient change estimated in step S7 whether the road surface gradient lies within the range of inclinations over which the vehicle V can travel; if the gradient is judged untraversable, step S18 determines the region to be a non-travelable area. Only when the rectangular region 600 is judged traversable is the information of step S7 processed in step S9.

 Step S9 determines whether, when the vehicle V enters the road edge break section from the roadway, an area exists in which the vehicle V can stop without protruding into the travel lane, that is, whether obstacles (walls, steps, and the like) or holes are present and whether the area is one in which the vehicle can travel.

 If it is determined that an obstacle exists, a hole exists, or the area is not one in which the vehicle can travel, step S18 determines the region to be a non-travelable area.

 If it is determined in step S9 that no obstacle exists, no hole exists, and the area is one in which the vehicle can travel, step S19 determines the region to be a travelable area, and the determination result is transmitted to the travel support control device 3.
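Steps S8 and S9 together form a simple decision cascade, which can be sketched as below. The threshold value and flag names are illustrative assumptions; only the branch structure (gradient test first, then the obstacle/hole/fit tests) follows the description above.

```python
def classify_region(gradient, max_gradient, has_obstacle, has_hole, fits_vehicle):
    """Combine the step S8 gradient test with the step S9 checks.

    gradient / max_gradient: estimated and maximum traversable slope (dh/dx);
    has_obstacle, has_hole: results of the step S9 scans;
    fits_vehicle: whether the vehicle can stop without protruding into the lane.
    """
    if abs(gradient) > max_gradient:
        return "non-travelable"   # step S18 (untraversable slope)
    if has_obstacle or has_hole or not fits_vehicle:
        return "non-travelable"   # step S18 (blocked area)
    return "travelable"           # step S19
```
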

 Then, based on the occlusion determination of step S6 and the determination results of steps S8 and S9, that is, on the road-surface-gradient-estimation-impossible determination of step S17, the non-travelable area determination of step S18, and the travelable area determination of step S19, the travel support control device 3 carries out travel support for the vehicle V.

 As described above, in the travelable area detection device according to one embodiment of the present invention, the position of the road edge of the travel path is detected from the parallax image generated from the plurality of images acquired by the stereo camera device 2, and, based on the detected road edge position information, the change in the height direction across the break between road edge points 102 and 103 and the gradient change of the region extending from the break away from the travel lane are estimated; in this way it can be determined whether the road edge break constitutes a travelable area for the vehicle V.

 As a result, even when a break in the road edge along the travel path of the vehicle V is detected, a travelable area can be estimated from that break, providing high-precision travel support toward automated driving.

 That is, it is possible to realize a travelable area detection device capable of detecting a break in the road edge ahead of a moving body and determining whether a travelable area into which the moving body can travel exists beyond the detected break, and a travel support system using the device.

 Note that while the example described above mounts the travelable area detection device of the present invention on a vehicle, the present invention is applicable not only to vehicles but also to other moving bodies, for example, traveling robots used in disaster response (rescue robots).

 In the example described above, an image sensor such as a CCD or CMOS sensor was taken as an example for the stereo camera device, but a laser radar can also be used as the imaging device.

 Also, in the example described above, the parallax image generation unit 12 generates a parallax image and roadside objects are detected from it; however, roadside objects can also be detected by image recognition processing other than parallax images.

 1: travelable area detection device; 2: stereo camera device; 3: travel support control device; 10: left image acquisition unit; 11: right image acquisition unit; 12: parallax image generation unit; 13: road edge detection unit; 14: search range determination unit; 15: road surface gradient estimation unit; 16: travelable area determination unit; 31: accelerator control unit; 32: brake control unit; 33: speaker control unit; 34: steering control unit; 100: road edge point group of objects other than roadside structures (other vehicles, pedestrians, etc.); 101: road edge point group; 102: start point of the break section; 103: end point of the break section; 200, 300: horizontally drawn straight lines; 400: straight line to the break end point; 500: straight line at the image edge; 600: rectangular region (search region)

Claims (12)

1. A travelable area detection device comprising:
 a road edge detection unit that detects, based on a captured image, a road edge existing on a road on which a moving body travels and outputs road edge information;
 a search range determination unit that determines a search range based on the road edge information detected by the road edge detection unit;
 a road surface gradient estimation unit that estimates, based on the search range determined by the search range determination unit, the road surface gradient of a travelability determination region on the road edge side as viewed from the moving body; and
 a travelable area determination unit that determines, from at least the road surface gradient estimated by the road surface gradient estimation unit, whether the travelability determination region is a region in which the moving body can travel.
2. The travelable area detection device according to claim 1, further comprising a parallax image generation unit that generates a parallax image from a plurality of images, wherein the road edge detection unit detects the road edge existing on the road based on the parallax image generated by the parallax image generation unit.
3. The travelable area detection device according to claim 1, wherein the travelable area determination unit determines that the moving body cannot travel in the travelability determination region when the detected road surface gradient is greater than a predetermined value.
4. The travelable area detection device according to claim 1, wherein the travelable area determination unit determines whether an obstacle exists, whether a hole exists, and whether the travelability determination region is a region in which the moving body can travel, and determines that the moving body cannot travel in the travelability determination region when an obstacle exists, a hole exists, or the region is not one in which the moving body can travel.
5. The travelable area detection device according to claim 4, wherein the travelable area determination unit determines that the moving body can travel in the travelability determination region when the detected road surface gradient is equal to or less than a predetermined value, no obstacle exists, no hole exists, and the region is one in which the moving body can travel.
6. The travelable area detection device according to any one of claims 1 to 5, wherein the moving body is a vehicle.
7. A travel support system comprising:
 an imaging device that images the area ahead of a moving body;
 a travelable area detection device including a road edge detection unit that detects, based on the captured image captured by the imaging device, a road edge existing on a road on which the moving body travels and outputs road edge information, a search range determination unit that determines a search range based on the road edge information detected by the road edge detection unit, a road surface gradient estimation unit that estimates, based on the search range determined by the search range determination unit, the road surface gradient of a travelability determination region on the road edge side as viewed from the moving body, and a travelable area determination unit that determines, from at least the road surface gradient estimated by the road surface gradient estimation unit, whether the travelability determination region is a region in which the moving body can travel; and
 a travel support control device that supports travel by controlling the operation of the moving body in accordance with the determination result of the travelable area detection device.
8. The travel support system according to claim 7, wherein the travelable area detection device includes a parallax image generation unit that generates a parallax image from a plurality of images, and the road edge detection unit detects the road edge existing on the road based on the parallax image generated by the parallax image generation unit.
9. The travel support system according to claim 7, wherein the travelable area determination unit determines that the moving body cannot travel in the travelability determination region when the detected road surface gradient is greater than a predetermined value.
10. The travel support system according to claim 7, wherein the travelable area determination unit determines whether an obstacle exists, whether a hole exists, and whether the travelability determination region is a region in which the moving body can travel, and determines that the moving body cannot travel in the travelability determination region when an obstacle exists, a hole exists, or the region is not one in which the moving body can travel.
11. The travel support system according to claim 10, wherein the travelable area determination unit determines that the moving body can travel in the travelability determination region when the detected road surface gradient is equal to or less than a predetermined value, no obstacle exists, no hole exists, and the region is one in which the moving body can travel.
12. The travel support system according to any one of claims 7 to 11, wherein the moving body is a vehicle.
PCT/JP2017/045032 2016-12-27 2017-12-15 Travelable area detection device and travel assistance system Ceased WO2018123641A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2018559043A JP6837262B2 (en) 2016-12-27 2017-12-15 Travelable area detection device and travel support system
CN201780067417.1A CN110088801B (en) 2016-12-27 2017-12-15 Driving region detection device and driving assistance system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016253126 2016-12-27
JP2016-253126 2016-12-27

Publications (1)

Publication Number Publication Date
WO2018123641A1 true WO2018123641A1 (en) 2018-07-05

Family

ID=62707528

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/045032 Ceased WO2018123641A1 (en) 2016-12-27 2017-12-15 Travelable area detection device and travel assistance system

Country Status (3)

Country Link
JP (1) JP6837262B2 (en)
CN (1) CN110088801B (en)
WO (1) WO2018123641A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020086995A (en) * 2018-11-27 2020-06-04 富士ゼロックス株式会社 Autonomous mobile device and program
JP2021042990A (en) * 2019-09-06 2021-03-18 株式会社デンソー Wall shape measurement device
JP2021177135A (en) * 2020-05-07 2021-11-11 株式会社トヨタマップマスター Information processing device, information processing method and information processing program
JP2021177136A (en) * 2020-05-07 2021-11-11 株式会社トヨタマップマスター Information processing equipment, information processing methods and information processing programs

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11741675B2 (en) * 2020-03-10 2023-08-29 Niantic, Inc. Determining traversable space from single images
CN113744518B (en) * 2020-05-30 2023-04-18 华为技术有限公司 Method and device for detecting vehicle travelable area
JP7458940B2 (en) * 2020-09-01 2024-04-01 日立Astemo株式会社 Image processing device
CN115771508A (en) * 2021-09-08 2023-03-10 本田技研工业株式会社 Vehicle control device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03113678A (en) * 1989-09-28 1991-05-15 Honda Motor Co Ltd Traveling route recognizing method
JP2000276228A (en) * 1999-03-26 2000-10-06 Matsushita Electric Works Ltd Control method for movable dolly
JP2016018371A (en) * 2014-07-08 2016-02-01 株式会社デンソー On-vehicle system, information processing apparatus, and program

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5389002B2 (en) * 2010-12-07 2014-01-15 日立オートモティブシステムズ株式会社 Driving environment recognition device
MX2013008808A (en) * 2011-04-13 2013-10-03 Nissan Motor Driving assistance device and adjacent vehicle detection method therefor.
JP6049541B2 (en) * 2013-05-31 2016-12-21 日立オートモティブシステムズ株式会社 Vehicle control system
JP6035207B2 (en) * 2013-06-14 2016-11-30 日立オートモティブシステムズ株式会社 Vehicle control system


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020086995A (en) * 2018-11-27 2020-06-04 富士ゼロックス株式会社 Autonomous mobile device and program
JP2021042990A (en) * 2019-09-06 2021-03-18 株式会社デンソー Wall shape measurement device
JP7302397B2 (en) 2019-09-06 2023-07-04 株式会社デンソー Wall shape measuring device
JP2021177135A (en) * 2020-05-07 2021-11-11 株式会社トヨタマップマスター Information processing device, information processing method and information processing program
JP2021177136A (en) * 2020-05-07 2021-11-11 株式会社トヨタマップマスター Information processing equipment, information processing methods and information processing programs
JP7417466B2 (en) 2020-05-07 2024-01-18 株式会社トヨタマップマスター Information processing device, information processing method, and information processing program
JP7417465B2 (en) 2020-05-07 2024-01-18 株式会社トヨタマップマスター Information processing device, information processing method, and information processing program

Also Published As

Publication number Publication date
CN110088801A (en) 2019-08-02
CN110088801B (en) 2023-04-18
JPWO2018123641A1 (en) 2019-10-31
JP6837262B2 (en) 2021-03-03

Similar Documents

Publication Publication Date Title
JP6837262B2 (en) Travelable area detection device and travel support system
JP5399027B2 (en) A device having a system capable of capturing a stereoscopic image to assist driving of an automobile
CN110088802B (en) Object detection device
JP6548893B2 (en) Roadway recognition device and travel support system using the same
CN104335264A (en) Lane partition marking detection apparatus, and drive assist system
JP5561064B2 (en) Vehicle object recognition device
JP6331811B2 (en) Signal detection device and signal detection method
JP3562751B2 (en) Forward vehicle detection method and device
CN102396002A (en) Object detection device
EP1469442A2 (en) Vehicle drive assist system
CN108349533A (en) Method, driver assistance system and motor vehicles for determining the parking area for parking motor vehicles
CN110949254A (en) Vehicle running environment detection device and running control system
US20200193184A1 (en) Image processing device and image processing method
JP3868915B2 (en) Forward monitoring apparatus and method
JP2013161190A (en) Object recognition device
JP5717416B2 (en) Driving support control device
JP2019053685A (en) Image processing device
JP2009186301A (en) Object detection device for vehicle
KR102028964B1 (en) Driving assistant system and method thereof
JP2020095623A (en) Image processing device and image processing method
JP5950193B2 (en) Disparity value calculation device, disparity value calculation system including the same, moving surface area recognition system, disparity value calculation method, and disparity value calculation program
JP2007310591A (en) Image processing apparatus and parking lot determination method
JP6477246B2 (en) Road marking detection device and road marking detection method
US9430707B2 (en) Filtering device and environment recognition system
WO2025104883A1 (en) Image processing method and image processing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17885703

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2018559043

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17885703

Country of ref document: EP

Kind code of ref document: A1