WO2011158306A1 - System for road surface reflectance classification - Google Patents
System for road surface reflectance classification
- Publication number
- WO2011158306A1 (PCT/JP2010/004092, JP2010004092W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- road surface
- vehicle
- reflectance
- classification system
- patch
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/55—Specular reflectivity
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/47—Scattering, i.e. diffuse reflection
- G01N2021/4704—Angular selective
- G01N2021/4711—Multiangle measurement
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/47—Scattering, i.e. diffuse reflection
- G01N2021/4735—Solid samples, e.g. paper, glass
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/55—Specular reflectivity
- G01N2021/556—Measuring separately scattering and specular
Definitions
- The present invention relates to a technique for detecting road surface reflections caused, for example, by wet surfaces and specific lighting conditions, particularly in image processing for driver assistance systems (DAS) that perform lane marker detection and the like.
- DAS driver assistance system
- The present invention relates to a system for road surface reflectance classification that alleviates the limitations caused by wet surfaces and specific lighting conditions.
- Road surface reflectance classification is achieved by analyzing single images using image processing methods, bidirectional reflectance distribution functions (BRDF), and robust reflection features. Such a road surface reflectance classification can be used to adapt the chassis control system. It is known that road surface reflection is an indicator of the presence of moisture on the road surface, and that the presence of moisture has a significant effect on the friction coefficient of the road surface.
- BRDF road surface reflectance classification
- CIE International Commission on Illumination
- The main object of the present invention is to provide a system for road surface reflectance classification that can be realized using an in-vehicle camera and whose acquired data can be applied in real time to a chassis control system and other useful purposes.
- a second object of the present invention is to provide a system for road surface reflectance classification based on BRDF suitable for an in-vehicle camera.
- a third object of the present invention is to provide a system for BRDF that is robust against optical disturbances caused by outlier reflection features and / or extreme surface conditions.
- a fourth object of the present invention is to provide a system for BRDF that can perform road surface reflectance classification well even when the field of view of a camera for acquiring data is extremely limited.
- Such objects are achieved by a road surface reflectance classification system that acquires images of the road surface in front of a vehicle using an in-vehicle camera, the system comprising: a navigation sensor for obtaining at least the position and traveling direction of the vehicle; a clock for obtaining the current time and date; a sun position calculator for determining the position of the sun relative to the position and traveling direction of the vehicle based on the outputs of the navigation sensor and the clock; a camera control unit that acquires images of a plurality of patches defined on the road surface in front of the vehicle, arranged as an array of patches in the longitudinal direction; and a main control unit that measures the bidirectional reflectance distribution (BRDF) based on the images acquired by the camera control unit under the assumption of spatial invariance of the road surface patches.
- BRDF bidirectional reflectance distribution
- The sun position calculation unit determines the position of the sun with respect to the position and traveling direction of the vehicle based on a navigation sensor, composed of a satellite-based GPS system, an inertial navigation system, or the like, and on the output of a clock providing the date as well as the current time.
- The traveling direction of the vehicle can also be obtained by a navigation device such as a magnetic compass.
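- As an illustration only, the following is a minimal sketch of how a sun position calculation unit of this kind could derive the solar zenith and azimuth, and the sun direction relative to the vehicle heading, from the navigation sensor and clock outputs. The function and variable names are hypothetical, and the formulas are standard low-precision solar-position approximations, not the specific method of the embodiment.

```python
import math
from datetime import datetime

def sun_position(lat_deg, lon_deg, when_utc):
    """Approximate solar zenith and azimuth (degrees) for a location and UTC time,
    using low-precision formulas (illustrative only)."""
    doy = when_utc.timetuple().tm_yday
    hours = when_utc.hour + when_utc.minute / 60.0 + when_utc.second / 3600.0

    # Solar declination (degrees), simple approximation
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (doy + 10)))

    # Equation of time (minutes), rough approximation
    b = math.radians(360.0 / 365.0 * (doy - 81))
    eot = 9.87 * math.sin(2 * b) - 7.53 * math.cos(b) - 1.5 * math.sin(b)

    # Local solar time and hour angle (degrees); longitude east positive
    solar_time = hours + eot / 60.0 + lon_deg / 15.0
    hour_angle = 15.0 * (solar_time - 12.0)

    lat, d, h = map(math.radians, (lat_deg, decl, hour_angle))
    cos_zenith = math.sin(lat) * math.sin(d) + math.cos(lat) * math.cos(d) * math.cos(h)
    zenith = math.degrees(math.acos(max(-1.0, min(1.0, cos_zenith))))

    # Azimuth measured clockwise from north (approximate)
    azimuth = math.degrees(math.atan2(
        math.sin(h),
        math.cos(h) * math.sin(lat) - math.tan(d) * math.cos(lat))) + 180.0
    return zenith, azimuth % 360.0

def sun_relative_to_vehicle(lat_deg, lon_deg, heading_deg, when_utc):
    """Sun direction expressed relative to the vehicle's traveling direction."""
    zenith, azimuth = sun_position(lat_deg, lon_deg, when_utc)
    return zenith, (azimuth - heading_deg) % 360.0
```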
- The in-vehicle camera measures the reflected radiance and irradiance at different reflection angles for a given incident angle.
- BRDF bidirectional reflectance distribution
- the patch array is arranged as an (n ⁇ m) matrix, where n is the number of patches arranged in the horizontal direction and m is the number of patches arranged in the front-rear direction.
- A diffuse patch can be provided on the hood surface of the vehicle to provide a reference for incident radiance.
- the incident radiation can be measured using an optical sensor or a photoelectric sensor. It is also possible to directly estimate the reference of incident radiance from the acquired image by using an image processing method.
- the main controller may be configured to analyze the output of the in-vehicle camera and provide a reference for incident radiance.
- outlier reflection characteristics may be, but are not limited to, lane markers on the road surface or raindrops on the vehicle windshield.
- The present invention further relates to a road surface reflectance classification system for acquiring an image of the road surface in front of a vehicle using an in-vehicle camera, comprising a navigation sensor for obtaining at least the position and traveling direction of the vehicle, a clock for obtaining the current time and date, a camera control unit that acquires an image of at least one patch defined on the road surface at a predetermined distance ahead, an evaluation unit that measures the bidirectional reflectance distribution based on the images acquired by the camera control unit under the assumption of spatial or temporal invariance of the road surface patch, and a display unit that displays an index of the bidirectional reflectance distribution based on the output of the evaluation unit.
- FIGS. 1a and 1b show the distribution of reflected radiance for a perfect mirror surface and for an arbitrary surface, respectively.
- FIG. 2 shows a surface model based on the Oren-Nayar assumptions.
- FIG. 3a shows a conventional BRDF measurement method, and FIG. 3b shows the BRDF measurement method using a vehicle according to the present invention.
- FIG. 4 is a three-dimensional geometric model of the automotive BRDF measuring apparatus according to the present invention.
- FIG. 5 shows a typical view from the in-vehicle camera, including the diffuse patch provided on the vehicle hood to provide a reference incident radiance, together with the input from GPS satellites.
- FIG. 6 is a simplified block diagram of a system according to the present invention.
- FIG. 7 shows one patch observed at different times (a speedometer is required).
- FIG. 8 shows patches defined on the road surface in front of the vehicle observed at the same time (no speedometer is required).
- FIG. 9a shows road surface reflectance classification using an SVM (support vector machine).
- FIG. 9b shows how road surface reflection conditions are continuously evaluated.
- the reflection of an opaque object usually depends on the angular direction of illuminating or observing the surface.
- For a perfect mirror surface, as shown in FIG. 1a, light is reflected only in the specular direction, i.e., θr = θi.
- For a typical surface patch, any combination of diffuse and specular reflection can occur, and light is reflected into all directions of the upper hemisphere, as shown in FIG. 1b. In general, this behavior is described by the BRDF.
- The BRDF of a surface element dA is defined as the ratio of the reflected radiance Lr to the incident irradiance Ei for observer and light source directions given by the zenith angles (θr, θi) and azimuth angles (φr, φi).
- fr(φi, θi, φr, θr) = dLr(φr, θr) / dEi(φi, θi) ・・・・・・ (1)
- A reflection model is usually introduced to obtain a low-parameter representation of the measured BRDF values.
- One powerful physical BRDF model was proposed by Oren and Nayar in Non-Patent Documents 10, 12 and 13. It assumes an isotropic surface composed of a plurality of small facets arranged as V-shaped grooves of constant width, as shown in FIG. 2.
- fr^d,dir includes all the directly reflected components, and fr^d,ms covers the components that are reflected multiple times.
- ⁇ is the surface albedo
- φ = φr − φi is the relative azimuth angle
- θ1 = max(θi, θr) is the maximum zenith angle, and θ2 = min(θi, θr) is the minimum zenith angle
- C1(kw), C2(kw), and C3(kw) are constants that depend only on kw (the surface roughness parameter) and on the positions of the observer and the light source; see Non-Patent Document 13. For clarity, the model equations do not explicitly indicate the dependence on the observer and light source positions θi, θr, and φ.
- θa is the zenith angle of the facet normal N relative to the macroscopic normal M.
- the specular reflection portion can be expressed as follows.
- fr^s(kw, n) = F(n)・P(kw)・G / 4・cos θi・cos θr ・・・・・・ (7)
- F(n) represents the Fresnel reflectance describing the photometric behavior of a light ray at the surface (see Non-Patent Document 5).
- G is an attenuation factor related to masking and shadowing within the facet structure; see Non-Patent Document 13.
- fr = kd・fr^d(kw) + ks・fr^s(kw, n) ・・・・・・ (8)
- where kd and ks are weighting factors for the diffuse and specular reflection terms, n is the refractive index of the corresponding material, and kw is a parameter for the surface roughness.
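- For illustration, a minimal sketch of evaluating the widely known simplified (qualitative) Oren-Nayar diffuse term and the weighted combination of equation (8). This sketch uses the simplified form of the model rather than the full extended model with the C1(kw)-C3(kw) terms used in the embodiment; the function names and the passed-in specular value are illustrative assumptions.

```python
import math

def oren_nayar_diffuse(theta_i, theta_r, phi, sigma, albedo):
    """Simplified (qualitative) Oren-Nayar diffuse BRDF.
    theta_i, theta_r: zenith angles of light source and observer [rad]
    phi:   relative azimuth angle phi_r - phi_i [rad]
    sigma: surface roughness (std. dev. of facet slope angles) [rad]
    albedo: surface albedo rho
    """
    s2 = sigma * sigma
    a = 1.0 - 0.5 * s2 / (s2 + 0.33)
    b = 0.45 * s2 / (s2 + 0.09)
    alpha = max(theta_i, theta_r)   # theta_1 = max(theta_i, theta_r)
    beta = min(theta_i, theta_r)    # theta_2 = min(theta_i, theta_r)
    return (albedo / math.pi) * (a + b * max(0.0, math.cos(phi))
                                 * math.sin(alpha) * math.tan(beta))

def combined_brdf(theta_i, theta_r, phi, sigma, albedo, k_d, k_s, f_spec):
    """Weighted sum of diffuse and specular terms as in equation (8):
    f_r = k_d * f_r^d + k_s * f_r^s, with k_d + k_s = 1."""
    return k_d * oren_nayar_diffuse(theta_i, theta_r, phi, sigma, albedo) + k_s * f_spec
```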
- A robust road surface reflectance classification method depends on BRDF features of high reliability. In general, the necessary information can be obtained from a single image captured by the in-vehicle camera.
- a new framework for road surface reflectance classification based on images is provided.
- the Oren-Nayar-BRDF model is fitted to the acquired data to extract reliable BRDF features.
- These features form a low-dimensional description vector that is used to determine the road surface reflectance classification, for example by using an appropriate algorithm based on an SVM (support vector machine), as shown in FIG. 9a.
- SVM support vector machine
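- A minimal sketch, assuming scikit-learn is available, of how such a low-dimensional description vector could be mapped to the classes {diffuse, specular} with an SVM; the feature values, labels, and function names shown are placeholders, not data or interfaces from the embodiment.

```python
import numpy as np
from sklearn.svm import SVC  # assumption: scikit-learn is available

# Hypothetical training data: each row is a description vector built from the
# fitted Oren-Nayar parameters (kw, kd, n) and the robust reflection features
# (S1, S2, S3); labels are 0 = diffuse (dry), 1 = specular (wet) road surface.
X_train = np.array([
    [0.8, 0.9, 1.50, 0.10, 0.3, 0.2],   # dry asphalt (example values)
    [0.2, 0.3, 1.33, 0.60, 1.8, 0.9],   # wet asphalt (example values)
    # ... more labelled examples ...
])
y_train = np.array([0, 1])

classifier = SVC(kernel="rbf", C=1.0, gamma="scale")
classifier.fit(X_train, y_train)

def classify_road_surface(description_vector):
    """Map a description vector v to a class c in C = {diffuse, specular}."""
    label = classifier.predict(np.asarray(description_vector).reshape(1, -1))[0]
    return "specular (wet)" if label == 1 else "diffuse (dry)"
```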
- FIG. 4 shows a data acquisition configuration for the illustrated embodiment.
- Incident light rays from the sun are projected onto the road surface with a certain incident radiance Ei and a certain incident angle ⁇ i toward a specific patch on the road surface.
- the incident light beam is reflected with a certain reflection angle ⁇ r and a certain reflection radiance Lr.
- azimuth angles ( ⁇ r, ⁇ i) can be considered in addition to zenith angles ( ⁇ r, ⁇ i).
- incident light from the sun having incident radiance Es enters the diffuse reflector R disposed on the engine hood of the vehicle, and the reflected light is captured by the camera with a reflected radiance Ls.
- FIG. 5 shows a typical image obtained with an in-vehicle camera, including a diffuse reflector R that can be used to provide an indicator of incident radiance.
- This index can also be obtained by using a photocell (photoelectric sensor) provided on the roof of the vehicle, an optical sensor incorporated as part of a raindrop detection system, or by estimating it directly from the image using image processing methods. The data obtained from this image are combined with the azimuth information provided by GPS satellites in the system for road surface reflectance classification.
- FIG. 6 shows an example of a system for road surface reflectance classification.
- Information obtained by the in-vehicle camera 1 is sent to the main control unit 3 via the camera control unit 2.
- the main control unit 3 further receives information on the current date and time from the clock 4 and information on the vehicle azimuth and position from the GPS receiver 5 and calculates the current position of the sun relative to the vehicle azimuth.
- the main control unit 3 acquires the road surface reflectance classification by executing an algorithm described below, and supplies the result to the chassis control system or other in-vehicle system.
- Road surface reflectance classification provides a measure or indicator of the presence of moisture on the road surface, thereby providing a measure or indicator of the coefficient of friction of the road surface against the tires of the vehicle.
- According to the conventional BRDF measurement method, the same patch is observed from a plurality of observation positions, as shown in FIG. 3a.
- In the approach according to the present invention, by contrast, the patches on the road surface are assumed to be spatially constant, and the radiances obtained by observing these patches from different angles are captured in a single image, as shown in FIG. 3b.
- The radiance measurement can be realized by evaluating the average grayscale values of 10 × 10 cm patches arranged at 75 × 30 (n × m) points, observed from a bird's-eye-view position, as shown in FIG. 8.
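- A minimal sketch of evaluating per-patch average grayscale values on a bird's-eye-view image; the 75 × 30 grid follows the arrangement mentioned above, while the pixel size of a patch and the prior inverse-perspective warping are assumptions.

```python
import numpy as np

def patch_mean_grayscale(birds_eye_image, n=75, m=30, patch_px=20):
    """Average grayscale value of each 10 x 10 cm road patch.

    birds_eye_image: grayscale image already warped to a bird's-eye view
                     (inverse perspective mapping assumed to be done elsewhere)
    n, m:            number of patches across / along the road (n x m grid)
    patch_px:        hypothetical patch size in pixels of the warped image
    """
    means = np.empty((m, n))
    for row in range(m):          # front-rear direction
        for col in range(n):      # lateral direction
            ys = slice(row * patch_px, (row + 1) * patch_px)
            xs = slice(col * patch_px, (col + 1) * patch_px)
            means[row, col] = birds_eye_image[ys, xs].mean()
    return means
```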
- FIG. 7 shows another configuration for acquiring an image of the road surface in front of the vehicle.
- The same patch defined on the road surface is observed by the in-vehicle camera at successive positions as the vehicle travels.
- Standard image matching techniques, such as optical flow, are used to associate the patch across the acquired images.
- the camera acquires patch images at predetermined time intervals. This time interval may be equal or unequal as long as the camera control unit can associate each acquired image with the zenith angle of the observation axis of the road patch.
- The system comprises a speed sensor 6, as indicated by the phantom line in FIG. 6, whereby the main control unit 3 can match the images so that a particular patch can be associated across subsequent images.
- The method shown in FIG. 7 is based on the assumption that the road patch is (in the short term) time-invariant, and, like the conventional BRDF measurement arrangement shown in FIG. 3a, it provides multiple BRDF measurements of a particular patch from different observer positions.
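- A minimal sketch of how, under a flat-road assumption, the zenith angle of the observation axis toward a single patch could be derived from the speed sensor output for successive image acquisitions (FIG. 7 configuration); the geometry and names are illustrative, not the specific method of the embodiment.

```python
import math

def patch_observation_angles(initial_distance, camera_height, speed, timestamps):
    """Zenith angle of the observation axis for one road patch as the vehicle
    approaches it.

    initial_distance: distance from the camera foot point to the patch at t = 0 [m]
    camera_height:    mounting height of the in-vehicle camera [m]
    speed:            vehicle speed from the speed sensor [m/s], assumed constant
    timestamps:       acquisition times of the successive images [s]
    """
    angles = []
    for t in timestamps:
        d = initial_distance - speed * t        # remaining distance to the patch
        if d <= 0:
            break                               # patch has passed under the vehicle
        angles.append(math.degrees(math.atan2(d, camera_height)))  # zenith angle
    return angles
```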
- During the image acquisition process, each camera pixel integrates the radiant flux Φ incident on its area A over a predetermined exposure time Te.
- Assuming that the camera sensor has a linear characteristic, neglecting the noise generated during image acquisition, and using equation (2), the relationship between the grayscale value of a camera pixel and the incident irradiance E is obtained (equation (9)).
- According to Non-Patent Document 6, the relationship between the reflected radiance Lr of the road surface patch and the irradiance E incident on the camera sensor is given by equation (10).
- The angle appearing in equation (10) defines the relative geometric relationship between the camera and the reflector.
- The proportionality constants of both equations depend only on the camera characteristics. Therefore, a point-wise relationship between the radiance Lr reflected from the road surface and the grayscale value g of the camera pixel can be obtained from equations (9) and (10).
- the incident radiation Ei on the road surface can be estimated by determining the incident radiation Es on the diffuse reflector R provided on the hood of the vehicle.
- The calculation is performed according to equations (9) to (11) by taking the average ḡ of all grayscale values exceeding a certain threshold in the grayscale histogram of the reflector.
- The incident irradiance Ei on the road surface is then obtained by equation (12): Ei = Es = K・(ḡ/Te).
- the proportionality constant K can be determined by giving the camera and reflector characteristics and their relative geometric positions. Note that the proportionality constant K is common to all images acquired with the same experimental setup.
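- A minimal sketch of estimating Ei from the reflector region according to equation (12); the default thresholding rule is an assumption, and K must be calibrated for the actual camera and reflector setup.

```python
import numpy as np

def incident_irradiance(reflector_pixels, exposure_time, k_const, threshold=None):
    """Estimate the incident irradiance Ei on the road surface from the diffuse
    reflector on the hood, following equation (12): Ei = K * (g_bar / Te).

    reflector_pixels: grayscale values of the image region covering the reflector
    exposure_time:    camera exposure time Te
    k_const:          proportionality constant K (fixed for a given setup)
    threshold:        only grayscale values above this value enter the average g_bar;
                      the default below is a simple data-driven assumption
    """
    g = np.asarray(reflector_pixels, dtype=float).ravel()
    if threshold is None:
        threshold = np.percentile(g, 50)   # assumed default, not from the patent
    g_bar = g[g > threshold].mean()
    return k_const * g_bar / exposure_time
```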
- A meaningful BRDF evaluation of the road surface with the fitted BRDF model is possible only for a delimited region in front of the vehicle, as shown in FIG. 8. This corresponds to a range of 77° < θr < 87° for the zenith angle and -20° < φr < +20° for the azimuth angle. Because the field of view is limited in this way, the positions of the light source and the observer vary with the time of day and the heading of the vehicle, so different traffic situations cannot be directly compared by BRDF measurements. Therefore, in order to perform a model-based extrapolation of the entire BRDF of the road surface, the extended Oren-Nayar model is fitted; the unknown Oren-Nayar parameters are estimated by standard nonlinear Levenberg-Marquardt optimization (see Non-Patent Document 8) with normalized weighting factors (kd + ks = 1) (see Non-Patent Document 2).
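- A minimal sketch, assuming SciPy is available, of fitting the unknown Oren-Nayar parameters to measured BRDF samples with Levenberg-Marquardt optimization, enforcing kd + ks = 1 by construction; the initial guess and interfaces are illustrative assumptions, not the exact procedure of the embodiment.

```python
import numpy as np
from scipy.optimize import least_squares  # assumption: SciPy is available

def fit_oren_nayar(observations, model):
    """Fit the unknown Oren-Nayar parameters to measured BRDF samples.

    observations: list of tuples (theta_i, theta_r, phi, f_r_measured) taken
                  from the restricted field of view in front of the vehicle
                  (must contain at least as many samples as parameters)
    model:        callable model(theta_i, theta_r, phi, kw, kd, n) returning the
                  modelled BRDF value; ks is taken as 1 - kd so that kd + ks = 1
    Returns the fitted parameters (kw, kd, n).
    """
    def residuals(p):
        kw, kd, n = p
        return [model(ti, tr, ph, kw, kd, n) - f for ti, tr, ph, f in observations]

    # Initial guess is assumed, not taken from the patent; 'lm' selects
    # the Levenberg-Marquardt algorithm.
    result = least_squares(residuals, x0=[0.3, 0.7, 1.5], method="lm")
    return result.x
```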
- The reflected surface radiance is determined from the intensity signal values of the camera sensor.
- Large changes in the road albedo, such as those caused by bright lane markers on dark concrete surfaces, therefore break the assumption of road patch invariance and give rise to outliers in the BRDF measurements.
- the effects of any atmospheric phenomenon between the surface patch and the camera sensor such as raindrops on the windshield, can disturb the measurement process. Since this method improves DAS particularly in rough weather, insensitivity to such disturbances is extremely important.
- Disturbances caused by raindrops can be detected and handled using the methods of Non-Patent Documents 4 and 14 and Patent Documents 1 and 2.
- The estimated Oren-Nayar models and their parameters can distinguish different road reflection conditions only in the case of well-conditioned images.
- Actual traffic situations are accompanied by measurement noise, especially in stormy weather, and estimating, from a very limited field of view, a BRDF model that describes the reflectivity of a material for every observer and light source position is a very difficult problem. According to our experiments, it is for this reason not possible to accurately classify road surface reflection conditions using only the estimated Oren-Nayar model parameters.
- The robust CIE reflection parameters S1 and S2, based on Non-Patent Documents 7 and 1, are derived exclusively from the observer position of θr ≈ 82°. These parameters evaluate the entire BRDF function at two feature points.
- S2 determines the closed BRDF volume Q0 for the reflectivity in the normal direction.
- Data-based machine learning methods have been proposed to find the function f from examples (see Non-Patent Document 3), using various techniques such as k-Nearest-Neighbor, Decision Trees, Neural Networks, and Support Vector Machines (SVM). Since the SVM is simple, fast, and effective, it is adopted as the learning and classification method in this embodiment.
- SVM Support Vector Machines
- the BRDF model was visualized by a spherical coordinate system using a so-called q-body representation. See Non-Patent Document 7. If the light source is fixed, a set of vectors given by zenith and azimuth and their corresponding BRDF values as norms are used to synthesize an envelope that describes the surface reflection characteristics.
- a system for road surface reflectance classification based on estimating BRDF of road surface is proposed.
- The Oren-Nayar reflectance model is used to enable the extraction of meaningful reflection features with high accuracy and high robustness, with only low errors, even under stormy conditions. Since only a single grayscale image acquired by an uncalibrated in-vehicle camera and GPS information are used, the proposed framework is applicable to realistic scenarios and can be applied to any driver assistance system.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- Biochemistry (AREA)
- General Health & Medical Sciences (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Analytical Chemistry (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Automation & Control Theory (AREA)
- Investigating Or Analysing Materials By Optical Means (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
fr(φi, θi, φr, θr) = dLr(φr, θr) / dEi(φi, θi) ・・・・・・ (1)
The incident irradiance Ei represents the radiant flux Φ per unit area A:
Ei = dΦ/ dA ・・・・・・ (2)
The reflected radiance Lr is given by the following equation.
Lr = d2Φ/ (dA ・cos θr・dΩ) ・・・・・・ (3)
This represents the radiant flux Φ per unit solid angle Ω, measured on a plane orthogonal to the direction of propagation.
fr s(kw, n) = F(n)・P(kw)・G / 4・cos θi・cos θr ・・・・・・ (7)
Here, F(n) denotes the Fresnel reflectance describing the photometric behavior of the light ray at the surface (see Non-Patent Document 5), and G is an attenuation factor related to masking and shadowing within the facet structure. See Non-Patent Document 13.
fr = kdfr d(kw) + ksfr s(kw, n) ・・・・・・ (8)
Here, kd and ks are weighting factors for the diffuse and specular reflection terms, n is the refractive index of the corresponding material, and kw is a parameter for the surface roughness.
According to the conventional BRDF measurement method, the same patch is observed from a plurality of observation positions, as shown in FIG. 3a. In the approach according to the present invention shown in FIG. 8, by contrast, the patches on the road surface are assumed to be spatially constant, and the radiances obtained by observing these patches from different angles are captured in a single image, as shown in FIG. 3b. The radiance measurement can be realized by evaluating the average grayscale values of 10 × 10 cm patches arranged at 75 × 30 (n × m) points, observed from a bird's-eye-view position as shown in FIG. 8.
During the image acquisition process, each camera pixel integrates the radiant flux Φ incident on its area A over a predetermined exposure time Te. Assuming that the camera sensor has a linear characteristic, neglecting the noise generated during image acquisition, and using equation (2), the relationship between the grayscale value of a camera pixel and the incident irradiance E is obtained.
Assuming that the sun is the only light source to be considered, this relationship can be computed from the date-and-time information and a formula expressing the position of the sun at each instant. See Non-Patent Document 16.
To determine the irradiance Ei on the road surface, the solar irradiance in the surroundings of the vehicle is assumed to be constant, i.e., spatial invariance of the road surface patches is assumed. The incident irradiance Ei on the road surface can then be estimated by determining the incident irradiance Es on the diffuse reflector R provided on the hood of the vehicle, as shown in FIG. 4. The BRDF equation of a perfectly diffuse reflector is given by the following expression.
Ls = (ρ/π) Es ・・・・・・ (11)
Ei = Es = K・(ḡ/Te) ・・・・・・ (12)
Here, the proportionality constant K can be determined from the characteristics of the camera and the reflector and their relative geometric positions. Note that the proportionality constant K is common to all images acquired with the same experimental setup.
A meaningful BRDF evaluation of the road surface is possible only for a delimited region in front of the vehicle, as shown in FIG. 8. This corresponds to a range of 77° < θr < 87° for the zenith angle and -20° < φr < +20° for the azimuth angle. Because the field of view is limited in this way, the positions of the light source and the observer vary with the time of day and the heading of the vehicle, so different traffic situations cannot be directly compared by BRDF measurements. Therefore, in order to perform a model-based extrapolation of the entire BRDF of the road surface, the extended Oren-Nayar model is fitted. The unknown Oren-Nayar parameters were estimated by standard nonlinear Levenberg-Marquardt optimization (see Non-Patent Document 8) using normalized weighting factors (kd + ks = 1) (see Non-Patent Document 2).
The reflected surface radiance is determined from the intensity signal values of the camera sensor. Therefore, large changes in the road albedo, such as those caused by bright lane markers on dark concrete surfaces, break the assumption of road patch invariance and give rise to outliers in the BRDF measurements. Moreover, any atmospheric phenomenon between the surface patch and the camera sensor, such as raindrops on the windshield, can disturb the measurement process. Since this method is intended to improve DAS particularly in rough weather, insensitivity to such disturbances is extremely important.
A selected combination of the Oren-Nayar model parameters kw, kd, n and the reflection features S1, S2, S3 is used to generate a description vector v, and this vector is used to determine the road surface reflectance classification. In this case, a binary classification is applied with the classes C = {diffuse surface, specular surface}. The classification problem thus reduces to finding a function f that maps v from the description space D to the class set C (c = f(v), where f: D → C).
A system for road surface reflectance classification based on estimating the BRDF of the road surface is proposed. The Oren-Nayar reflectance model is used so that meaningful reflection features can be extracted accurately and robustly, with only low errors, even under stormy conditions. Since only a single grayscale image acquired by an uncalibrated in-vehicle camera and GPS information are used, the proposed framework is applicable to realistic scenarios and can be applied to any driver assistance system.
CIE 132-1999. Design methods for lighting of roads. Technical report, Commission Internationale de L'Eclairage, 1999. R.L. Cook and K.E. Torrance. A reflectance model for computer graphics. ACM Transactions on Graphics, 1(1):7-24, 1981. R.O. Duda, P.E. Hart, and D.G. Stork. Pattern classification. Wiley, 2. ed. edition, 2001. J. Halimeh and M. Roser. Raindrop detection on car windshields using geometric-photometric environment construction and intensity-based correlation. In IEEE Intelligent Vehicle Symposium (IV '09), Xi'an, China, 2009. E. Hecht. Optics. Addison-Wesley Pub. Co., 4. ed. edition, 2002. B. Jahne. Digital Image Processing. Springer, 6th revised and extended edition edition, 2005. W. Leins and P. von Berg. Reflexionsverhalten von Fahrbahnbelagen. Forschungsberichte des Landes Nordrhein-Westfalen ; 2752. Westdeutscher Verlag, Opladen, 1978. D. Marquardt. An algorithm for least-squares estimation of nonlinear parameters. SIAM Journal of Applied Mathematics, 11:431-441, 1963. G. Meister. Bidirectional Reflectance of Urban Surfaces. PhD thesis, Universitat Hamburg, 2000. S.K. Nayar and M. Oren. Visual appearance of matte surfaces. Science, 267:1153-1156, 1995. F.E. Nicodemus, J.C. Richmond, J.J. Hsia, I.W. Ginsberg, and T. Limperis. Geometrical considerations and nomenclature for reflectance. Technical report, US Department of Commerce, National Bureau of Standards, Washington, D.C., 1977. M. Oren and S.K. Nayar. Seeing beyond lambert's law. In IEEE European Conference on Computer Vision (ECCV '94), pages 269-280. Springer, 1994. M. Oren and S.K. Nayar. Generalization of the lambertian model and implications for machine vision. International Journal of Computer Vision, 14:227-251, 1995. M. Roser and A. Geiger. Video-based raindrop detection for improved image registration. In IEEE Workshop on Video-Oriented Object and Event Classification (in conjunction with ICCV '09), 2009. K.E. Torrance and E.M. Sparrow. Theory for off-specular reflection from roughened surfaces. Journal of the Optical Society of America, 57(9):1105-1114, September 1967. T.C. van Flandern and K.F. Pulkkinen. Low-precision formulae for planetary positions. The Astrophysical Journal Supplement Series, 41:391-411, November 1979. Bram van Ginneken, Marigo Stavridi, and Jan J. Koenderink. Diffuse and specular reflectance from rough surfaces. Appl. Opt., 37(1):130-139, 1998.
Claims (9)
- 1. A road surface reflectance classification system for acquiring images of the road surface in front of a vehicle using an in-vehicle camera, comprising: a navigation sensor for obtaining at least the position and traveling direction of the vehicle; a clock for obtaining the current time and date; a sun position calculation unit for determining the position of the sun relative to the position and traveling direction of the vehicle based on the outputs of the navigation sensor and the clock; a camera control unit for acquiring images of a plurality of patches defined on the road surface in front of the vehicle as an array of patches arranged in the longitudinal direction; and a main control unit for measuring the bidirectional reflectance distribution (BRDF) based on the images acquired by the camera control unit under the assumption of spatial invariance of the road surface patches.
- 2. The road surface reflectance classification system according to claim 1, further comprising a diffuse patch provided on the hood surface of the vehicle to provide a reference for incident irradiance.
- 3. The road surface reflectance classification system according to claim 1, further comprising an in-vehicle optical sensor for providing a reference for incident irradiance.
- 4. The road surface reflectance classification system according to claim 1, wherein the main control unit is configured to analyze the output of the in-vehicle camera to provide a reference for incident irradiance.
- 5. The road surface reflectance classification system according to claim 1, wherein the patch array is arranged as an (n × m) matrix, n being the number of patches arranged in the lateral direction and m being the number of patches arranged in the longitudinal direction.
- 6. The road surface reflectance classification system according to claim 1, wherein, when measuring the bidirectional reflectance distribution, the main control unit removes known outlier reflection features.
- 7. The road surface reflectance classification system according to claim 1, wherein the navigation sensor comprises a satellite-based GPS system.
- 8. A road surface reflectance classification system for acquiring images of the road surface in front of a vehicle using an in-vehicle camera, comprising: a navigation sensor for obtaining at least the position and traveling direction of the vehicle; a clock for obtaining the current time and date; a speed sensor for detecting the traveling speed of the vehicle; a sun position calculation unit for determining the position of the sun relative to the position and traveling direction of the vehicle based on the outputs of the navigation sensor and the clock; a camera control unit for acquiring images of a patch defined on the road surface at a predetermined distance in front of the vehicle; and a main control unit for measuring the bidirectional reflectance distribution based on a plurality of images acquired by the camera control unit at predetermined time intervals under the assumption of temporal invariance of the road surface patch.
- 9. A road surface reflectance classification system for acquiring images of the road surface in front of a vehicle using an in-vehicle camera, comprising: a navigation sensor for obtaining at least the position and traveling direction of the vehicle; a clock for obtaining the current time and date; a speed sensor for detecting the traveling speed of the vehicle; a sun position calculation unit for determining the position of the sun relative to the position and traveling direction of the vehicle based on the outputs of the navigation sensor and the clock; a camera control unit for acquiring an image of at least one patch defined on the road surface at a predetermined distance in front of the vehicle; an evaluation unit for measuring the bidirectional reflectance distribution based on the images acquired by the camera control unit under the assumption of spatial or temporal invariance of the road surface patch; and a display unit for displaying an index of the bidirectional reflectance distribution based on the output of the evaluation unit.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE112010005669.2T DE112010005669B4 (de) | 2010-06-18 | 2010-06-18 | System zur Straßenoberflächenreflektivitätsklassifizierung |
| PCT/JP2010/004092 WO2011158306A1 (ja) | 2010-06-18 | 2010-06-18 | 路面反射率分類のためのシステム |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2010/004092 WO2011158306A1 (ja) | 2010-06-18 | 2010-06-18 | 路面反射率分類のためのシステム |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2011158306A1 true WO2011158306A1 (ja) | 2011-12-22 |
Family
ID=45347734
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2010/004092 Ceased WO2011158306A1 (ja) | 2010-06-18 | 2010-06-18 | 路面反射率分類のためのシステム |
Country Status (2)
| Country | Link |
|---|---|
| DE (1) | DE112010005669B4 (ja) |
| WO (1) | WO2011158306A1 (ja) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2014124895A1 (de) * | 2013-02-12 | 2014-08-21 | Continental Teves Ag & Co. Ohg | Verfahren und strahlensensormodul zur vorausschauenden strassenzustandsbestimmung in einem fahrzeug |
| WO2014189059A1 (ja) * | 2013-05-20 | 2014-11-27 | 株式会社デンソー | 路面状態推定装置 |
| JP2020106443A (ja) * | 2018-12-28 | 2020-07-09 | スタンレー電気株式会社 | 路面状態検知システム及び路面状態検知方法 |
| CN112597666A (zh) * | 2021-01-08 | 2021-04-02 | 北京深睿博联科技有限责任公司 | 一种基于表面材质建模的路面状态分析方法及装置 |
| US20220180643A1 (en) * | 2019-03-22 | 2022-06-09 | Vergence Automation, Inc. | Vectorization for object detection, recognition, and assessment for vehicle vision systems |
| CN115346129A (zh) * | 2022-07-05 | 2022-11-15 | 中国空间技术研究院 | 一种brdf模型参数的确定方法及系统 |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102013226631A1 (de) * | 2013-12-19 | 2015-06-25 | Continental Teves Ag & Co. Ohg | Verfahren und Vorrichtung zur Ermittlung von lokalen Wetterverhältnissen und eines lokalen Fahrbahnzustands |
| DE102014100416A1 (de) | 2014-01-15 | 2015-07-16 | Hella Kgaa Hueck & Co. | Vorrichtung und Verfahren zur dynamischen Messung der Reflexionseigenschaften von Fahrbahnoberflächen |
| DE102015208429A1 (de) | 2015-05-06 | 2016-11-10 | Continental Teves Ag & Co. Ohg | Verfahren und Vorrichtung zur Erkennung und Bewertung von Fahrbahnreflexionen |
| DE102016009022A1 (de) | 2016-07-23 | 2017-02-02 | Daimler Ag | Verfahren zur Erkennung von Nässe auf einer Fahrbahn |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH1089974A (ja) * | 1996-09-17 | 1998-04-10 | Toyota Motor Corp | 光学式車両位置検出装置 |
| JP2004246798A (ja) * | 2003-02-17 | 2004-09-02 | Nissan Motor Co Ltd | 車線検出装置 |
| WO2005075959A1 (ja) * | 2004-02-10 | 2005-08-18 | Nihon University | 摩擦係数推定方法及び装置 |
| JP2010036757A (ja) * | 2008-08-06 | 2010-02-18 | Fuji Heavy Ind Ltd | 車線逸脱防止制御装置 |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3000899A1 (en) | 2002-12-04 | 2016-03-30 | Applied Biosystems, LLC | Multiplex amplification of polynucleotides |
| JP5384807B2 (ja) | 2007-06-20 | 2014-01-08 | 株式会社岡村製作所 | 椅子における背板へのカバー部材取付構造 |
| JP5216010B2 (ja) * | 2009-01-20 | 2013-06-19 | 本田技研工業株式会社 | ウインドシールド上の雨滴を同定するための方法及び装置 |
- 2010-06-18: WO PCT/JP2010/004092 patent/WO2011158306A1 (not_active Ceased)
- 2010-06-18: DE DE112010005669.2T patent/DE112010005669B4/de (not_active Expired - Fee Related)
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH1089974A (ja) * | 1996-09-17 | 1998-04-10 | Toyota Motor Corp | 光学式車両位置検出装置 |
| JP2004246798A (ja) * | 2003-02-17 | 2004-09-02 | Nissan Motor Co Ltd | 車線検出装置 |
| WO2005075959A1 (ja) * | 2004-02-10 | 2005-08-18 | Nihon University | 摩擦係数推定方法及び装置 |
| JP2010036757A (ja) * | 2008-08-06 | 2010-02-18 | Fuji Heavy Ind Ltd | 車線逸脱防止制御装置 |
Cited By (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2014124895A1 (de) * | 2013-02-12 | 2014-08-21 | Continental Teves Ag & Co. Ohg | Verfahren und strahlensensormodul zur vorausschauenden strassenzustandsbestimmung in einem fahrzeug |
| CN104995070A (zh) * | 2013-02-12 | 2015-10-21 | 大陆-特韦斯贸易合伙股份公司及两合公司 | 在车辆中前瞻性地确定道路状态的方法和光束传感器模块 |
| WO2014189059A1 (ja) * | 2013-05-20 | 2014-11-27 | 株式会社デンソー | 路面状態推定装置 |
| JP2014228300A (ja) * | 2013-05-20 | 2014-12-08 | 国立大学法人 東京大学 | 路面状態推定装置 |
| JP2020106443A (ja) * | 2018-12-28 | 2020-07-09 | スタンレー電気株式会社 | 路面状態検知システム及び路面状態検知方法 |
| JP7273505B2 (ja) | 2018-12-28 | 2023-05-15 | スタンレー電気株式会社 | 路面状態検知システム及び路面状態検知方法 |
| US20220180643A1 (en) * | 2019-03-22 | 2022-06-09 | Vergence Automation, Inc. | Vectorization for object detection, recognition, and assessment for vehicle vision systems |
| CN112597666A (zh) * | 2021-01-08 | 2021-04-02 | 北京深睿博联科技有限责任公司 | 一种基于表面材质建模的路面状态分析方法及装置 |
| CN112597666B (zh) * | 2021-01-08 | 2022-05-24 | 北京深睿博联科技有限责任公司 | 一种基于表面材质建模的路面状态分析方法及装置 |
| CN115346129A (zh) * | 2022-07-05 | 2022-11-15 | 中国空间技术研究院 | 一种brdf模型参数的确定方法及系统 |
Also Published As
| Publication number | Publication date |
|---|---|
| DE112010005669B4 (de) | 2017-10-26 |
| DE112010005669T5 (de) | 2013-05-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2011158306A1 (ja) | 路面反射率分類のためのシステム | |
| US10949684B2 (en) | Vehicle image verification | |
| Hautiere et al. | Automatic fog detection and estimation of visibility distance through use of an onboard camera | |
| CN105049784B (zh) | 用于基于图像的视距估计的方法和设备 | |
| Tarel et al. | Vision enhancement in homogeneous and heterogeneous fog | |
| Tatoglu et al. | Point cloud segmentation with LIDAR reflection intensity behavior | |
| Negru et al. | Image based fog detection and visibility estimation for driving assistance systems | |
| Hautière et al. | Real-time disparity contrast combination for onboard estimation of the visibility distance | |
| Ogawa et al. | Pedestrian detection and tracking using in-vehicle lidar for automotive application | |
| CN102314600B (zh) | 用于畅通路径检测的去除由基于车辆的相机捕获的图像中的阴影 | |
| US10735716B2 (en) | Vehicle sensor calibration | |
| KR101364727B1 (ko) | 촬영된 영상의 처리를 이용한 안개 감지 방법 및 장치 | |
| JP2014215877A (ja) | 物体検出装置 | |
| CA3106398A1 (en) | Method and device for recognising and analysing surface defects in three-dimensional objects having a reflective surface, in particular motor vehicle bodies | |
| JP7448484B2 (ja) | カメラ内部パラメータのオンライン評価 | |
| WO2019163315A1 (ja) | 情報処理装置、撮像装置、及び撮像システム | |
| Hautiere et al. | Contrast restoration of foggy images through use of an onboard camera | |
| Duthon et al. | Methodology used to evaluate computer vision algorithms in adverse weather conditions | |
| US9727792B2 (en) | Method and device for tracking-based visibility range estimation | |
| Morden et al. | Driving in the rain: a survey toward visibility estimation through windshields | |
| JP7648669B2 (ja) | システム | |
| WO2018165027A1 (en) | Polarization-based detection and mapping method and system | |
| CN119693908B (zh) | 一种自动驾驶汽车避障控制方法、系统、汽车及存储介质 | |
| JP7288460B2 (ja) | 車載用物体識別システム、自動車、車両用灯具、分類器の学習方法、演算処理装置 | |
| Negru et al. | Fog assistance on smart mobile devices |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10853188 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 112010005669 Country of ref document: DE Ref document number: 1120100056692 Country of ref document: DE |
|
| NENP | Non-entry into the national phase |
Ref country code: JP |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 10853188 Country of ref document: EP Kind code of ref document: A1 |