
WO2018212284A1 - Measuring device, measuring method, and program - Google Patents

Measuring device, measuring method, and program

Info

Publication number
WO2018212284A1
WO2018212284A1 (PCT/JP2018/019140)
Authority
WO
WIPO (PCT)
Prior art keywords
white line
unit
vehicle
self
predetermined range
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/019140
Other languages
English (en)
Japanese (ja)
Inventor
多史 藤谷
岩井 智昭
加藤 正浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pioneer Corp
Original Assignee
Pioneer Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corp filed Critical Pioneer Corp
Priority to JP2019518867A priority Critical patent/JPWO2018212284A1/ja
Publication of WO2018212284A1 publication Critical patent/WO2018212284A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06: Systems determining position data of a target
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • the present invention relates to a technique for estimating the position of a moving body based on the position of a feature.
  • Patent Document 1 describes an example of a method for estimating a vehicle position using a feature position detected using LiDAR and a feature position of map information.
  • Patent Document 2 discloses a technique for transmitting an electromagnetic wave to a road surface and detecting a white line based on the reflectance.
  • the number of data points that can be measured by LiDAR on a white line varies depending on the type of white line (continuous line, broken line, etc.) and on paint deterioration. For this reason, when the vehicle position is estimated using the white line, the detection accuracy of the white line changes depending on whether the number of LiDAR data points used to detect the white line is small or large, and as a result the accuracy of the vehicle position estimation changes as well.
  • An object of the present invention is to appropriately adjust the range in which a white line is detected according to the situation, and thereby to prevent a decrease in the accuracy of vehicle position estimation.
  • the invention according to claim 1 is a measuring apparatus comprising: an acquisition unit that acquires output data from a sensor unit for detecting surrounding road surface lines; a determination unit that determines a predetermined range based on the self-position, position information of the road surface line, and the moving speed of the self-position; an extraction unit that extracts data corresponding to a detection result within the predetermined range from the output data; and a processing unit that performs predetermined processing based on the extracted data.
  • the invention according to claim 8 is a measuring method executed by a measuring device, comprising: an acquisition step of acquiring output data from a sensor unit for detecting surrounding road surface lines; a determination step of determining a predetermined range based on the self-position, position information of the road surface line, and the moving speed of the self-position; an extraction step of extracting data corresponding to a detection result within the predetermined range from the output data; and a processing step of performing predetermined processing based on the extracted data.
  • the invention according to claim 9 is a program executed by a measuring apparatus including a computer, the program causing the computer to function as: an acquisition unit that acquires output data from a sensor unit for detecting surrounding road surface lines; a determination unit that determines a predetermined range based on the self-position, position information of the road surface line, and the moving speed of the self-position; an extraction unit that extracts data corresponding to a detection result within the predetermined range from the output data; and a processing unit that performs predetermined processing based on the extracted data.
  • the measuring device includes an acquisition unit that acquires output data from a sensor unit for detecting surrounding road surface lines; a determination unit that determines a predetermined range based on the self-position, position information of the road surface line, and the moving speed of the self-position; an extraction unit that extracts data corresponding to a detection result within the predetermined range from the output data; and a processing unit that performs predetermined processing based on the extracted data.
  • the measuring device acquires output data from a sensor unit for detecting surrounding road surface lines, and determines a predetermined range based on the self-position, position information of the road surface line, and the moving speed of the self-position. Then, data corresponding to the detection result within the predetermined range is extracted from the output data, and predetermined processing is performed based on the extracted data. Thereby, the predetermined range for extracting data can be set appropriately according to the moving speed of the self-position.
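Read as software, this describes a small four-stage pipeline. The following is a minimal sketch of that structure in Python; the class, method, and parameter names are illustrative and are not taken from the patent:

```python
class MeasuringDevice:
    """Skeleton of the claimed structure: acquisition, determination,
    extraction, and processing units as four pipeline stages."""

    def __init__(self, sensor_unit, line_positions):
        self.sensor_unit = sensor_unit        # e.g. a LiDAR wrapper
        self.line_positions = line_positions  # road surface line position info

    def determine_range(self, self_position, speed):
        # Determination unit: a box around the predicted line position,
        # shifted according to the moving speed (detailed in FIGS. 2-5 below).
        raise NotImplementedError

    def process(self, extracted):
        # Processing unit: e.g. estimate the device position from the data.
        raise NotImplementedError

    def run(self, self_position, speed):
        output_data = self.sensor_unit.output()           # acquisition unit
        rng = self.determine_range(self_position, speed)  # determination unit
        extracted = [d for d in output_data
                     if rng.contains(d)]                  # extraction unit
        return self.process(extracted)                    # processing unit
```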
  • the "road surface line" in the present specification refers to a marking line to be measured, such as a white line or a yellow line, and to a linear road marking such as a stop line or a pedestrian crossing.
  • the determination unit determines the predetermined range based on the self-position and position information of the solid line portion of the road surface line, and shifts the determined predetermined range in the moving direction of the self-position according to the moving speed. In this aspect, the predetermined range is shifted according to the moving speed.
  • the measuring apparatus is mounted on a moving body, and the determination unit determines a plurality of predetermined ranges based on the position of the moving body and shifts each of the predetermined ranges by a shift amount corresponding to the scan order of that predetermined range.
  • the plurality of predetermined ranges are shifted according to the scanning order by the sensor unit.
  • the determination unit applies a larger shift amount to a predetermined range that comes later in the scanning order of the sensor unit.
  • the determination unit decreases the shift amount as the scanning speed of the sensor unit increases.
  • the determination unit sets the predetermined ranges in four locations, right front, right rear, left front, and left rear, based on the position of the moving body.
  • the processing unit detects the position of the road surface line and performs a process of estimating the position of the measuring device based on the position of the road surface line.
  • the measuring method executed by the measuring device includes an acquisition step of acquiring output data from a sensor unit for detecting surrounding road surface lines; a determination step of determining a predetermined range based on the self-position, position information of the road surface line, and the moving speed of the self-position; an extraction step of extracting data corresponding to a detection result within the predetermined range from the output data; and a processing step of performing predetermined processing based on the extracted data.
  • a predetermined range for extracting data can be appropriately set according to the moving speed of the self-position.
  • the program executed by a measuring apparatus including a computer causes the computer to function as an acquisition unit that acquires output data from a sensor unit for detecting surrounding road surface lines; a determination unit that determines a predetermined range based on the self-position, position information of the road surface line, and the moving speed of the self-position; an extraction unit that extracts data corresponding to a detection result within the predetermined range from the output data; and a processing unit that performs predetermined processing based on the extracted data.
  • the above measuring apparatus can be realized by executing this program on a computer. The program can be stored in a storage medium and handled in that form.
  • FIG. 1 is a diagram illustrating a white line extraction method.
  • White line extraction refers to detecting a white line painted on a road surface and calculating a predetermined position, for example, a center position.
  • the vehicle 5 exists in the map coordinate system (X_m, Y_m), and the vehicle coordinate system (X_v, Y_v) is defined based on the position of the vehicle 5. Specifically, the traveling direction of the vehicle 5 is taken as the X_v axis of the vehicle coordinate system, and the direction perpendicular to it as the Y_v axis.
  • on the left and right sides of the vehicle 5, there are white lines serving as lane boundary lines.
  • the position of the white line in the map coordinate system, that is, the white line map position, is included in the advanced map managed by the server or the like, and is acquired from the server or the like.
  • white line data is stored in the advanced map as a coordinate point sequence.
  • the LiDAR mounted on the vehicle 5 measures scan data along the scan line 2.
  • the scan line 2 indicates a trajectory of scanning by LiDAR.
  • the coordinates of the points constituting the white line WL1 on the left side of the vehicle 5, that is, the white line map position WLMP1, are (mx_m1, my_m1), and the coordinates of the points constituting the white line WL2 on the right side of the vehicle 5, that is, the white line map position WLMP2, are (mx_m2, my_m2).
  • the predicted host vehicle position PVP in the map coordinate system is given by (x'_m, y'_m), and the predicted host vehicle azimuth angle in the map coordinate system is given by ψ'_m.
  • the white line predicted position WLPP (l'x_v, l'y_v), indicating the predicted position of the white line in the vehicle coordinate system, is calculated from the white line map position WLMP (mx_m, my_m), the predicted host vehicle position PVP (x'_m, y'_m), and the predicted host vehicle azimuth angle ψ'_m by equation (1):

$$\begin{pmatrix} l'x_v \\ l'y_v \end{pmatrix} = \begin{pmatrix} \cos \psi'_m & \sin \psi'_m \\ -\sin \psi'_m & \cos \psi'_m \end{pmatrix} \begin{pmatrix} mx_m - x'_m \\ my_m - y'_m \end{pmatrix} \tag{1}$$
  • the white line predicted position WLPP1 (l'x_v1, l'y_v1) is obtained for the white line WL1, and the white line predicted position WLPP2 (l'x_v2, l'y_v2) for the white line WL2, by equation (1). Thus, for each of the white lines WL1 and WL2, a plurality of white line predicted positions WLPP1 and WLPP2 corresponding to their point sequences are obtained.
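As a concrete illustration of equation (1), the following Python sketch converts one white line map position into the vehicle coordinate system; the function and argument names are illustrative:

```python
import math

def white_line_predicted_position(wlmp, pvp, psi_m):
    """Equation (1): map the white line map position WLMP (mx_m, my_m)
    into the vehicle frame, given the predicted host vehicle position
    PVP (x'_m, y'_m) and predicted azimuth angle psi'_m (radians)."""
    mx_m, my_m = wlmp
    x_m, y_m = pvp
    dx, dy = mx_m - x_m, my_m - y_m
    # Rotate the map-frame offset by -psi'_m so that the X_v axis points
    # along the traveling direction of the vehicle.
    lx_v = math.cos(psi_m) * dx + math.sin(psi_m) * dy
    ly_v = -math.sin(psi_m) * dx + math.cos(psi_m) * dy
    return lx_v, ly_v  # white line predicted position WLPP (l'x_v, l'y_v)
```

Applying this to every point of the map point sequences of WL1 and WL2 yields the sets WLPP1 and WLPP2 used below.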
  • the white line predicted range WLPR is determined based on the white line predicted position WLPP.
  • the white line prediction range WLPR indicates a range in which a white line is considered to exist on the basis of the predicted vehicle position PVP.
  • the white line prediction range WLPR is set at up to four locations around the vehicle 5: right front, right rear, left front, and left rear.
  • FIG. 2 shows a method for determining the white line prediction range WLPR.
  • first, a forward reference point (α_v, 0) is set at an arbitrary position in front of the vehicle 5, at a distance α_v ahead. Then, based on the forward reference point (α_v, 0) and the white line predicted positions WLPP, the white line predicted position WLPP closest to the forward reference point is searched for.
  • for the white line WL1, the distance D1 between the forward reference point (α_v, 0) and each of the plurality of white line predicted positions WLPP1 (l'x_v1, l'y_v1) constituting the white line WL1 is calculated by equation (2), and the white line predicted position WLPP1 at which D1 takes its minimum value is set as the prediction range reference point Pref1:

$$D_1 = \sqrt{(l'x_{v1} - \alpha_v)^2 + (l'y_{v1})^2} \tag{2}$$
  • for the white line WL2, the distance D2 between the forward reference point (α_v, 0) and each of the plurality of white line predicted positions WLPP2 (l'x_v2, l'y_v2) constituting the white line WL2 is calculated by equation (3), and the white line predicted position WLPP2 at which D2 takes its minimum value is set as the prediction range reference point Pref2:

$$D_2 = \sqrt{(l'x_{v2} - \alpha_v)^2 + (l'y_{v2})^2} \tag{3}$$
  • an arbitrary range based on the prediction range reference point Pref, for example the range of ±ΔX from Pref in the X_v-axis direction and ±ΔY in the Y_v-axis direction, is set as the white line prediction range WLPR.
  • white line prediction ranges WLPR1 and WLPR2 are set at the left and right positions in front of the vehicle 5.
  • white line prediction ranges WLPR3 and WLPR4 are set at the left and right positions behind the vehicle 5 by setting the rear reference point behind the vehicle 5 and setting the prediction range reference point Pref.
  • four white line prediction ranges WLPR1 to WLPR4 are set for the vehicle 5.
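A sketch of this range determination, assuming the predicted positions are available as lists of vehicle-frame points; the reference points and the half-widths are illustrative parameters, not values from the patent:

```python
import math

def prediction_range(wlpp_points, ref_point, half_x, half_y):
    """Equations (2)/(3): pick the white line predicted position closest
    to the reference point as Pref, then span the white line prediction
    range WLPR of +/-half_x, +/-half_y around it."""
    rx, ry = ref_point
    pref_x, pref_y = min(
        wlpp_points,
        key=lambda p: math.hypot(p[0] - rx, p[1] - ry))
    return (pref_x - half_x, pref_x + half_x,
            pref_y - half_y, pref_y + half_y)

# Illustrative use: the front ranges use the forward reference point
# (alpha_v, 0); the rear ranges use a rear reference point (-alpha_v, 0).
alpha_v, half_x, half_y = 10.0, 1.0, 1.0  # metres, assumed values
# wlpr1 = prediction_range(wlpp1, (alpha_v, 0.0), half_x, half_y)
# wlpr4 = prediction_range(wlpp1, (-alpha_v, 0.0), half_x, half_y)
```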
  • FIG. 3 shows a method of calculating the white line center position WLCP.
  • FIG. 3A shows a case where the white line WL1 is a solid line.
  • the white line center position WLCP1 is calculated as the average of the position coordinates of the scan data constituting the white line.
  • specifically, among the scan data output from the LiDAR, the white line scan data WLSD1 (wx'_v, wy'_v) existing in the white line prediction range WLPR1 is extracted.
  • the scan data obtained on the white line is data with high reflection intensity.
  • scan data that exists within the white line prediction range WLPR1 and on the road surface and whose reflection intensity is greater than or equal to a predetermined value is extracted as white line scan data WLSD.
  • the coordinates of the white line center position WLCP1 (sx_v1, sy_v1) are obtained as the mean of the n extracted white line scan data points by equation (4):

$$(sx_{v1},\, sy_{v1}) = \frac{1}{n} \sum_{i=1}^{n} (wx'_{vi},\, wy'_{vi}) \tag{4}$$
  • the white line center position WLCP2 is similarly calculated.
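A sketch of the extraction and averaging step of equation (4), assuming the scan data arrives as (x_v, y_v, reflection intensity) tuples; the intensity threshold is an assumed placeholder, not a value from the patent:

```python
import numpy as np

INTENSITY_MIN = 0.5  # assumed threshold for "high reflection intensity"

def white_line_center(scan_data, wlpr):
    """Extract the white line scan data WLSD inside the white line
    prediction range WLPR and return the white line center position
    WLCP as the mean of the extracted points (equation (4))."""
    x_min, x_max, y_min, y_max = wlpr
    pts = np.asarray(scan_data, dtype=float).reshape(-1, 3)
    mask = ((pts[:, 0] >= x_min) & (pts[:, 0] <= x_max) &
            (pts[:, 1] >= y_min) & (pts[:, 1] <= y_max) &
            (pts[:, 2] >= INTENSITY_MIN))
    wlsd = pts[mask, :2]
    if wlsd.size == 0:
        return None  # no white line detected in this range
    return tuple(wlsd.mean(axis=0))  # WLCP (sx_v, sy_v)
```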
  • FIG. 4A shows white line prediction ranges WLPR1 and WLPR2 set in front of the vehicle 5 when the vehicle 5 is stopped.
  • the LiDAR scans one round (360 °) clockwise around the position of the vehicle 5 and accumulates scan data for one round.
  • the LiDAR scan line 2 is a circle centered on the position of the vehicle 5, so the white line prediction ranges WLPR1 and WLPR2 need only be equidistant from the vehicle 5. That is, as shown, by arranging the white line prediction range WLPR1 and the white line prediction range WLPR2 at positions symmetric with respect to the X_v axis, the scan data from the LiDAR can be extracted efficiently.
  • when the vehicle 5 is moving, however, if the white line prediction range WLPR2 on the right front side of the vehicle 5 is set at a position symmetrical to the white line prediction range WLPR1 on the left front side, the number of scan lines 2 included in the white line prediction range WLPR2 decreases, because the scan line moves forward with the vehicle while the scan progresses.
  • the position of the white line prediction range WLPR is corrected according to the speed of the vehicle 5.
  • first, the white line prediction ranges WLPR1 and WLPR2 are set symmetrically, in the same manner as in FIG. 4A, by the method shown in FIG. 2.
  • the white line prediction range WLPR2 is shifted forward by the shift amount and set to the position of the white line prediction range WLPR2x.
  • the white line prediction range WLPR2x can be set so as to appropriately include the scan line 2 that has moved forward by the movement of the vehicle 5.
  • FIG. 5 shows a white line prediction range correction method when four white line prediction ranges WLPR1 to WLPR4 are set with the position of the vehicle 5 as a reference.
  • scanning with LiDAR is clockwise.
  • the white line prediction range WLPR1 on the left front side of the vehicle 5 is considered as a reference point for scanning with LiDAR, and its scanning time is t0.
  • the time difference from when the LiDAR scans the left front white line prediction range WLPR1 until it scans the right front white line prediction range WLPR2 is denoted Δt1, the time difference from when the right front white line prediction range WLPR2 is scanned until the right rear white line prediction range WLPR3 is scanned is denoted Δt2, and the time difference from when the right rear white line prediction range WLPR3 is scanned until the left rear white line prediction range WLPR4 is scanned is denoted Δt3.
  • first, the right front white line prediction range WLPR2 is set symmetrically to the left front white line prediction range WLPR1 with respect to the X_v axis, the right rear white line prediction range WLPR3 is set symmetrically to the right front white line prediction range WLPR2 with respect to the Y_v axis, and the left rear white line prediction range WLPR4 is set symmetrically to the right rear white line prediction range WLPR3 with respect to the X_v axis.
  • then, the white line prediction range WLPR2 on the right front side is shifted forward by Δd1 and set at the position of the white line prediction range WLPR2x.
  • the right rear white line prediction range WLPR3 is shifted forward by Δd2 and set at the position of the white line prediction range WLPR3x.
  • the left rear white line prediction range WLPR4 is shifted forward by Δd3 and set at the position of the white line prediction range WLPR4x.
  • the shift amounts Δd1 to Δd3 depend on the speed v of the vehicle 5: roughly, the shift amounts Δd1 to Δd3 increase as the speed of the vehicle 5 increases, and decrease as it decreases. Using the scan time differences Δt1 to Δt3, they are given by equations (5) to (7):

$$\Delta d_1 = v \cdot \Delta t_1 \tag{5}$$
$$\Delta d_2 = v \cdot (\Delta t_1 + \Delta t_2) \tag{6}$$
$$\Delta d_3 = v \cdot (\Delta t_1 + \Delta t_2 + \Delta t_3) \tag{7}$$
  • the shift amount Δd increases as the white line prediction range comes later in the scanning order, counting from the reference white line prediction range.
  • when the scanning is clockwise, the shift amounts satisfy Δd1 ≤ Δd2 ≤ Δd3; that is, the shift amounts have the relationship white line prediction range WLPR2x ≤ white line prediction range WLPR3x ≤ white line prediction range WLPR4x.
  • conversely, when the scanning is counterclockwise, the shift amounts have the relationship white line prediction range WLPR4x ≤ white line prediction range WLPR3x ≤ white line prediction range WLPR2x.
  • the shift amounts Δd1 to Δd3 also depend on the scan speed of the LiDAR mounted on the vehicle 5: roughly, the higher the LiDAR scan speed, the smaller the shift amounts Δd1 to Δd3, and the lower the scan speed, the larger they are. This is because the values of Δt1 to Δt3 in the above equations (5) to (7) become smaller as the scan speed increases.
  • the time differences Δt1 to Δt3 between the scan times of the white line prediction ranges can be calculated from the distance between the white line prediction ranges and the LiDAR scan speed, or from the angle between the white line prediction ranges and the LiDAR scan angular speed.
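The shift computation can be sketched as follows, deriving the time differences from the angular spacing of the prediction ranges and the LiDAR scan angular speed; the angle values in the example are assumptions, not values from the patent:

```python
def shift_amounts(vehicle_speed, scan_rate_hz, scan_angles_deg):
    """Equations (5)-(7): shift each prediction range after the reference
    one by vehicle_speed times the elapsed scan time.

    scan_angles_deg: angles at which the LiDAR passes WLPR1..WLPR4,
    in scan order (clockwise), e.g. [45, 135, 225, 315] (assumed)."""
    angular_speed = 360.0 * scan_rate_hz  # degrees per second
    shifts, elapsed = [], 0.0
    for prev, cur in zip(scan_angles_deg, scan_angles_deg[1:]):
        elapsed += (cur - prev) / angular_speed  # dt1, dt1+dt2, ...
        shifts.append(vehicle_speed * elapsed)   # dd1 <= dd2 <= dd3
    return shifts

# e.g. 20 m/s at a 10 Hz scan: shifts of 0.5 m, 1.0 m, and 1.5 m
print(shift_amounts(20.0, 10.0, [45, 135, 225, 315]))
```

Consistent with the text, a faster scan (larger scan_rate_hz) shrinks the elapsed times and hence the shifts, while a faster vehicle enlarges them.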
  • FIG. 6 shows a schematic configuration of a host vehicle position estimation apparatus to which the measurement apparatus of the present invention is applied.
  • the own vehicle position estimation device 10 is mounted on a vehicle and configured to be able to communicate with a server 7 such as a cloud server by wireless communication.
  • the server 7 is connected to a database 8, and the database 8 stores an advanced map.
  • the advanced map stored in the database 8 stores landmark map information for each landmark.
  • white line map information including a white line map position WLMP indicating the coordinates of the point sequence constituting the white line is stored.
  • the own vehicle position estimation device 10 communicates with the server 7 and downloads white line map information related to the white line around the own vehicle position of the vehicle.
  • the own vehicle position estimation device 10 includes an internal sensor 11, an external sensor 12, an own vehicle position prediction unit 13, a communication unit 14, a white line map information acquisition unit 15, a white line position prediction unit 16, a scan data extraction unit 17, a white line center position calculation unit 18, and an own vehicle position estimation unit 19.
  • the own vehicle position prediction unit 13, the white line map information acquisition unit 15, the white line position prediction unit 16, the scan data extraction unit 17, the white line center position calculation unit 18, and the own vehicle position estimation unit 19 are in practice realized by a computer such as a CPU executing a program prepared in advance.
  • the internal sensor 11 measures the position of the vehicle as a GNSS (Global Navigation Satellite System)/IMU (Inertial Measurement Unit) combined navigation system, and includes a satellite positioning sensor (GPS), a gyro sensor, a vehicle speed sensor, and the like.
  • the own vehicle position prediction unit 13 predicts the own vehicle position of the vehicle by GNSS / IMU combined navigation based on the output of the internal sensor 11 and supplies the predicted own vehicle position PVP to the white line position prediction unit 16.
  • the external sensor 12 is a sensor that detects an object around the vehicle, and includes a stereo camera, LiDAR, and the like.
  • the external sensor 12 supplies the scan data SD obtained by the measurement to the scan data extraction unit 17.
  • the communication unit 14 is a communication unit for wirelessly communicating with the server 7.
  • the white line map information acquisition unit 15 receives white line map information related to the white lines existing around the vehicle from the server 7 via the communication unit 14, and supplies the white line map position WLMP included in the white line map information to the white line position prediction unit 16.
  • the white line position prediction unit 16 calculates the white line predicted position WLPP by the above-described equation (1) based on the white line map position WLMP and the predicted host vehicle position PVP acquired from the own vehicle position prediction unit 13. Further, the white line position prediction unit 16 determines the white line prediction range WLPR by the above-described equations (2) and (3) based on the white line predicted position WLPP. The white line position prediction unit 16 then corrects the white line prediction ranges WLPR1 to WLPR4 as described above according to the vehicle speed, the LiDAR scan speed, and the scan direction (clockwise or counterclockwise), and supplies the corrected white line prediction range WLPR to the scan data extraction unit 17.
  • the scan data extraction unit 17 extracts the white line scan data WLSD based on the white line prediction range WLPR supplied from the white line position prediction unit 16 and the scan data SD acquired from the external sensor 12. Specifically, from the scan data SD, the scan data extraction unit 17 extracts the scan data that is included in the white line prediction range WLPR and whose reflection intensity is equal to or higher than a predetermined value as the white line scan data WLSD, and supplies it to the white line center position calculation unit 18.
  • the white line center position calculation unit 18 calculates the white line center position WLCP from the white line scan data WLSD using the equation (4). Then, the white line center position calculation unit 18 supplies the calculated white line center position WLCP to the vehicle position estimation unit 19.
  • the own vehicle position estimating unit 19 estimates the own vehicle position and the own vehicle azimuth angle based on the white line map position WLMP in the advanced map and the white line center position WLCP that is white line measurement data by the external sensor 12.
  • Japanese Patent Laid-Open No. 2017-72422 discloses an example of a method for estimating the vehicle position by matching the landmark position information of the advanced map and the measured position information of the landmark by the external sensor.
  • the external sensor 12 is an example of the sensor unit of the present invention, the scan data extraction unit 17 is an example of the acquisition unit and the extraction unit of the present invention, the white line position prediction unit 16 is an example of the determination unit of the present invention, and the own vehicle position estimation unit 19 is an example of the processing unit of the present invention.
  • FIG. 7 is a flowchart of the own vehicle position estimation process. This process is realized by a computer such as a CPU executing a program prepared in advance and functioning as each component shown in FIG. 6.
  • it is assumed that the own vehicle position estimation device 10 constantly detects the speed of the vehicle 5 with a vehicle speed sensor or the like, and that the LiDAR scan speed and scan direction are determined in advance.
  • the host vehicle position prediction unit 13 acquires the predicted host vehicle position PVP based on the output from the internal sensor 11 (step S11).
  • the white line map information acquisition unit 15 connects to the server 7 through the communication unit 14 and acquires white line map information from the advanced map stored in the database 8 (step S12).
  • the white line position prediction unit 16 calculates the white line predicted position WLPP based on the white line map position WLMP included in the white line map information obtained in step S12 and the predicted host vehicle position PVP obtained in step S11 (step S13). Further, the white line position prediction unit 16 determines the white line prediction range WLPR based on the white line predicted position WLPP. At this time, the white line position prediction unit 16 corrects the positions of the white line prediction ranges WLPR1 to WLPR4 based on the speed of the vehicle 5, the LiDAR scan speed, and the scan direction as described above. Then, the white line position prediction unit 16 supplies the white line prediction range WLPR to the scan data extraction unit 17 (step S14).
  • from the scan data SD obtained from the LiDAR serving as the external sensor 12, the scan data extraction unit 17 extracts the scan data that lies within the white line prediction range WLPR, is on the road surface, and has a reflection intensity equal to or greater than a predetermined value as the white line scan data WLSD, and supplies it to the white line center position calculation unit 18 (step S15).
  • the white line center position calculation unit 18 calculates the white line center position WLCP based on the white line prediction range WLPR and the white line scan data WLSD, and supplies it to the own vehicle position estimation unit 19 (step S16). Then, the own vehicle position estimation unit 19 estimates the own vehicle position using the white line center position WLCP (step S17) and outputs the own vehicle position and the own vehicle azimuth angle (step S18). The own vehicle position estimation process then ends.
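Putting the steps together, the processing of FIG. 7 can be sketched as a single cycle in Python; the sensor, server, shift-correction, and pose-update interfaces are stubs, and all names (including ALPHA_V, HALF_X, HALF_Y from the earlier sketches) are illustrative:

```python
def own_vehicle_position_cycle(internal, external, server, speed):
    """One cycle of FIG. 7 (steps S11-S18), using the helper functions
    sketched above; shift_ranges and update_pose stand in for the speed
    correction and the matching-based estimation of step S17."""
    pvp, psi = internal.predict_pose()           # S11: predicted position PVP
    wl_map = server.white_lines_around(pvp)      # S12: white line map info
    ranges = []
    for points in wl_map:                        # S13/S14: WLPP and WLPR
        wlpp = [white_line_predicted_position(p, pvp, psi) for p in points]
        ranges.append(prediction_range(wlpp, (ALPHA_V, 0.0), HALF_X, HALF_Y))
    ranges = shift_ranges(ranges, speed)         # correction by vehicle speed
    scan = external.scan_data()                  # LiDAR output data SD
    centers = [white_line_center(scan, r) for r in ranges]  # S15/S16: WLCP
    centers = [c for c in centers if c is not None]
    return update_pose(pvp, psi, centers, wl_map)  # S17/S18: estimate, output
```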
  • in the above description, the white line serving as the lane boundary line is used, but the application of the present invention is not limited to this; a linear road marking such as a pedestrian crossing or a stop line may be used instead. A yellow line or the like may also be used instead of a white line. Lane marking lines such as white lines and yellow lines, road markings, and the like are examples of the road surface line of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)

Abstract

The measuring device according to the invention acquires output data from a sensor unit for detecting road surface lines in the surroundings, and determines a prescribed range on the basis of its own position, position information on the road surface lines, and the moving speed of its own position. Furthermore, from the output data, the measuring device extracts the data corresponding to the detection result within the prescribed range, and performs prescribed processing on the basis of the extracted data.
PCT/JP2018/019140 2017-05-19 2018-05-17 Measuring device, measuring method, and program Ceased WO2018212284A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019518867A JPWO2018212284A1 (ja) 2017-05-19 2018-05-17 Measuring device, measuring method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017100145 2017-05-19
JP2017-100145 2017-05-19

Publications (1)

Publication Number Publication Date
WO2018212284A1 true WO2018212284A1 (fr) 2018-11-22

Family

ID=64274458

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/019140 Ceased WO2018212284A1 (fr) 2017-05-19 2018-05-17 Measuring device, measuring method, and program

Country Status (2)

Country Link
JP (3) JPWO2018212284A1 (fr)
WO (1) WO2018212284A1 (fr)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04184603A (ja) * 1990-11-20 1992-07-01 Toyota Motor Corp Guide line detection device
JP3931891B2 (ja) 2004-07-05 2007-06-20 日産自動車株式会社 In-vehicle image processing device
JP5055691B2 (ja) 2004-10-22 2012-10-24 日産自動車株式会社 Front object detection device and front object detection method
JP6492469B2 (ja) 2014-09-08 2019-04-03 株式会社豊田中央研究所 Own vehicle traveling lane estimation device and program
JP6458651B2 (ja) * 2015-06-08 2019-01-30 日産自動車株式会社 Road marking detection device and road marking detection method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05265547A (ja) * 1992-03-23 1993-10-15 Fuji Heavy Ind Ltd Vehicle exterior monitoring device
JPH07225893A (ja) * 1994-02-09 1995-08-22 Fuji Heavy Ind Ltd Inter-vehicle distance control device
JP2000105898A (ja) * 1998-02-18 2000-04-11 Equos Research Co Ltd Vehicle control device, vehicle control method, and computer-readable medium recording a program for causing a computer to execute the vehicle control method
JP2001092970A (ja) * 1999-09-22 2001-04-06 Fuji Heavy Ind Ltd Lane recognition device
JP2011073529A (ja) * 2009-09-30 2011-04-14 Hitachi Automotive Systems Ltd Vehicle control device
JP2017016226A (ja) * 2015-06-29 2017-01-19 日立オートモティブシステムズ株式会社 Surrounding environment recognition system and vehicle control system equipped with the same

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2024035512A (ja) * 2022-09-02 2024-03-14 本田技研工業株式会社 Lane marking recognition device
JP7486556B2 (ja) 2022-09-02 2024-05-17 本田技研工業株式会社 Lane marking recognition device

Also Published As

Publication number Publication date
JPWO2018212284A1 (ja) 2020-03-12
JP7526858B2 (ja) 2024-08-01
JP2023118759A (ja) 2023-08-25
JP2021179443A (ja) 2021-11-18

Similar Documents

Publication Publication Date Title
CN112415502B (zh) Radar device
US9767372B2 (en) Target detection apparatus and target detection method
JP7155284B2 (ja) Measurement accuracy calculation device, self-position estimation device, control method, program, and storage medium
JP6806891B2 (ja) Information processing device, control method, program, and storage medium
JP6740470B2 (ja) Measuring device, measuring method, and program
US20220113139A1 (en) Object recognition device, object recognition method and program
JP2017211307A (ja) Measuring device, measuring method, and program
JP2023055257A (ja) Self-position estimation device
JP7526858B2 (ja) Measuring device, measuring method, and program
JP2025078713A (ja) Measuring device, measuring method, program, and storage medium
JP2024177451A (ja) Measuring device, measuring method, and program
JP2024038322A (ja) Measuring device, measuring method, and program
JP2023068009A (ja) Map information creation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18802670

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019518867

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18802670

Country of ref document: EP

Kind code of ref document: A1