WO2007132860A1 - Object recognition device - Google Patents
- Publication number
- WO2007132860A1 (application PCT/JP2007/059979)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- recognition
- recognized
- positioning
- road
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3848—Data obtained from both position sensors and additional sensors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C7/00—Tracing profiles
- G01C7/02—Tracing profiles of land surfaces
- G01C7/04—Tracing profiles of land surfaces involving a vehicle which moves along the profile to be traced
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0968—Systems involving transmission of navigation instructions to the vehicle
- G08G1/0969—Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
Definitions
- The present invention relates to an object recognition apparatus, and more particularly to an object recognition apparatus suitable for recognizing an object on a road from the host vehicle.
- An object recognition device that recognizes a stationary object, such as a road sign, from a vehicle (a moving body) is known (for example, see Patent Document 1).
- This object recognition apparatus includes a camera that captures the road around the vehicle, and recognizes a stationary object on the road based on the image information captured by this camera.
- Patent Document 1 JP 2000-346653 A
- A stationary object recognized from a camera-captured image as described above is a road sign or road marking installed on or drawn on the road, and such objects are scattered along the road.
- The position of such a stationary object is often determined absolutely, or relative to a fixed feature on the map such as an intersection.
- Even when the vehicle carries a database that stores in advance the position information of the stationary objects to be recognized, if the road around the vehicle is photographed continuously and every captured image is processed in order to recognize objects on the road, the recognition processing load increases. As a result, recognizing an object takes time, or an expensive apparatus capable of high-speed processing becomes necessary.
- The present invention has been made in view of the above points, and an object thereof is to provide an object recognition apparatus that reduces the processing load required for recognizing an object on a road.
- The above object is achieved by an object recognition apparatus comprising: a storage unit that stores in advance the position of an object to be recognized; a positioning unit that measures the position of the host vehicle; a positioning accuracy calculation unit that calculates the positioning accuracy with which the position of the host vehicle is measured; a recognition range setting unit that sets, based on the position of the host vehicle, the position of the object, and the positioning accuracy, the range of the road within which the object should be recognized; and an object recognition unit that recognizes the object within the recognition range set by the recognition range setting unit.
- In this aspect of the invention, the recognition range of the road within which the object should be recognized is set based on the position of the host vehicle, the position of the object, and the positioning accuracy with which the host vehicle position is measured. The object recognition process is then performed within the set recognition range, and not outside it. For this reason, according to the present invention, the processing load required for recognizing the object can be reduced compared with a configuration in which the object recognition process is performed at all times.
- The position of the object is accurate to some extent, because its information is stored in the storage means in advance.
- The position of the host vehicle, on the other hand, is measured according to a predetermined method and therefore contains an error that depends on the positioning accuracy. If no positioning error occurred, it would be sufficient to perform the recognition process for the object at the moment the measured vehicle position coincides with the position of the object. In reality, however, if the object were recognized only when the measured vehicle position matches the object position, the vehicle might already have passed the object by the time recognition is performed, and the object might not be recognized at all. In the above-described aspect of the invention, the recognition range within which the object should be recognized is set in consideration of the positioning accuracy of the host vehicle position, so a situation in which an object that should be recognized cannot be recognized can be prevented.
- In this case, the recognition range setting means may set a wider recognition range as the positioning accuracy calculated by the positioning accuracy calculation means becomes lower.
- Likewise, the recognition range setting means may set a wider recognition range as the distance from the position of the host vehicle measured by the positioning means to the position of the object stored in the storage means becomes longer.
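The two range-setting rules above (widen the range as positioning accuracy falls, and as the distance to the object grows) can be sketched as follows. This is a minimal Python illustration; the function name and the tuning constants `base_margin_m` and `k_dist` are hypothetical, not values from the patent.

```python
def recognition_range(obj_pos_m, accuracy_error_m, dist_to_obj_m,
                      base_margin_m=5.0, k_dist=0.02):
    """Return (start, end) of the road section, in metres along the
    road, within which the object should be searched for.

    accuracy_error_m is the positioning error bound (larger = lower
    accuracy); dist_to_obj_m is the distance from the measured vehicle
    position to the stored object position.
    """
    # A larger error bound and a longer distance to the object both
    # widen the search margin, as the text describes.
    margin = base_margin_m + accuracy_error_m + k_dist * dist_to_obj_m
    return (obj_pos_m - margin, obj_pos_m + margin)
```

A lower accuracy (larger error bound) therefore always produces a wider range around the object's stored position.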
- Further, the positioning means may correct the position of the host vehicle based on the result of recognizing the object. In this case, the position of the host vehicle can be measured accurately by using the recognized object.
- If the positioning accuracy calculation means calculates the positioning accuracy so that it decreases at least according to the distance travelled, and corrects it based on the result of recognizing the object, then the positioning accuracy, which otherwise falls with the distance travelled, can be raised every time the host vehicle position is corrected based on an object recognition result.
- The object recognition apparatus may further comprise a recognition object setting means that sets, from among all the objects whose position information is stored in the storage means, a specific object existing on the travel route of the host vehicle as the object to be recognized, and the object recognition means may recognize only the object thus set.
- Further, when a support control device that executes support control according to the host vehicle position requires a certain positioning accuracy to be satisfied at the time the support control is executed, the recognition object setting means may set as the object to be recognized, from among all the objects whose position information is stored in the storage means, an object existing on the travel route of the host vehicle whose recognition, through correction of the host vehicle position based on the recognition result, makes it possible to satisfy the positioning accuracy required by the support control device; the object recognition means may then recognize only the object so set.
- In the object recognition apparatus described above, the object recognition means may recognize the object using the image information, within the recognition range set by the recognition range setting means, photographed by an imaging means disposed at a predetermined position on the host vehicle.
- Effect of the invention: according to the present invention, it is possible to reduce the processing load required for recognizing an object on a road.
- FIG. 1 is a configuration diagram of a system mounted on a vehicle according to an embodiment of the present invention.
- FIG. 2 is a diagram showing a relationship between a moving distance of a vehicle and a positioning error.
- FIG. 3 is a diagram for explaining the error that occurs between the measured position of the vehicle and the position of the vehicle on the actual road.
- FIG. 4 is a diagram for explaining a method of setting a recognition range of a road where an object should be recognized according to the positioning accuracy of the own vehicle position in the system of the present embodiment.
- FIG. 5 is a diagram for explaining a method for setting a recognition range of a road in which the object should be recognized according to the relative distance between the host vehicle and the object in the system of the present embodiment.
- FIG. 6 is a diagram showing a map used for setting a recognition range of a road where an object should be recognized in the system of the present embodiment.
- FIG. 7 is a flowchart of an example of a control routine executed in the system of the present embodiment.
- FIG. 1 shows a configuration diagram of a system mounted on a vehicle according to an embodiment of the present invention.
- As shown in FIG. 1, the system of the present embodiment includes a positioning unit 12 for measuring the position of the host vehicle and a support control unit 14 for controlling the travel of the host vehicle.
- The positioning unit 12 measures the position of the host vehicle, and the support control unit 14 executes predetermined support control for driving the host vehicle according to that position.
- The positioning unit 12 includes a GPS receiver 16 that receives GPS signals transmitted from GPS (Global Positioning System) satellites and detects the latitude and longitude of the position where the host vehicle is present, a direction sensor 18 that detects the heading (direction) of the host vehicle using the turning angle and geomagnetism, a G sensor 20 that detects acceleration and deceleration, and a vehicle speed sensor 22 that detects the vehicle speed, all connected to a dead reckoning navigation unit 24 composed mainly of a microcomputer.
- The output signals of the receiver and sensors 16 to 22 are each supplied to the dead reckoning navigation unit 24.
- The dead reckoning navigation unit 24 detects the latitude and longitude (initial coordinates) of the host vehicle position based on information from the GPS receiver 16, detects the travelling direction, vehicle speed, and acceleration/deceleration of the host vehicle based on information from the sensors 18 to 22, and creates a travel locus (estimated locus) of the vehicle from the initial coordinates of the vehicle position.
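The estimated-locus creation described above can be illustrated with a single dead-reckoning step. This is a simplified planar sketch with hypothetical names; the real unit 24 also fuses GPS fixes and the acceleration reading.

```python
import math

def dead_reckon_step(x, y, heading_rad, speed_mps, yaw_rate_rps, dt_s):
    """Advance the estimated vehicle position by one time step using
    the direction-sensor (yaw rate) and vehicle-speed-sensor readings."""
    heading_rad += yaw_rate_rps * dt_s            # update heading first
    x += speed_mps * dt_s * math.cos(heading_rad)  # advance along heading
    y += speed_mps * dt_s * math.sin(heading_rad)
    return x, y, heading_rad
```

Calling this every sampling interval and collecting the `(x, y)` points yields the estimated locus from the initial coordinates.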
- the positioning unit 12 also has a map matching unit 28 mainly composed of a microcomputer connected to the dead reckoning navigation unit 24 described above.
- a map database 30 is also connected to the map matching unit 28.
- The map database 30 is composed of a hard disk (HDD), DVD, CD, or the like installed in the vehicle or provided at a center.
- The map database 30 stores the link data of the roads themselves necessary for route guidance and map display, as well as the position information of stationary objects drawn or installed on the roads, of the lanes, and so on.
- Specifically, the map database 30 stores lane shape and road type data such as latitude/longitude, curvature, gradient, number of lanes, lane width, and presence or absence of corners representing roads, information on intersections and node points, and information on buildings for map display. For objects such as pedestrian crossings and stop lines drawn on the road surface, direction arrows, diamond markings indicating a crosswalk ahead, and maximum speed markings, it stores their shape data, paint data, position data, feature size, distance data to other objects before and after them, data indicating the degree of fading, and distance data to target objects in the vehicle's direction of travel.
- The map database 30 can be updated to the latest map data by exchanging the disk or when update conditions are established.
- The map matching unit 28 is supplied with the initial coordinates of the vehicle position detected by the dead reckoning navigation unit 24 and with information on the estimated trajectory from those initial coordinates.
- The map matching unit 28 has a function of performing map matching (first map matching) that places the host vehicle position onto a road link stored in the map database 30, based on the GPS-derived initial coordinates and the estimated trajectory supplied from the dead reckoning navigation unit 24, and of calculating a position accuracy level indicating the accuracy of the host vehicle position detected as a result of this first map matching.
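The geometric core of placing an estimated position onto a road link can be sketched as a projection onto a straight link segment. Coordinates are in metres; the names are hypothetical, and a real matcher would also choose among candidate links and handle curvature.

```python
def snap_to_link(p, a, b):
    """Project point p onto the road link segment from a to b
    (first map matching). All points are (x, y) tuples in metres."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    # Parameter t of the closest point on the infinite line through a-b.
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))  # clamp so the result stays on the link
    return (ax + t * dx, ay + t * dy)
```

For an estimated position lying beside the link, the snapped point is its perpendicular foot on the link; positions beyond the link ends are clamped to the nearest node.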
- The map matching unit 28 also reads out from the map database 30 the data of objects on the road surface within a predetermined range of the current host vehicle position obtained as a result of the first map matching, and sets the objects within that predetermined road range as objects to be recognized by the host vehicle. After this setting, it determines, based on the measured vehicle position, whether each set object should now be recognized using an image captured by the back camera 32 described later.
- the positioning unit 12 also has a back camera 32 connected to the map matching unit 28.
- the back camera 32 is disposed on the rear bumper of the vehicle or the like, and can photograph the outside of a predetermined area including the road surface on the rear side of the vehicle from the disposed position.
- The image captured by the back camera 32 is supplied to the map matching unit 28.
- When a captured image is supplied from the back camera 32 and the map matching unit 28 has determined that an object should be recognized in it, the map matching unit 28 performs image processing such as edge extraction on the captured image, detects the above-mentioned objects and the travel lane drawn on the road surface, and grasps the relative positional relationship between these and the host vehicle.
- Specifically, the map matching unit 28 calculates the position of the host vehicle's own lane relative to the vehicle on the road actually being travelled, based on the detection result of the travel lane from the image captured by the back camera 32. Based on the detection result of the object, it also measures the relative relationship between the host vehicle and the recognized object existing on the road behind the vehicle (specifically, the distance from the vehicle to the recognized object), and, based on this measurement result and the position data of the recognized object stored in the map database 30, performs map matching (second map matching) that corrects the position of the host vehicle (particularly its position in the vehicle front-rear direction) to a position referenced to the recognized object.
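The second map matching reduces, along the road, to anchoring the vehicle position to the recognized object's surveyed map position plus the camera-measured distance from that object (behind the vehicle) to the vehicle. A minimal sketch with hypothetical names; positions are along-road coordinates in metres.

```python
def second_map_matching(est_pos_m, obj_map_pos_m, cam_dist_m):
    """Correct the along-road vehicle position using the recognized
    object behind the vehicle: its stored map position plus the
    camera-measured distance from the object to the vehicle.
    Returns the corrected position and the applied correction."""
    corrected = obj_map_pos_m + cam_dist_m
    return corrected, corrected - est_pos_m
```

Because the object position in the database is surveyed, the corrected position inherits its accuracy regardless of how much dead-reckoning error had accumulated in `est_pos_m`.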
- In other words, each time information on the estimated trajectory is supplied from the dead reckoning navigation unit 24, the map matching unit 28 performs the first map matching, which places the current host vehicle position onto a road link stored in the map database 30, and, whenever an object to be recognized is recognized in the image captured by the back camera 32, performs the second map matching, which corrects the host vehicle position to a position referenced to the recognized object based on the recognition result.
- The map matching unit 28 also calculates the accuracy (that is, the degree of confidence) of the current host vehicle position measured as a result of the map matching described above.
- Furthermore, by collating the host vehicle position measured by map matching with the map data stored in the map database 30, the map matching unit 28 calculates the remaining distance along the road from the host vehicle to a target object of support control (for example, a stop line, an intersection, or a curve entrance) located ahead within a predetermined range in the travelling direction.
- the positioning unit 12 and the support control unit 14 are connected to each other.
- Information on the host vehicle position and the remaining road distance detected by the positioning unit 12 is supplied, for example, to a display visible to the vehicle occupants, where it is schematically superimposed on the road map shown on the screen, and is also supplied to the support control unit 14.
- the support control unit 14 includes an electronic control unit (hereinafter referred to as a support ECU) 40 mainly composed of a microcomputer.
- The support ECU 40 executes support control for assisting the driver's operation in driving the host vehicle on the road. This support control is performed according to the host vehicle position, specifically according to the remaining distance along the road from the host vehicle to the target object. The support control includes, for example: pause control, which is driving support control for stopping the vehicle before a target line on the road, such as a stop line or a railway crossing line, when the driver's brake operation is absent or late; intersection control, which is driving support control for preventing the host vehicle from crossing paths with another vehicle predicted to intersect it at a target intersection; speed control for driving the vehicle at an appropriate speed with respect to a target curve (corner); and guidance control, which provides voice guidance on the relative distance to the target object.
- The support ECU 40 is connected to a brake actuator 42 for generating an appropriate braking force in the host vehicle, a throttle actuator 44 for applying an appropriate driving force, a shift actuator 46 for switching the gear position of the automatic transmission, a steering actuator 48 for giving an appropriate steering angle to the host vehicle, and a buzzer alarm 50 for sounding a buzzer, outputting warnings, and providing speaker output toward the vehicle interior.
- To execute the support control described above, the support ECU 40 issues an appropriate drive command to each of the actuators 42 to 50 based on the host vehicle position measured by the positioning unit 12, that is, on the relative relationship between the host vehicle and the target object.
- Each of the actuators 42 to 50 is driven in accordance with a drive command supplied from the support ECU 40.
- When the vehicle driver desires execution of the support control by the support control unit 14, the driver puts the system of the present embodiment into an operable state.
- In this state, the positioning unit 12 creates, in the dead reckoning navigation unit 24, a travel locus from the initial coordinates of the host vehicle based on the output signals of the receiver and sensors 16 to 22 at every predetermined interval.
- The map matching unit 28 then performs the first map matching, correcting the host vehicle position onto a road link, by comparing the initial coordinate position and estimated trajectory of the host vehicle from the dead reckoning navigation unit 24 with the road link information stored as map data in the map database 30.
- When the map matching unit 28 has detected the host vehicle position, it displays that position on the screen of the display visible to the occupants, and further reads out from the map database 30 the object data for the road range (all lanes, in the case of multiple lanes) from the host vehicle position up to the position the vehicle will reach after travelling for a predetermined time or distance, or up to the position of the target object of the support control. It then sets the objects within this predetermined road range as objects to be recognized by the back camera 32; thereafter, the set of objects to be recognized is updated automatically as described in detail later.
- After setting the objects to be recognized, the map matching unit 28 determines whether the vehicle position has reached the vicinity of the position of an object to be recognized, and thereby determines whether the image captured by the back camera 32 should now be processed to recognize that object.
- When the map matching unit 28 determines as a result of the above that an object should be recognized, it receives the captured image of the area behind the vehicle taken by the back camera 32 and performs image processing such as edge extraction on it. The result of this image processing is then compared with feature data, such as the shape data and position data of the object to be recognized, to determine whether the object has been recognized by the image processing.
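The edge-extraction and comparison step can be illustrated with a deliberately tiny stand-in: mark pixels whose horizontal intensity jump exceeds a threshold, then compare the result with the stored feature data by a crude pixel count. A real implementation would use a proper edge detector and match the actual shape and position data; all names here are hypothetical.

```python
def edge_map(img, threshold):
    """Minimal stand-in for edge extraction: mark pixels whose
    horizontal intensity difference exceeds threshold. img is a
    list of rows of grey values."""
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(1, w):
            if abs(img[y][x] - img[y][x - 1]) > threshold:
                edges[y][x] = 1
    return edges

def matches_feature(edges, min_edge_pixels):
    """Crude recognition decision: enough edge pixels present to be
    consistent with the stored feature data."""
    return sum(map(sum, edges)) >= min_edge_pixels
```

A bright stop-line stripe against dark asphalt produces an edge column at its boundary, which the count-based check then accepts.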
- When the map matching unit 28 recognizes an object to be recognized, it detects the relative position of, and distance to, the recognized object existing behind the vehicle based on the relative relationship between the vehicle and the object identified by the image processing, accesses the map database 30 to read the position data of the recognized object, and corrects the position of the host vehicle (particularly its position in the vehicle front-rear direction) based on these.
- After correcting the host vehicle position, the map matching unit 28 accesses the map database 30 and obtains the distance along the road from the recognized object to the target object of the support control. Based on the host vehicle position and the distance from the recognized object to the target object, it then calculates an initial value of the remaining distance along the road from the host vehicle to the target object.
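The initial remaining distance reduces to simple arithmetic: the stored along-road distance from the recognized object to the target object, minus how far the vehicle has already advanced past the recognized object (which lies behind it, in the back camera's view). A sketch with hypothetical names:

```python
def initial_remaining_distance(obj_to_target_m, vehicle_past_obj_m):
    """Initial remaining along-road distance from the host vehicle to
    the target object: stored distance from the recognized object to
    the target, minus the vehicle's advance past the recognized object."""
    return obj_to_target_m - vehicle_past_obj_m
```

For example, if the database gives 200 m from a crosswalk marking to the stop line and the camera shows the vehicle 12 m past the marking, 188 m remain.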
- When the map matching unit 28 recognizes an object to be recognized within the predetermined road range, it also processes the image captured by the back camera 32 to acquire information on the travel lanes on the road and to grasp the relative relationship of the own lane to the host vehicle. It then accesses the map database 30 to obtain the lane width, number of lanes, shape, and so on of the travel lanes in the vicinity of the host vehicle position, and identifies the position of the own lane on the road currently being travelled based on the relative relationship of the travel lane to the host vehicle and on the number of lanes and other data acquired from the map database 30. The target object may differ for each travel lane.
- Once the own lane is identified in this way, the target object ahead in the travelling direction that the vehicle on the own lane should pass is specifically determined, so the above-mentioned remaining road distance can be calculated based on the target object on the own lane.
- Thereafter, the dead reckoning navigation unit 24 creates an estimated trajectory of the vehicle position using the GPS receiver 16 and the various sensors 18 to 22 at every predetermined interval, and transmits the trajectory information to the map matching unit 28.
- After performing the second map matching associated with object recognition as described above, the map matching unit 28, each time it receives estimated trajectory information from the dead reckoning navigation unit 24, first calculates the position of the host vehicle (particularly the distance in the front-rear direction) with respect to the position coordinates of the recognized object on the center line of the own lane, based on the estimated trajectory from the time of the second map matching and the position of the own lane. Then, based on this front-rear distance and the distance between the recognized object and the target object on the own lane, it calculates the remaining road distance from the current host vehicle position to the target object.
- the information on the vehicle position and the information on the remaining road distance detected by the positioning unit 12 are supplied to the display display and also to the support control unit 14.
- The display schematically shows the host vehicle position and the remaining road distance superimposed on the road map on the display screen.
- The support ECU 40 of the support control unit 14 determines, for each support control, whether the control start condition defined for that control is satisfied, based on the remaining road distance to the target object of the support control (such as a stop line or an intersection) supplied from the positioning unit 12. When the control start condition is satisfied, the support control is started.
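A control start condition of the kind described (the 30-metre example for the pause control, possibly varying with vehicle speed) might be sketched as below. The speed-dependent term with coefficient `k_speed` is purely illustrative; the text only says the threshold distance may vary with the vehicle speed.

```python
def should_start_pause_control(remaining_m, speed_mps,
                               base_threshold_m=30.0, k_speed=1.5):
    """Return True once the remaining road distance to the stop line
    falls below a threshold that grows with vehicle speed (a faster
    vehicle needs the automatic braking to begin earlier)."""
    return remaining_m <= base_threshold_m + k_speed * speed_mps
```

With these illustrative constants, a stationary-speed threshold of 30 m grows to 45 m at 10 m/s.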
- For example, in the pause control, when the measured distance from the vehicle to the stop line that is the target object reaches, for example, 30 meters (this distance may vary with the vehicle speed), automatic braking by the brake actuator 42 is performed so that the vehicle stops at the stop line. At this time, voice guidance or the like may be given to inform the driver that automatic braking is being performed.
- In the guidance control by voice, when the measured distance from the vehicle to a target object such as an intersection reaches, for example, 100 meters, guidance that the target object lies ahead of the vehicle is given to the driver through the speaker output of the buzzer alarm 50.
- As described above, in the system of this embodiment, the positioning unit 12 can measure the host vehicle position on a road link of the map data stored in the map database 30, based on the GPS signal received by the GPS receiver 16 and on the trajectory estimated using the outputs of the various sensors 18 to 22. In addition, by processing the image captured by the back camera 32, it can recognize objects drawn or installed on the road and measure the host vehicle position based on the position information of the recognized object stored in advance in the map database 30 and on the relative relationship between the vehicle and the recognized object.
- In this way, the GPS receiver 16, the direction sensor 18, the G sensor 20, the vehicle speed sensor 22, and the back camera 32 are used for measuring the host vehicle position and for calculating the remaining distance along the road from the host vehicle to the target object.
- This positioning calculation is subject to detection-parameter errors in the receiver, the various sensors 16 to 22, and the camera 32, and to errors included in the various computations performed during positioning (for example, rounding errors in timing), so an error occurs in the positioning result.
- Because part of this positioning error accumulates as the vehicle moves, the longer the distance the vehicle travels, the larger the error in the measured vehicle position and the lower the positioning accuracy (FIG. 2).
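The accuracy behaviour of FIG. 2 can be modelled minimally: the positioning error grows with the distance travelled since the last correction, and falls back toward a base value whenever the second map matching resets that distance to zero. The constants are illustrative, not values from the patent.

```python
def positioning_error_m(dist_since_fix_m, base_error_m=0.5,
                        growth_per_m=0.01):
    """Error bound that accumulates linearly with the distance
    travelled since the last object-based correction; resetting
    dist_since_fix_m to zero restores the base error."""
    return base_error_m + growth_per_m * dist_since_fix_m
```

This is the quantity the recognition range setting can consume: the larger the accumulated error, the wider the range that must be searched.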
- On the other hand, the position information of the objects stored in the map database 30 is generally surveyed and extremely accurate, so if this position information is used to measure the host vehicle position, the error becomes smaller and the positioning accuracy higher.
- In the system of this embodiment, the host vehicle position is measured according to the method described above: normally, the first map matching is performed based on the trajectory estimated from GPS and the sensor outputs, and when an object drawn or installed on the road is recognized from the camera image, the second map matching is performed based on the recognition result.
- The assist control can be executed according to the host-vehicle position measured by the positioning unit 12 (specifically, according to the distance from the host vehicle to the target object of the assist control).
- The assist control is not performed before the host vehicle reaches a predetermined relative positional relationship with the target object; once that relationship is reached, the assist control can be performed.
- The suspension control, the intersection control, the speed control, and the guidance control are carried out according to the positioning result of the host vehicle, assisting the driver's operation so that the vehicle can be driven safely and appropriately on the road.
- Objects on the road are recognized by processing images captured by the back camera 32 in order to correct the host-vehicle position. Since such objects — stop lines, pedestrian crossings, arrows, no-turn markings, diamond-shaped markings, character strings, deceleration zones, and the like — are scattered along the road, there is no need to perform the object recognition process at all times.
- The position information of the object to be recognized is stored in advance in the map database 30 with a certain degree of accuracy, whereas the host-vehicle position is measured by the method described above and therefore contains an error corresponding to the positioning accuracy.
- In principle, it would be sufficient to perform the object recognition process using the camera image captured at the moment when the measured vehicle position coincides with, or is near, the object position stored in the map database 30. In practice, however, a positioning error as shown in Fig. 3 occurs in the vehicle position as described above; if object recognition were performed only at that single timing, the vehicle might already have passed the object while the recognition process was running, so a situation could occur in which the object cannot be recognized.
- Since the vehicle position is measured based on the trajectory estimated from the outputs of the GPS receiver 16 and the various sensors 18-22, the positioning error of the vehicle position grows larger as the travel distance of the vehicle increases.
- When position measurement is performed after the range is set, even if it is determined that the host vehicle has entered the set range, the positioning error, which grows as the vehicle moves, may mean that the vehicle has not actually entered the range; as a result, there is also a risk that the object is not recognized.
- Therefore, the system of the present embodiment sets the recognition range in which the object is to be recognized based on the relative relationship between the host-vehicle position and the position of the object to be recognized, and on the positioning accuracy of the host-vehicle position.
- The characteristic part of the present embodiment is described below with reference to FIG. 4 through FIG. 7.
- FIG. 4 is a diagram for explaining a method of setting a road recognition range in which the object should be recognized according to the positioning accuracy of the vehicle position in the system of the present embodiment.
- FIG. 5 is a diagram for explaining a method of setting a road recognition range in which the object should be recognized according to the relative distance between the host vehicle and the object in the system of the present embodiment.
- FIG. 6 is a diagram showing a map used for setting the recognition range of the road on which the object should be recognized in the system of this embodiment.
- FIG. 7 shows a flowchart of an example of a control routine executed by the map matching unit 28 in the system of the present embodiment.
- When the positioning accuracy of the host-vehicle position is extremely high, the measured vehicle position is accurate; from its relationship with the position of the object to be recognized stored in the map database 30, the timing at which the host vehicle can recognize the object on the actual road can be grasped precisely. In that case, a very narrow road recognition range is sufficient for recognizing the object using the captured image of the back camera 32. Conversely, the lower the positioning accuracy of the vehicle position, the larger the error in the measured position and the less precisely that timing can be grasped; in that case, the road recognition range must be widened in order to recognize the object using the captured image of the back camera 32 (see FIG. 4).
- The farther the host-vehicle position at the time of setting is from the position of the object ahead of the vehicle, the larger the positioning error becomes by the time the vehicle approaches the object.
- It is therefore appropriate to set the road recognition range narrower as the relative distance (travel distance on the road) between the vehicle position and the object position on the travel lane becomes shorter, and wider as that relative distance becomes longer (see FIG. 5).
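The rule above — narrow the recognition range when positioning is accurate and the object is near, widen it when positioning is poor or the object is far — can be sketched as follows. The linear form and the gain values are illustrative assumptions; the patent instead looks the range up in a pre-stored map (FIG. 6):

```python
def recognition_range_m(positioning_error_m: float,
                        relative_distance_m: float,
                        base_m: float = 10.0,
                        error_gain: float = 2.0,
                        distance_gain: float = 0.01) -> float:
    """Width of the road recognition range: grows with the current
    positioning error (i.e. lower accuracy) and with the relative
    distance along the travel lane to the object to be recognized."""
    return (base_m
            + error_gain * positioning_error_m
            + distance_gain * relative_distance_m)

# Accurate positioning and a nearby object give a narrow range...
print(recognition_range_m(1.0, 100.0))    # 13.0
# ...poor positioning and a distant object give a wide one.
print(recognition_range_m(10.0, 2000.0))  # 50.0
```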
- The map matching unit 28 of the positioning unit 12 measures the position of the host vehicle based on the trajectory estimated from GPS and sensor outputs, or based on the recognition result of an object using the camera image, and obtains the vehicle position (step 100). It then reads from the map database 30 the data of all object candidates on the road range from the vehicle position to the position the vehicle will reach after traveling a predetermined time or distance, or to the position of the nearest target object of the assist control executable by the vehicle, and sets the objects in that road range as the objects to be recognized by the back camera 32 (step 102).
- The objects set as recognition targets for the camera 32 need not be all object candidates whose position data is stored in the map database 30 over that road range.
- They may be limited to those within a predetermined distance (for example, 1 km or 700 m) from the target object of the assist control, to those whose markings are less likely to be worn away, or to those following an arrangement pattern that matches the type of road from the vehicle position to the target object of the assist control. They may also be limited to only as many objects as are needed so that, when the host-vehicle position is corrected by recognizing an object in the camera image, the positioning accuracy required for the assist control executable in the host vehicle is maintained until the vehicle reaches the target object; the correction of the host-vehicle position is then performed accordingly.
- After setting in step 102 the objects to be recognized by the back camera 32, the map matching unit 28 obtains the position data of the object to be recognized from the map database 30 (step 104).
- When a plurality of objects are set, the position data of the object closest to the vehicle position is acquired. Then, the relative distance along the road between the vehicle position acquired in step 100 and the object position acquired in step 104 is calculated (step 106).
- Next, the map matching unit 28 calculates the current positioning accuracy of the host-vehicle position (step 108). This calculation can be performed by substituting parameters into a formula determined experimentally in advance. For example, the initial value may correspond to the GPS accuracy error, which does not accumulate with the travel distance of the vehicle; the highest accuracy is restored each time the vehicle position is corrected based on object recognition in the camera image (second map matching); and thereafter the accuracy decreases with a predetermined gradient (which may be determined in advance), becoming lower as the travel distance after the second map matching grows.
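The accuracy computation in step 108 can be sketched as a small state machine — an initial GPS-level error, degradation with a predetermined gradient per metre travelled, and a reset to the best accuracy on each second map matching. All numeric values are illustrative assumptions:

```python
class PositioningAccuracy:
    """Tracks the positioning error of the host-vehicle position (step 108)."""

    def __init__(self, gps_error_m=5.0, best_error_m=0.5, gradient=0.02):
        self.error_m = gps_error_m    # initial value: plain GPS accuracy
        self.best_error_m = best_error_m
        self.gradient = gradient      # error added per metre travelled

    def on_travel(self, distance_m):
        # Accuracy decreases (error grows) with the distance travelled.
        self.error_m += self.gradient * distance_m

    def on_second_map_matching(self):
        # Correcting against a recognized object restores the best accuracy.
        self.error_m = self.best_error_m

acc = PositioningAccuracy()
acc.on_travel(250.0)          # error: 5.0 + 0.02 * 250 = 10.0
acc.on_second_map_matching()  # error reset to 0.5
acc.on_travel(100.0)          # error: 0.5 + 0.02 * 100 = 2.5
print(acc.error_m)
```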
- The map matching unit 28 then sets the recognition range of the road in which the object is to be recognized using the back camera 32 (step 110).
- To set the recognition range from the relative distance between the host vehicle and the object to be recognized and from the positioning accuracy, the map matching unit 28 stores in advance a map such as the one shown in FIG. 6.
- In this map, the higher the positioning accuracy of the vehicle position, the narrower the road recognition range, and the lower the positioning accuracy, the wider the range; likewise, the shorter the relative distance on the travel lane between the vehicle and the object to be recognized, the narrower the range, and the longer the distance, the wider the range.
- The widest road recognition range that can be set is sufficient if it corresponds not only to the positioning error associated with the positioning accuracy, but also to the maximum positioning error that can arise while the host vehicle travels the relative distance to the object to be recognized.
- The map matching unit 28 sets the road recognition range in step 110 with reference to the map shown in FIG. 6.
- After setting in step 110 the recognition range of the road in which the object is to be recognized, the map matching unit 28 compares the updated vehicle position with the set recognition range, determines whether the host vehicle has entered the set road recognition range, and repeats this determination until an affirmative determination is made.
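The entry check — repeatedly comparing the updated vehicle position with the set recognition range — amounts to a simple interval test; representing positions as one-dimensional distances along the road link is an assumption for illustration:

```python
def has_entered_range(vehicle_pos_m: float,
                      range_start_m: float,
                      range_end_m: float) -> bool:
    """True once the vehicle position (distance along the road)
    lies inside the set road recognition range."""
    return range_start_m <= vehicle_pos_m <= range_end_m

print(has_entered_range(95.0, 100.0, 140.0))   # False: not yet in range
print(has_entered_range(110.0, 100.0, 140.0))  # True: recognition may start
```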
- When it is determined that the host vehicle has entered the set road recognition range, the object to be recognized is judged to be in a situation in which it should be recognized by processing the captured image of the back camera 32; image processing such as edge extraction is then performed on the captured image (step 112), and the image processing result is compared with the feature data of the object to be recognized in order to recognize the object (step 114).
- When the map matching unit 28 recognizes the object to be recognized while the host vehicle is located within the set road recognition range (an affirmative determination in step 114), it detects the object within the road recognition range (step 116), grasps the relative relationship between the host vehicle and the recognized object identified by the image processing, and performs the second map matching, correcting the vehicle position to the position relative to the recognized object.
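The correction itself can be sketched as adding the image-derived offset of the vehicle from the recognized object to the object's highly accurate map position; the 2-D coordinates and the function name are illustrative assumptions:

```python
def second_map_matching(object_map_pos, vehicle_offset):
    """Corrected vehicle position = object position from the map database
    plus the vehicle's offset from the object as measured by image
    processing. Both arguments are (x, y) tuples in metres."""
    ox, oy = object_map_pos
    dx, dy = vehicle_offset
    return (ox + dx, oy + dy)

# Object surveyed at (1000.0, 20.0); vehicle measured 8 m past it along x.
print(second_map_matching((1000.0, 20.0), (8.0, 0.0)))  # (1008.0, 20.0)
```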
- When the map matching unit 28 does not recognize the object while the host vehicle is located within the set road recognition range and determines, from the comparison between the updated vehicle position and the set road recognition range, that the vehicle has passed beyond the range (a negative determination in step 114), it judges that there is no recognizable object within the road recognition range (step 118) and terminates the process without performing the second map matching described above.
- Thereafter, until the host vehicle reaches the target object of the assist control or its vicinity, the map matching unit 28 acquires position data for each set object to be recognized, sets the road recognition range in which the recognition process is to be executed, and executes the same processing as above.
- In this way, in the system of the present embodiment, the road recognition range in which the object recognition process using the captured image of the back camera 32 is performed can be set based on the positioning accuracy of the host-vehicle position, and on the measured vehicle position and the position of the object to be recognized stored in advance in the map database 30 (specifically, the relative distance between the two on the travel lane).
- A narrower road recognition range is set as the positioning accuracy is higher, and a wider range as the positioning accuracy is lower.
- Likewise, the shorter the relative distance on the travel lane between the host vehicle and the object to be recognized, the narrower the road recognition range, and the longer the distance, the wider the range. The object recognition process using the image captured by the back camera 32 is then performed within that road recognition range.
- As described above, the object recognition process using the image captured by the back camera 32 is performed only while the vehicle travels within the predetermined road recognition range set based on the relative distance between the vehicle and the object to be recognized and on the positioning accuracy of the vehicle position; recognition processing for objects outside that range is not performed. The system of the present embodiment can therefore reduce the processing load required for object recognition compared with a system that performs the object recognition process at all times using the image captured by the back camera 32.
- the range of the positioning error of the own vehicle position is small when the positioning accuracy is high, but is large when the positioning accuracy is low.
- Since the position of the host vehicle is measured based on a travel trajectory derived from the vehicle speed, steering angle, and so on, the longer the travel distance of the vehicle, the lower the positioning accuracy and the wider the range of positioning errors. Therefore, if, as in the system of the present embodiment, a wider road recognition range is set as the positioning accuracy of the vehicle position falls and as the relative distance on the travel lane between the vehicle and the object grows, then even though the object recognition process is not performed at all times and its timing is limited, a situation in which the host vehicle fails to recognize an object it should recognize can be prevented, avoiding degradation of object recognition performance.
- When an object is recognized, the second map matching is performed, correcting the host-vehicle position using the position data of the object stored in advance in the map database 30.
- The position data of objects stored in the map database 30 is data of extremely high accuracy with almost no error. Therefore, when the vehicle position is corrected based on the recognized object as in the system of the present embodiment, using the position data of the object recognized from the captured image of the back camera 32, the host-vehicle position can be measured accurately, and the positioning accuracy of the host vehicle can be raised each time an object is recognized.
- Every object on the road up to the target object toward which the host vehicle will travel may be recognized by processing the captured image of the back camera 32, with the vehicle position corrected at each recognition. Alternatively, only a specific subset of all the objects may be recognized by image processing, with the vehicle position corrected only at each recognition of those specific objects.
- In that case, the objects are selected so as to ensure the positioning accuracy required to appropriately execute the assist control executable on the host vehicle, and the selected objects are recognized by image processing to correct the host-vehicle position (second map matching); this requirement serves as the standard for the selection.
- In the embodiment described above, the positioning unit 12, together with the map database 30 storing the position data of the objects, corresponds to the "object recognition device" described in the claims, and the back camera 32 corresponds to the "imaging means" described in the claims. Measuring the position of the host vehicle based both on GPS and on the travel trajectory of the host vehicle, and based on the recognition result of objects using the camera image, corresponds to the "predetermined method" described in the claims.
- The "positioning means" described in the claims is realized by the map matching unit 28 executing the process of step 100 in the routine shown in FIG. 7; the "positioning accuracy calculation means" by executing the process of step 108; the "recognition range setting means" by executing the process of step 110; the "object recognition means" by recognizing the object to be recognized within the road range set in step 112; and the "recognition target setting means" by executing the process of step 102.
- In the embodiment described above, objects on the road are recognized using the image captured by the back camera 32 disposed at the rear of the vehicle, but the present invention is not limited to this; such objects may be recognized based on the captured image of a camera disposed at the front of the vehicle or on information sent from external infrastructure.
- In the embodiment described above, the road recognition range in which the recognition process of the object using the captured image of the back camera 32 is performed is set based on the positioning accuracy of the vehicle calculated at a certain time, the measured vehicle position, and the position of the object to be recognized stored in advance in the map database 30 (specifically, the relative distance between the two on the travel lane), and the recognition range is not reset before the host vehicle enters it. However, after the setting, the recognition range may be reset, for example at predetermined time intervals until the vehicle enters it, based on the updated positioning accuracy, the updated vehicle position, and the object position.
- an object on the road on which the host vehicle will travel in the future is set as an object to be recognized for correcting the position of the host vehicle.
- The correction of the vehicle position may be performed separately for the longitudinal direction along the travel lane and for the lateral direction perpendicular to it.
- The types of objects effective for correcting the vehicle position in the longitudinal direction often differ from those effective for the lateral correction. Therefore, if the recognition targets for vehicle-position correction are set with the longitudinal and lateral corrections distinguished in this way, the correction becomes more efficient and the processing load is reduced.
- In the embodiment described above, an object on the road recognized using the camera image is used as needed for the second map matching, which corrects the vehicle position obtained from the travel trajectory based on GPS, vehicle speed, steering angle, and so on; however, the present invention is not limited to this.
- The recognized object may also be used for purposes other than map matching, for example in a system that issues a warning when the vehicle enters an area where entry of vehicles is prohibited, as indicated by a road marking.
- In the embodiment described above, the map database 30 is mounted on the vehicle. However, the map database 30 may instead be provided at a center, with the vehicle accessing it by communication each time to read the stored data.
- In the embodiment described above, the suspension control, the intersection control, the speed control, and the guidance control are cited as examples of the assist control.
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Claims
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP07743414.0A EP2019288B1 (en) | 2006-05-17 | 2007-05-15 | Object recognition device |
| US12/067,100 US7898437B2 (en) | 2006-05-17 | 2007-05-15 | Object recognition device |
| CN2007800009771A CN101346603B (zh) | 2006-05-17 | 2007-05-15 | 对象物识别装置 |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2006138256A JP4724043B2 (ja) | 2006-05-17 | 2006-05-17 | 対象物認識装置 |
| JP2006-138256 | 2006-05-17 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2007132860A1 true WO2007132860A1 (ja) | 2007-11-22 |
Family
ID=38693949
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2007/059979 Ceased WO2007132860A1 (ja) | 2006-05-17 | 2007-05-15 | 対象物認識装置 |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US7898437B2 (ja) |
| EP (1) | EP2019288B1 (ja) |
| JP (1) | JP4724043B2 (ja) |
| KR (1) | KR101018620B1 (ja) |
| CN (1) | CN101346603B (ja) |
| WO (1) | WO2007132860A1 (ja) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102150015B (zh) * | 2008-10-17 | 2013-09-25 | 三菱电机株式会社 | 导航装置 |
| CN108242163A (zh) * | 2016-12-23 | 2018-07-03 | 卢卡斯汽车股份有限公司 | 机动车的驾驶员辅助系统 |
Families Citing this family (102)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008009915A (ja) * | 2006-06-30 | 2008-01-17 | Pioneer Electronic Corp | 道路標識判定装置、方法及びプログラム |
| JP4724079B2 (ja) * | 2006-09-13 | 2011-07-13 | トヨタ自動車株式会社 | 対象物認識装置 |
| AU2007306939B2 (en) | 2006-10-11 | 2012-06-07 | Tagmotion Pty Limited | Method and apparatus for managing multimedia files |
| JP4980118B2 (ja) * | 2007-04-06 | 2012-07-18 | アルパイン株式会社 | 車載用ナビゲーション装置 |
| JP2009043186A (ja) * | 2007-08-10 | 2009-02-26 | Denso Corp | 情報記憶装置、及び走行環境情報認識装置 |
| JP2009180631A (ja) * | 2008-01-31 | 2009-08-13 | Denso It Laboratory Inc | ナビゲーション装置、ナビゲーション方法およびプログラム |
| CN101681549A (zh) * | 2008-03-11 | 2010-03-24 | 松下电器产业株式会社 | 标签传感器系统、传感器装置、物体位置推测装置及物体位置推测方法 |
| JP5001911B2 (ja) * | 2008-06-30 | 2012-08-15 | アイシン・エィ・ダブリュ株式会社 | 停止義務地点学習装置及び停止義務地点学習プログラム、並びにこれを用いたナビゲーション装置 |
| TWI387775B (zh) * | 2008-12-18 | 2013-03-01 | Ind Tech Res Inst | 定位系統與定位方法 |
| JP5548212B2 (ja) * | 2009-09-29 | 2014-07-16 | パナソニック株式会社 | 横断歩道標示検出方法および横断歩道標示検出装置 |
| JP5062497B2 (ja) * | 2010-03-31 | 2012-10-31 | アイシン・エィ・ダブリュ株式会社 | 風景画像認識を用いた自車位置検出システム |
| US9140792B2 (en) * | 2011-06-01 | 2015-09-22 | GM Global Technology Operations LLC | System and method for sensor based environmental model construction |
| JP5742559B2 (ja) * | 2011-08-01 | 2015-07-01 | アイシン・エィ・ダブリュ株式会社 | 位置判定装置およびナビゲーション装置並びに位置判定方法,プログラム |
| US9182761B2 (en) * | 2011-08-25 | 2015-11-10 | Nissan Motor Co., Ltd. | Autonomous driving control system for vehicle |
| US9235766B2 (en) | 2011-10-20 | 2016-01-12 | International Business Machines Corporation | Optimizing the detection of objects in images |
| KR101919366B1 (ko) * | 2011-12-22 | 2019-02-11 | 한국전자통신연구원 | 차량 내부 네트워크 및 영상 센서를 이용한 차량 위치 인식 장치 및 그 방법 |
| US9043133B2 (en) * | 2011-12-29 | 2015-05-26 | Intel Corporation | Navigation systems and associated methods |
| JP2013217799A (ja) * | 2012-04-10 | 2013-10-24 | Honda Elesys Co Ltd | 物体検知装置、物体検知方法、物体検知プログラム、及び動作制御システム |
| US9111173B2 (en) | 2012-04-23 | 2015-08-18 | Honda Motor Co., Ltd. | Learning part-based models of objects |
| US9123152B1 (en) | 2012-05-07 | 2015-09-01 | Google Inc. | Map reports from vehicles in the field |
| JP5957359B2 (ja) * | 2012-10-19 | 2016-07-27 | 日立オートモティブシステムズ株式会社 | ステレオ画像処理装置及びステレオ画像処理方法 |
| JP5895815B2 (ja) * | 2012-10-30 | 2016-03-30 | トヨタ自動車株式会社 | 残距離算出装置、残距離算出方法及び運転支援装置 |
| US9495602B2 (en) * | 2013-10-23 | 2016-11-15 | Toyota Motor Engineering & Manufacturing North America, Inc. | Image and map-based detection of vehicles at intersections |
| DE102014002150B3 (de) * | 2014-02-15 | 2015-07-23 | Audi Ag | Verfahren zur Ermittlung der absoluten Position einer mobilen Einheit und mobile Einheit |
| US9465388B1 (en) | 2014-03-03 | 2016-10-11 | Google Inc. | Remote assistance for an autonomous vehicle in low confidence situations |
| US12510890B2 (en) | 2014-03-03 | 2025-12-30 | Waymo Llc | Remote assistance for autonomous vehicles in predetermined situations |
| US9720410B2 (en) | 2014-03-03 | 2017-08-01 | Waymo Llc | Remote assistance for autonomous vehicles in predetermined situations |
| US9547989B2 (en) | 2014-03-04 | 2017-01-17 | Google Inc. | Reporting road event data and sharing with other vehicles |
| JP6496982B2 (ja) * | 2014-04-11 | 2019-04-10 | 株式会社デンソー | 認知支援システム |
| RU2633641C1 (ru) * | 2014-05-20 | 2017-10-16 | Ниссан Мотор Ко., Лтд. | Устройство обнаружения цели и способ обнаружения цели |
| WO2015177864A1 (ja) * | 2014-05-20 | 2015-11-26 | 日産自動車株式会社 | 信号機認識装置及び信号機認識方法 |
| JP5984154B2 (ja) * | 2014-09-24 | 2016-09-06 | 三菱電機株式会社 | 運転支援装置 |
| EP3211374B1 (en) * | 2014-10-22 | 2020-12-16 | Nissan Motor Co., Ltd | Travel route calculation device |
| CN107076565B (zh) * | 2014-10-22 | 2020-03-17 | 日产自动车株式会社 | 行驶路径运算装置 |
| US9569693B2 (en) * | 2014-12-31 | 2017-02-14 | Here Global B.V. | Method and apparatus for object identification and location correlation based on received images |
| EP3842747A1 (en) * | 2015-02-10 | 2021-06-30 | Mobileye Vision Technologies Ltd. | Sparse map for autonomous vehicle navigation |
| US20160259034A1 (en) * | 2015-03-04 | 2016-09-08 | Panasonic Intellectual Property Management Co., Ltd. | Position estimation device and position estimation method |
| CN107615201B (zh) * | 2015-05-28 | 2018-11-20 | 日产自动车株式会社 | 自身位置估计装置及自身位置估计方法 |
| KR20180012811A (ko) | 2015-06-26 | 2018-02-06 | 닛산 지도우샤 가부시키가이샤 | 차량 위치 판정 장치 및 차량 위치 판정 방법 |
| JP6376059B2 (ja) * | 2015-07-06 | 2018-08-22 | トヨタ自動車株式会社 | 自動運転車両の制御装置 |
| KR20180021159A (ko) * | 2015-07-13 | 2018-02-28 | 닛산 지도우샤 가부시키가이샤 | 신호기 인식 장치 및 신호기 인식 방법 |
| CN106569245B (zh) * | 2015-10-10 | 2021-01-05 | 腾讯科技(深圳)有限公司 | 一种车辆定位方法及装置 |
| CN105355071B (zh) * | 2015-11-16 | 2017-09-29 | 北京握奇智能科技有限公司 | 一种车载终端实路动态定位精度的自动化评估系统和方法 |
| US9494438B1 (en) | 2015-12-15 | 2016-11-15 | Honda Motor Co., Ltd. | System and method for verifying map data for a vehicle |
| CN106996785B (zh) * | 2016-01-25 | 2019-12-10 | 北京四维图新科技股份有限公司 | 一种对导航数据进行更新的方法及装置 |
| RU2719497C2 (ru) * | 2016-01-29 | 2020-04-20 | Ниссан Мотор Ко., Лтд. | Способ для управления движением транспортного средства и устройство для управления движением транспортного средства |
| EP3410418B1 (en) | 2016-01-29 | 2020-04-15 | Nissan Motor Co., Ltd. | Vehicle travel control method and vehicle travel control device |
| US10670418B2 (en) * | 2016-05-04 | 2020-06-02 | International Business Machines Corporation | Video based route recognition |
| DE102016208488A1 (de) * | 2016-05-18 | 2017-11-23 | Robert Bosch Gmbh | Verfahren und Vorrichtung zur Lokalisierung eines Fahrzeugs |
| JP6432116B2 (ja) * | 2016-05-23 | 2018-12-05 | 本田技研工業株式会社 | 車両位置特定装置、車両制御システム、車両位置特定方法、および車両位置特定プログラム |
| JP6645910B2 (ja) * | 2016-05-31 | 2020-02-14 | 株式会社デンソー | 位置推定装置 |
| DE102016211420A1 (de) * | 2016-06-27 | 2017-12-28 | Robert Bosch Gmbh | Verfahren zum Bereitstellen einer Lokalisierungsinformation zum Lokalisieren eines Fahrzeugs an einem Lokalisierungsort und Verfahren zum Bereitstellen zumindest einer Information zum Lokalisieren eines Fahrzeugs durch ein anderes Fahrzeug |
| US10133942B2 (en) | 2016-07-05 | 2018-11-20 | Nauto Global Limited | System and method for automatic driver identification |
| US10209081B2 (en) * | 2016-08-09 | 2019-02-19 | Nauto, Inc. | System and method for precision localization and mapping |
| US10733460B2 (en) | 2016-09-14 | 2020-08-04 | Nauto, Inc. | Systems and methods for safe route determination |
| US10246014B2 (en) | 2016-11-07 | 2019-04-02 | Nauto, Inc. | System and method for driver distraction determination |
| WO2018138782A1 (ja) * | 2017-01-24 | 2018-08-02 | 富士通株式会社 | 情報処理装置、特徴点抽出プログラムおよび特徴点抽出方法 |
| JP6579119B2 (ja) * | 2017-01-24 | 2019-09-25 | トヨタ自動車株式会社 | 車両制御装置 |
| GB2559196B (en) * | 2017-01-31 | 2021-11-17 | Sony Europe Bv | Determining a position of a vehicle on a track |
| CN108303103B (zh) * | 2017-02-07 | 2020-02-07 | 腾讯科技(深圳)有限公司 | 目标车道的确定方法和装置 |
| JP6589926B2 (ja) * | 2017-04-07 | 2019-10-16 | トヨタ自動車株式会社 | 物体検出装置 |
| US11008039B2 (en) * | 2017-04-12 | 2021-05-18 | Toyota Jidosha Kabushiki Kaisha | Lane change assist apparatus for vehicle |
| US10262234B2 (en) * | 2017-04-24 | 2019-04-16 | Baidu Usa Llc | Automatically collecting training data for object recognition with 3D lidar and localization |
| EP3637053A4 (en) * | 2017-05-19 | 2021-03-03 | Pioneer Corporation | MEASURING DEVICE, MEASURING PROCESS AND PROGRAM |
| WO2018221453A1 (ja) * | 2017-05-31 | 2018-12-06 | パイオニア株式会社 | 出力装置、制御方法、プログラム及び記憶媒体 |
| JP2018203017A (ja) * | 2017-06-02 | 2018-12-27 | 本田技研工業株式会社 | 車両制御装置、車両制御方法、およびプログラム |
| WO2018229550A1 (en) | 2017-06-16 | 2018-12-20 | Nauto Global Limited | System and method for adverse vehicle event determination |
| CN107339996A (zh) * | 2017-06-30 | 2017-11-10 | 百度在线网络技术(北京)有限公司 | 车辆自定位方法、装置、设备及存储介质 |
| EP3428577A1 (en) | 2017-07-12 | 2019-01-16 | Veoneer Sweden AB | A driver assistance system and method |
| KR102138094B1 (ko) * | 2017-07-27 | 2020-07-27 | 닛산 지도우샤 가부시키가이샤 | 운전 지원 차량의 자기 위치 보정 방법 및 자기 위치 보정 장치 |
| JP7013727B2 (ja) * | 2017-08-25 | 2022-02-01 | 株式会社デンソー | 車両制御装置 |
| JP6533269B2 (ja) * | 2017-10-27 | 2019-06-19 | 三菱電機株式会社 | 車両走行制御装置および車両走行制御方法 |
| JP6859927B2 (ja) * | 2017-11-06 | 2021-04-14 | トヨタ自動車株式会社 | 自車位置推定装置 |
| CN108010355B (zh) * | 2018-01-02 | 2020-08-04 | 湖北汽车工业学院 | 交通灯匹配过程中运动车辆定位滤波及预测方法 |
| WO2019169031A1 (en) | 2018-02-27 | 2019-09-06 | Nauto, Inc. | Method for determining driving policy |
| JP6881369B2 (ja) * | 2018-03-26 | 2021-06-02 | トヨタ自動車株式会社 | 自車位置推定装置 |
| JP7054878B2 (ja) * | 2018-03-28 | 2022-04-15 | パナソニックIpマネジメント株式会社 | 管理装置、管理システム、および位置補正方法 |
| US12060074B2 (en) * | 2018-03-30 | 2024-08-13 | Toyota Motor Europe | System and method for adjusting external position information of a vehicle |
| CN110553639B (zh) * | 2018-06-04 | 2021-04-27 | 百度在线网络技术(北京)有限公司 | 用于生成位置信息的方法和装置 |
| JP2019212187A (ja) * | 2018-06-08 | 2019-12-12 | スズキ株式会社 | 車両用運転支援装置 |
| CN110717350B (zh) * | 2018-07-11 | 2024-07-26 | 沈阳美行科技股份有限公司 | 一种行车轨迹的校正方法及校正装置 |
| US11829143B2 (en) * | 2018-11-02 | 2023-11-28 | Aurora Operations, Inc. | Labeling autonomous vehicle data |
| CN110379191B (zh) * | 2018-11-08 | 2020-12-22 | 北京京东尚科信息技术有限公司 | 一种为无人设备推送道路信息的方法和装置 |
| CN111212375B (zh) * | 2018-11-20 | 2021-08-03 | 华为技术有限公司 | 定位位置调整方法及其装置 |
| US11630197B2 (en) * | 2019-01-04 | 2023-04-18 | Qualcomm Incorporated | Determining a motion state of a target object |
| JP7120036B2 (ja) * | 2019-01-16 | 2022-08-17 | Toyota Motor Corporation | Automatic parking management device |
| WO2020158262A1 (ja) * | 2019-01-30 | 2020-08-06 | NEC Corporation | Deterioration diagnosis device, deterioration diagnosis system, deterioration diagnosis method, and recording medium |
| JP2020138653A (ja) * | 2019-02-28 | 2020-09-03 | Honda Motor Co., Ltd. | Lane departure prevention assistance device for vehicle |
| CN112149659B (zh) * | 2019-06-27 | 2021-11-09 | Zhejiang SenseTime Technology Development Co., Ltd. | Positioning method and device, electronic device, and storage medium |
| CN112229417B (zh) * | 2019-07-17 | 2023-03-24 | Beijing National New Energy Vehicle Technology Innovation Center Co., Ltd. | Vehicle positioning method and device, computer equipment, and storage medium |
| KR102332494B1 (ko) * | 2019-09-11 | 2021-11-30 | Korea Expressway Corporation | Apparatus and method for generating distribution information on the positioning difference between precise positioning information based on images and a precision map and GNSS positioning information |
| EP3825731B1 (de) * | 2019-11-21 | 2022-01-05 | Sick Ag | Optoelectronic safety sensor and method for safely determining one's own position |
| JP7505203B2 (ja) * | 2020-02-25 | 2024-06-25 | Toyota Motor Corporation | Driving assistance device |
| ES2912058T3 (es) * | 2020-03-05 | 2022-05-24 | Sick Ag | Vehicle navigation and virtual track guidance device |
| US11782451B1 (en) | 2020-04-21 | 2023-10-10 | Aurora Operations, Inc. | Training machine learning model for controlling autonomous vehicle |
| JP7384131B2 (ja) * | 2020-08-31 | 2023-11-21 | Toyota Motor Corporation | Vehicle driving assistance device, vehicle driving assistance method, and program |
| JP7467299B2 (ja) * | 2020-09-17 | 2024-04-15 | Toshiba Corporation | Position management system, position specifying device, and position specifying method |
| JP7287373B2 (ja) * | 2020-10-06 | 2023-06-06 | Toyota Motor Corporation | Map generation device, map generation method, and computer program for map generation |
| CN113096150A (zh) * | 2021-03-31 | 2021-07-09 | Beijing Wanji Technology Co., Ltd. | Travel trajectory generation method and system, storage medium, and electronic device |
| CN115098605B (zh) * | 2022-05-17 | 2025-10-03 | AutoNavi Software Co., Ltd. | Construction method and device for a distance ground-truth library, electronic device, and storage medium |
| DE102023211084A1 (de) * | 2023-11-08 | 2025-05-08 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method for generating training data sets for training an evaluation algorithm, method for training an evaluation algorithm, and method for evaluating an alignment of two map data sets |
| CN118155166B (zh) * | 2024-02-28 | 2024-10-01 | Xiaomi Automobile Technology Co., Ltd. | Speed bump recognition method and device, storage medium, and vehicle |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH1137776A (ja) * | 1997-07-24 | 1999-02-12 | Denso Corp | Navigation device for vehicle |
| JP2000246653A (ja) | 1999-02-24 | 2000-09-12 | Amada Co Ltd | Deburring tool |
| JP2005265494A (ja) * | 2004-03-17 | 2005-09-29 | Hitachi Ltd | Vehicle position estimation device and driving support device using the same |
| JP2006138256A (ja) | 2004-11-12 | 2006-06-01 | Piolax Inc | Fuel gauge mounting structure |
Family Cites Families (41)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5469360A (en) * | 1991-03-10 | 1995-11-21 | Matsushita Electric Industrial Co., Ltd. | Vehicle position detecting apparatus |
| US5638116A (en) * | 1993-09-08 | 1997-06-10 | Sumitomo Electric Industries, Ltd. | Object recognition apparatus and method |
| KR100224326B1 (ko) * | 1995-12-26 | 1999-10-15 | Mori Haruo | Navigation device for vehicle |
| JP3866328B2 (ja) * | 1996-06-06 | 2007-01-10 | Fuji Heavy Industries Ltd. | Device for recognizing three-dimensional objects around a vehicle |
| JPH11137776A (ja) | 1997-11-08 | 1999-05-25 | Takasago Electric Ind Co Ltd | Slot machine |
| DE19842176A1 (de) * | 1998-09-15 | 2000-03-16 | Bosch Gmbh Robert | Method and device for traffic sign recognition and navigation |
| DE19855400A1 (de) * | 1998-12-01 | 2000-06-15 | Bosch Gmbh Robert | Method and device for determining a future course range of a vehicle |
| US6246363B1 (en) * | 1998-12-10 | 2001-06-12 | Hughes Electronics Corporation | Method and system for incorporating two-way ranging navigation as a calibration reference for GPS |
| JP2000346653A (ja) | 1999-06-01 | 2000-12-15 | Matsushita Electric Ind Co Ltd | Vehicle travel support method and vehicle travel support device |
| JP2003506785A (ja) * | 1999-08-06 | 2003-02-18 | Roadrisk Technologies LLC | Method and apparatus for detecting stationary objects |
| US6587692B1 (en) * | 2000-03-30 | 2003-07-01 | Lucent Technologies Inc. | Location determination using weighted ridge regression |
| JP2001289654A (ja) * | 2000-04-11 | 2001-10-19 | Equos Research Co Ltd | Navigation device, control method for a navigation device, and recording medium storing the program |
| JP4624594B2 (ja) * | 2000-06-28 | 2011-02-02 | Panasonic Corporation | Object recognition method and object recognition device |
| US20050149251A1 (en) * | 2000-07-18 | 2005-07-07 | University Of Minnesota | Real time high accuracy geospatial database for onboard intelligent vehicle applications |
| US6891960B2 (en) * | 2000-08-12 | 2005-05-10 | Facet Technology | System for road sign sheeting classification |
| CN1372127A (zh) * | 2001-01-23 | 2002-10-02 | Lin Ching-Fang | Improved positioning and data integration method and system |
| JP3598986B2 (ja) * | 2001-03-22 | 2004-12-08 | Nissan Motor Co., Ltd. | Map information providing device and map information providing method |
| JP4613451B2 (ja) | 2001-06-29 | 2011-01-19 | Shin-Etsu Handotai Co., Ltd. | Method for manufacturing an epitaxial wafer |
| JP2003123197A (ja) | 2001-10-16 | 2003-04-25 | Alpine Electronics Inc | Device for recognizing road markings and the like |
| EP1502079A2 (en) * | 2002-04-30 | 2005-02-02 | Telmap Ltd. | Dynamic navigation system |
| KR100495635B1 (ko) * | 2002-09-02 | 2005-06-16 | LG Electronics Inc. | Method for correcting position error in a navigation system |
| US6927699B2 (en) * | 2002-12-05 | 2005-08-09 | Denso Corporation | Object recognition apparatus for vehicle, and inter-vehicle distance control unit |
| JP3772838B2 (ja) * | 2003-02-12 | 2006-05-10 | Toyota Motor Corporation | Vehicle driving support device and vehicle control device |
| JP4086298B2 (ja) * | 2003-06-17 | 2008-05-14 | Alpine Electronics, Inc. | Object detection method and device |
| JP2005017054A (ja) | 2003-06-25 | 2005-01-20 | Shimadzu Corp | Device for measuring fracture toughness by three-point bending test |
| JP4321142B2 (ja) | 2003-07-02 | 2009-08-26 | Nissan Motor Co., Ltd. | Sign recognition device |
| US20050107946A1 (en) * | 2003-11-13 | 2005-05-19 | Takanori Shimizu | Vehicle navigation apparatus |
| JP4134894B2 (ja) | 2003-12-09 | 2008-08-20 | Denso Corporation | Vehicle driving support device |
| CN1641712A (zh) * | 2004-01-15 | 2005-07-20 | Yu Jun | Vehicle positioning system and its dedicated on-board device and roadside facilities |
| JP4483305B2 (ja) * | 2004-01-16 | 2010-06-16 | Toyota Motor Corporation | Vehicle periphery monitoring device |
| DE102004010197B4 (de) * | 2004-03-02 | 2015-04-16 | Sick Ag | Method for checking the function of a position-determination or environment-sensing device of a vehicle, or for checking a digital map |
| US7486802B2 (en) * | 2004-06-07 | 2009-02-03 | Ford Global Technologies Llc | Adaptive template object classification system with a template generator |
| JP4557288B2 (ja) * | 2005-01-28 | 2010-10-06 | Aisin AW Co., Ltd. | Image recognition device and image recognition method, and position specifying device, vehicle control device, and navigation device using the same |
| JP2006208223A (ja) * | 2005-01-28 | 2006-08-10 | Aisin Aw Co Ltd | Vehicle position recognition device and vehicle position recognition method |
| KR101047719B1 (ko) * | 2005-02-16 | 2011-07-08 | LG Electronics Inc. | Method and apparatus for guiding the travel route of a moving object in a navigation system |
| US7451041B2 (en) * | 2005-05-06 | 2008-11-11 | Facet Technology Corporation | Network-based navigation system having virtual drive-thru advertisements integrated with actual imagery from along a physical route |
| US20070055441A1 (en) * | 2005-08-12 | 2007-03-08 | Facet Technology Corp. | System for associating pre-recorded images with routing information in a navigation system |
| KR100721560B1 (ko) * | 2005-11-30 | 2007-05-23 | Electronics and Telecommunications Research Institute | System and method for providing three-dimensional vehicle information with an arbitrary viewpoint |
| JP4935145B2 (ja) * | 2006-03-29 | 2012-05-23 | Denso Corporation | Car navigation device |
| US20080243378A1 (en) * | 2007-02-21 | 2008-10-02 | Tele Atlas North America, Inc. | System and method for vehicle navigation and piloting including absolute and relative coordinates |
| JP2011511281A (ja) * | 2008-02-04 | 2011-04-07 | Tele Atlas North America, Inc. | Method for map matching with objects detected by sensors |
- 2006-05-17 JP JP2006138256A patent/JP4724043B2/ja active Active
- 2007-05-15 WO PCT/JP2007/059979 patent/WO2007132860A1/ja not_active Ceased
- 2007-05-15 EP EP07743414.0A patent/EP2019288B1/en not_active Ceased
- 2007-05-15 CN CN2007800009771A patent/CN101346603B/zh not_active Expired - Fee Related
- 2007-05-15 US US12/067,100 patent/US7898437B2/en not_active Expired - Fee Related
- 2007-05-15 KR KR1020087006354A patent/KR101018620B1/ko not_active Expired - Fee Related
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP2019288A4 |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102150015B (zh) * | 2008-10-17 | 2013-09-25 | Mitsubishi Electric Corporation | Navigation device |
| CN108242163A (zh) * | 2016-12-23 | 2018-07-03 | Lucas Automotive GmbH | Driver assistance system for a motor vehicle |
| CN108242163B (zh) * | 2016-12-23 | 2022-04-15 | ZF Active Safety GmbH | Driver assistance system, motor vehicle, method for outputting traffic information, and medium |
Also Published As
| Publication number | Publication date |
|---|---|
| EP2019288A4 (en) | 2012-03-28 |
| EP2019288B1 (en) | 2013-12-18 |
| CN101346603B (zh) | 2011-08-03 |
| KR101018620B1 (ko) | 2011-03-03 |
| JP2007309757A (ja) | 2007-11-29 |
| CN101346603A (zh) | 2009-01-14 |
| US20100061591A1 (en) | 2010-03-11 |
| JP4724043B2 (ja) | 2011-07-13 |
| KR20080037712A (ko) | 2008-04-30 |
| EP2019288A1 (en) | 2009-01-28 |
| US7898437B2 (en) | 2011-03-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP4724043B2 (ja) | Object recognition device | |
| JP5162103B2 (ja) | Support control device | |
| JP4938351B2 (ja) | Positioning information update device for vehicle | |
| JP4680131B2 (ja) | Own-vehicle position measurement device | |
| CN107783535B (zh) | Vehicle control device | |
| JP6036371B2 (ja) | Vehicle driving support system and driving support method | |
| JP4977218B2 (ja) | Own-vehicle position measurement device | |
| JP6943127B2 (ja) | Position correction method, vehicle control method, and position correction device | |
| JP2011012965A (ja) | Lane determination device and navigation system | |
| JP2018159752A (ja) | Map information learning method and map information learning device | |
| JP2007309670A (ja) | Vehicle position detection device | |
| JP4724079B2 (ja) | Object recognition device | |
| JP2008201393A (ja) | Vehicle speed control device | |
| JP4703544B2 (ja) | Driving support device | |
| JP2018189462A (ja) | Travel lane identification device | |
| CN118302804B (zh) | Parking assistance method and parking assistance device | |
| JP2006321421A (ja) | Vehicle travel control device | |
| JP2023151311A (ja) | Travel control method and travel control device | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 200780000977.1; Country of ref document: CN |
| | WWE | Wipo information: entry into national phase | Ref document number: 2007743414; Country of ref document: EP. Ref document number: 1020087006354; Country of ref document: KR |
| | WWE | Wipo information: entry into national phase | Ref document number: 12067100; Country of ref document: US |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 07743414; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |