
US20160239973A1 - Moving body position estimation device and moving body position estimation method - Google Patents

Moving body position estimation device and moving body position estimation method

Info

Publication number
US20160239973A1
US20160239973A1 (application US 15/031,295)
Authority
US
United States
Prior art keywords
edge
image
moving body
evaluation value
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US15/031,295
Other versions
US9424649B1 (en)
Inventor
Shinya Tanaka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nissan Motor Co Ltd
Original Assignee
Nissan Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd filed Critical Nissan Motor Co Ltd
Assigned to NISSAN MOTOR CO., LTD. reassignment NISSAN MOTOR CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TANAKA, SHINYA
Publication of US20160239973A1 publication Critical patent/US20160239973A1/en
Application granted granted Critical
Publication of US9424649B1 publication Critical patent/US9424649B1/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G06T 7/0044
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06K 9/4604
    • G06T 7/0085
    • G06T 7/204
    • G06T 7/248 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving reference images or patches
    • G06T 7/277 Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V 10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06F 1/163 Wearable computers, e.g. on a belt
    • G06F 3/012 Head tracking input arrangements
    • G06T 1/0007 Image acquisition
    • G06T 2207/10016 Video; Image sequence
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G06T 2207/30241 Trajectory
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T 7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G06V 20/20 Scenes; Scene-specific elements in augmented reality scenes
    • G08G 1/123 Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams

Definitions

  • the present invention relates to a moving body position estimation device and a moving body position estimation method.
  • a matching of the edges between an edge image generated from an image captured by a camera and a virtual image generated from a known three dimensional map is carried out for each particle of a particle filter, and the position of a moving body is stochastically estimated from a likelihood distribution in which the likelihood is increased as the amount of overlapping edges increases, and decreased as the amount of overlapping edges decreases.
  • the object of the present invention is to provide a moving body position estimation device and a moving body position estimation method that can stably estimate the position of a moving body.
  • an edge image and a virtual image for each particle are compared, a higher evaluation value is assigned if there are more overlapping edges between the images, and a higher evaluation value is assigned if there are more edges that are not overlapping edges and for which an edge-to-edge distance, which is the distance between an edge in the edge image and an edge in the virtual image, is less than or equal to a predetermined value.
  • the position of a moving body can be stably estimated.
  • FIG. 1 is a block diagram of a moving body position estimation device in accordance with a first embodiment.
  • FIG. 2 is a control block diagram of the self-position estimation in the computer 3 of the first embodiment.
  • FIG. 3 is an explanatory view illustrating the dispersal method of particles by the moving body position estimation device of the first embodiment.
  • FIGS. 4A and 4B are explanatory views illustrating the calculation method for the evaluation correction value corresponding to the number of pixels of the first embodiment.
  • FIGS. 5A and 5B are explanatory views illustrating the calculation method for the evaluation correction value corresponding to the actual distance of the first embodiment.
  • FIG. 6 is a flowchart illustrating the flow of the steps of the computer 3 of the first embodiment.
  • FIG. 7 is a flowchart illustrating the flow of the self-position estimation steps of the first embodiment.
  • FIGS. 8A and 8B are diagrams illustrating the self-position estimation effect of the first embodiment.
  • FIGS. 9A and 9B are explanatory views illustrating the calculation method of the evaluation correction value corresponding to the edge density of the second embodiment.
  • FIG. 1 is a block diagram of the moving body position estimation device of the first embodiment.
  • a vehicle 1 comprises a camera (image capturing means) 2 , a computer 3 , and a storage unit (storing means) 4 .
  • the camera 2 is attached at a front end portion of the vehicle 1 , at a height h, directed θ degrees downward from horizontal, and captures images of the region on the front side of the vehicle 1 .
  • the computer 3 carries out a matching step between the map data stored in the storage unit 4 and the images captured by the camera 2 , and estimates the position and orientation of the vehicle 1 .
  • the storage unit 4 stores a three dimensional map data comprising edge information and position information of structures existing in the surroundings of the vehicle 1 .
  • FIG. 2 is a control block diagram of the self-position estimation in the computer 3 of the first embodiment.
  • the lens of the camera 2 is a common lens with no distortion, but, for example, a fish-eye lens can be used to capture a wide range.
  • in that case, the self-position of the vehicle can be estimated within the same framework by setting the camera model used in the virtual image generating unit 14 described below to a fish-eye camera model. Additionally, in order to capture a wider range around the vehicle, multiple cameras may be mounted on the vehicle.
  • a feature extraction unit (edge image generating means) 11 extracts edges from the image captured by the camera 2 and generates an edge image.
  • a known edge detection method, such as the Canny method, can be used to extract the edges.
  • detecting the edges with a method such as the Sobel filter is also adequate; in short, it suffices to be able to observe edges in the image accurately enough for matching against a virtual image generated from the map data and the model of the camera 2 for a given position and orientation parameter.
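  • such an edge extraction step can be sketched as follows; this is a minimal Sobel-magnitude detector, and the threshold value is an illustrative assumption, not a value from the patent:

```python
import numpy as np

def extract_edge_image(gray: np.ndarray, thresh: float = 100.0) -> np.ndarray:
    """Generate a binary edge image from a grayscale frame via the Sobel
    gradient magnitude (the Canny method or any comparable detector also
    suffices).  `thresh` is an assumed tuning parameter."""
    g = gray.astype(np.float64)
    # 3x3 Sobel responses computed on the interior by array shifts
    gx = (g[:-2, 2:] + 2 * g[1:-1, 2:] + g[2:, 2:]
          - g[:-2, :-2] - 2 * g[1:-1, :-2] - g[2:, :-2])
    gy = (g[2:, :-2] + 2 * g[2:, 1:-1] + g[2:, 2:]
          - g[:-2, :-2] - 2 * g[:-2, 1:-1] - g[:-2, 2:])
    mag = np.hypot(gx, gy)
    edges = np.zeros_like(g, dtype=np.uint8)
    edges[1:-1, 1:-1] = (mag > thresh).astype(np.uint8)  # 1 = edge pixel
    return edges
```

  any detector that localizes edges well enough for the matching step can be substituted without changing the rest of the framework.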
  • An initialization processing unit 12 carries out an initialization step of a particle filter used to estimate the position and orientation of the vehicle.
  • an initialization of the (position and orientation) parameters of the particles within the expected range of the position and orientation of the vehicle is carried out.
  • the total number of degrees of freedom shall be six, which are the three degrees of freedom (x, y, z) representing the vehicle position and the three degrees of freedom (yaw, pitch, roll) representing the vehicle orientation (refer to FIG. 3 ).
  • a rough position range may be set using a GPS in order to set the range for initialization.
  • the number of particles needs to be set at this time, and may be set to an appropriate number according to the problem. Since the particle filter is a known method, details are omitted.
  • a position and orientation candidate generating unit 13 sets a parameter of the particles of the current moment from the position and orientation parameter of the particles of a single moment prior, using a vehicle system model set in advance (immediately after the initialization processing unit is executed, the parameter of the current moment is set from the initialized values).
  • the above framework is in the category of particle filters, so the details are omitted, but in the first embodiment, the system model shall be a random walk (The vehicle movement is randomly assumed within a predetermined range.).
  • the system model may also be a constant velocity linear motion model, or the like. For example, as illustrated in FIG. 3 , the particle P and the surrounding particles P 1 -P 5 of the position and orientation angle of the vehicle V (t 1 ) estimated one loop prior are moved by the amount of the odometry, and the existence distribution range of the particles is set and amended. Then, particles P 10 -P 15 are set to estimate the new position and orientation angle of the vehicle V (t 2 ).
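  • the dispersal step above can be sketched as follows; each particle carries the six degrees of freedom (x, y, z, yaw, pitch, roll), and the odometry vector and noise ranges below are illustrative assumptions:

```python
import numpy as np

def predict_particles(particles: np.ndarray,
                      odometry: np.ndarray,
                      noise_scale: np.ndarray,
                      rng: np.random.Generator) -> np.ndarray:
    """Move every particle by the odometry amount, then disperse it
    randomly within a predetermined range (the random-walk system model).
    particles: (N, 6) array of (x, y, z, yaw, pitch, roll)."""
    noise = rng.uniform(-noise_scale, noise_scale, size=particles.shape)
    return particles + odometry + noise
```

  a constant velocity linear motion model would replace the `odometry` term with a velocity-times-timestep prediction while keeping the same interface.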
  • a virtual image generating unit (virtual image generating means) 14 references the storage unit 4 , and generates a virtual image using the position and orientation parameters set to each particle at the previously described position and orientation candidate generating unit 13 and the camera model of the camera 2 .
  • since the camera model of the camera 2 is known (it may be measured in advance, or a design value suffices) and the three dimensional data of the storage unit 4 is known, the three dimensional map data can be converted to a two dimensional image (referred to as the virtual image).
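  • this conversion can be sketched with a pinhole camera model, an assumed stand-in for the calibrated model of the camera 2; R and t place the camera at one particle's assumed position and orientation:

```python
import numpy as np

def project_map_edges(points_world: np.ndarray,
                      R: np.ndarray, t: np.ndarray,
                      fx: float, fy: float, cx: float, cy: float,
                      width: int, height: int) -> np.ndarray:
    """Render 3D map edge points into a binary virtual image.
    points_world: (N, 3) edge points; R, t: world-to-camera transform;
    fx, fy, cx, cy: pinhole intrinsics (assumed calibration values)."""
    cam = (R @ points_world.T).T + t          # world frame -> camera frame
    img = np.zeros((height, width), dtype=np.uint8)
    cam = cam[cam[:, 2] > 1e-6]               # keep points ahead of camera
    u = np.round(fx * cam[:, 0] / cam[:, 2] + cx).astype(int)
    v = np.round(fy * cam[:, 1] / cam[:, 2] + cy).astype(int)
    ok = (0 <= u) & (u < width) & (0 <= v) & (v < height)
    img[v[ok], u[ok]] = 1                     # mark projected edge pixels
    return img
```

  only the map's edge information is projected; texture and color are not needed for the matching.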
  • in the first embodiment, matching is carried out, in an evaluation value calculation unit 15 that will be described below, between an edge of the virtual image generated by the virtual image generating unit 14 and an edge extracted by the feature extraction unit 11 from the image captured by the camera 2 . Therefore, in the virtual image generating unit 14 , projecting only the edge information of the three dimensional map corresponding to the component that is extracted as an edge from the captured image by the feature extraction unit 11 is sufficient. Specifically, projecting only the edge portions of buildings in the map, white lines on the road surface, and the like, is sufficient, and projecting information such as the texture or color of the buildings and the road surface is not required. As described above, the number of particles may be set according to the problem in the previously described position and orientation candidate generating unit 13 ; if 100 particles were generated, these steps would be repeated 100 times.
  • the evaluation value calculation unit (evaluation value calculation means) 15 carries out the matching between the edge image outputted from the feature extraction unit 11 and the edge component of the virtual image outputted from the virtual image generating unit 14 to evaluate the degree of overlapping between the two, in which a higher evaluation value (likelihood) e is calculated, as the degree of overlapping is increased.
  • the aim is to improve the accuracy and stability of the position and orientation estimation method that uses a particle filter; when the evaluation value is calculated, even if the edges do not overlap, if the distance between the closest edges (the edge-to-edge distance) is close (less than or equal to a predetermined distance), an evaluation correction value corresponding to the closeness of the two edges is added.
  • the calculating of the evaluation correction value is carried out by an evaluation correction value calculation unit 15 a in the evaluation value calculation unit 15 .
  • the evaluation correction value can be set in accordance with the number of pixels between the edge in the virtual image and the edge in the edge image. For example, with respect to the edge in the virtual image, the evaluation correction value ep(x i , y i ) of a pixel (x i , y i ) in the edge image that is shifted one or two pixels is set to 0.5, that of a pixel shifted three or four pixels is set to 0.25, and that of a pixel shifted five or more pixels is set to 0.0.
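  • this pixel-count correction can be sketched as follows; the step values 0.5 / 0.25 / 0.0 are taken from the text, while the brute-force nearest-edge search is an illustrative assumption (a distance transform would be used in practice):

```python
import numpy as np

def ep_correction(edge_img: np.ndarray, virt_edges: np.ndarray) -> float:
    """Sum the pixel-count correction ep over virtual-image edge pixels,
    measured against the nearest edge pixel in the edge image."""
    obs = np.argwhere(edge_img > 0).astype(float)
    if obs.size == 0:
        return 0.0
    total = 0.0
    for (vy, vx) in np.argwhere(virt_edges > 0):
        d = np.sqrt(((obs - (vy, vx)) ** 2).sum(axis=1)).min()
        shift = int(round(d))
        if shift == 0:
            continue                  # overlapping edges are scored separately
        total += 0.5 if shift <= 2 else (0.25 if shift <= 4 else 0.0)
    return total
```
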
  • the evaluation correction value may be calculated using the three dimensional distance in real space, and not simply the number of pixels.
  • the distance in real space corresponding to one pixel varies in accordance with the depth distance from the vehicle; namely, the vertical and horizontal resolutions per pixel differ between a three-dimensional object positioned in the front and one positioned in the rear.
  • the three-dimensional object enclosed with the dashed line is the same size on the front surface and the back surface in real space, but the size thereof projected on the virtual image changes in accordance with the depth.
  • an edge vicinity region is provided taking into account the real distance, and an evaluation correction value ed(x i , y i ) corresponding to the real distance is obtained.
  • the evaluation correction value ed(x i , y i ) corresponding to the real distance is obtained from a formula in which alpha is an adjustment parameter and d(x i , y i ) is the shortest distance from the edge when the coordinate (x i , y i ) is three dimensionally projected; where this distance exceeds the predetermined distance, ed(x i , y i ) may be set to zero.
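  • since the formula itself is not reproduced in this text, the sketch below assumes a plausible form consistent with the description (the reciprocal of the shortest distance d scaled by the adjustment parameter alpha, and zero beyond a predetermined distance); the exact formula in the patent may differ:

```python
def ed_correction(d: float, alpha: float = 1.0, d_max: float = 0.5) -> float:
    """Evaluation correction value from the real-space distance d (metres)
    between the closest edge pair.  Assumed form: 1 / (alpha * d), clipped
    to zero beyond the predetermined distance d_max."""
    if d <= 0.0 or d > d_max:
        return 0.0                 # overlap (d == 0) is scored separately
    return 1.0 / (alpha * d)
```

  the reciprocal form matches the later remark that a higher value is added as the edge-to-edge distance decreases.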
  • in the evaluation correction value calculation unit 15 a , the above steps are repeated as many times as the number of particles set in the position and orientation candidate generating unit 13 .
  • the evaluation value calculation unit 15 calculates the evaluation value for each pixel by adding the evaluation correction value to the score for overlapping edges.
  • when the evaluation correction value ep(x i , y i ) corresponding to the number of pixels is used, for the portions where the shift between the edge in the virtual image and the edge in the edge image is up to two pixels, half of the original points are added, and for the portions up to four pixels, one quarter of the original points are added.
  • when the evaluation correction value ed(x i , y i ) corresponding to the real distance is used, the reciprocal of the distance is added, so a higher value is added as the distance between the edge in the virtual image and the edge in the edge image decreases.
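  • combining the overlap score with a precomputed per-pixel correction map (ep or ed) might look like the following sketch; the one-point-per-overlapping-pixel scoring is an assumption, not the patent's exact formula:

```python
import numpy as np

def evaluation_value(edge_img: np.ndarray, virt_edges: np.ndarray,
                     correction: np.ndarray) -> float:
    """Particle evaluation value (likelihood before normalisation):
    one point per overlapping edge pixel, plus a per-pixel correction
    (ep or ed) for near-miss pixels.  `correction` is assumed to be zero
    at overlapping pixels and beyond the predetermined range."""
    overlap = float(np.logical_and(edge_img > 0, virt_edges > 0).sum())
    near_miss = float((correction * (virt_edges > 0)).sum())
    return overlap + near_miss
```
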
  • the steps should be repeated as many times as the number of particles set in the position and orientation candidate generating unit 13 .
  • the evaluation correction value is calculated in accordance with the number of pixels or the real distance between the edge in the virtual image and the edge in the edge image, with the edge in the virtual image as a reference, but the same results will be obtained even if the evaluation correction value is calculated in accordance with the number of pixels or the real distance between the edge in the edge image and the edge in the virtual image, with the edge in the edge image as a reference.
  • a position and orientation estimation unit (position estimation means) 16 estimates the position and orientation of a vehicle based on the evaluation value for each particle set in the evaluation value calculation unit 15 .
  • the step itself follows the framework of a particle filter; a larger weighting is set as the evaluation value increases.
  • a predicted position and orientation angle candidate with the highest likelihood is calculated as the actual position and orientation of the vehicle.
  • alternatively, the likelihood of each predicted position and orientation angle candidate may be used to obtain the weighted average of the predicted positions and orientation angles, and the obtained value set as the final position and orientation angle of the vehicle.
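  • the weighted-average variant can be sketched as follows; averaging the orientation angles directly is a simplifying assumption that ignores the ±180 degree wrap-around:

```python
import numpy as np

def estimate_pose(particles: np.ndarray, evals: np.ndarray) -> np.ndarray:
    """Final position and orientation as the likelihood-weighted average
    of the particle parameters.  particles: (N, 6); evals: (N,) positive
    evaluation values used as unnormalised likelihoods."""
    w = evals / evals.sum()                 # normalise to weights
    return (particles * w[:, None]).sum(axis=0)
```
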
  • in the position and orientation estimation unit 16 , the position and orientation are estimated within the framework of a particle filter, and an evaluation value at the estimated parameters (position and orientation) is calculated (as a step, the position and orientation candidate generating unit 13 and the evaluation value calculation unit 15 are executed; in the steps described so far, the above series of flow needs to be executed as many times as the number of particles, but here only the flow for the one parameter of the estimation result is executed).
  • the above evaluation value is outputted to an initialization determination unit 17 in a subsequent step, and is used for an initialization step when the estimation is wrong.
  • the initialization determination unit 17 determines whether or not the estimated position and orientation is an errant detection. If a determination of an errant detection is made, an initialization order is sent to the initialization processing unit 12 , and the steps are executed again. As a determination method, when the evaluation value based on the estimation result outputted from the position and orientation estimation unit 16 is lower than a threshold set in advance, a determination of an errant detection is made.
  • FIG. 6 is a flowchart illustrating the flow of the entire steps of the computer 3 of the first embodiment.
  • in Step S 1 , an initialization step of the particle filter is carried out in the initialization processing unit 12 .
  • the above step suffices to be executed once at the time of activating the system.
  • in Step S 5 , if a reinitialization is determined to be required in the initialization determination unit 17 , the particles are initialized again.
  • in Step S 2 , an image forward of the vehicle is captured by the camera 2 .
  • in Step S 3 , edges are extracted from the image acquired with the camera 2 in the feature extraction unit 11 .
  • in Step S 4 , a self-position estimation step is carried out. Details will be described below.
  • in Step S 5 , whether or not the estimated position and orientation is an errant detection, such that an initialization is required again, is determined in the initialization determination unit 17 . If YES, the steps proceed to Step S 1 , and if NO, the steps proceed to Step S 6 .
  • in Step S 6 , whether or not a system off has been inputted by the user is determined; if YES, the steps are ended, and if NO, the steps proceed to Step S 2 .
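  • the overall flow of FIG. 6 (Steps S 1 -S 6 ) can be sketched as a loop, with the processing units passed in as plain callables; the callable names are assumptions for illustration:

```python
def run_localization(initialize, capture, extract_edges,
                     estimate_pose_step, needs_reinit, system_off):
    """Main loop: initialize once, then capture, extract edges, estimate,
    reinitialize on errant detection, and stop when the system is off."""
    particles = initialize()                                   # Step S1
    while True:
        image = capture()                                      # Step S2
        edge_img = extract_edges(image)                        # Step S3
        particles, pose = estimate_pose_step(particles, edge_img)  # Step S4
        if needs_reinit(pose):                                 # Step S5
            particles = initialize()
        if system_off():                                       # Step S6
            return pose
```
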
  • FIG. 7 is a flowchart illustrating the flow of the self-position estimation steps of the first embodiment.
  • in Step S 41 , in the position and orientation candidate generating unit 13 , a parameter of the particles of the current moment (the six degrees of freedom of position and orientation) is set from the position and orientation parameter of the particles of a single moment prior, using a vehicle system model set in advance.
  • in Step S 42 , in the virtual image generating unit 14 , a virtual image is generated based on the position and orientation parameters set to the particle.
  • in Step S 43 , in the evaluation correction value calculation unit 15 a of the evaluation value calculation unit 15 , an evaluation correction value (ep or ed) corresponding to the shift amount (number of pixels or real distance) between the edge in the edge image and the edge in the virtual image is calculated.
  • in Step S 44 , in the evaluation value calculation unit 15 , the degree of overlapping between the edge image and the virtual image is evaluated. At this time, the evaluation value is calculated taking into account the evaluation correction value (ep or ed).
  • in Step S 45 , in the evaluation value calculation unit 15 , whether or not the evaluation value calculation is completed for all the particles generated at the initialization processing unit 12 is determined; if YES, the steps proceed to Step S 46 , and if NO, the steps proceed to Step S 41 .
  • in Step S 46 , in the position and orientation estimation unit 16 , weighting for the evaluation value of each particle is calculated, and the position and orientation of the current moment is estimated.
  • the likelihood (evaluation value) is increased as the number of pixels for which an edge-to-edge distance, which is the distance between an edge in the edge image and an edge in the virtual image, is less than or equal to a prescribed value is increased. Accordingly, as illustrated in FIG. 8B , an extreme decrease of likelihood can be suppressed for particles with a small shift amount of the position and orientation parameters with respect to the true position and orientation of the host vehicle. In other words, even if a small shift occurs between the position and orientation parameters of the particles and the true position and orientation of the host vehicle, a likelihood is given, and therefore a position and orientation close to the true position and orientation of the host vehicle can be estimated. Therefore, the position and orientation of the host vehicle can be stably estimated.
  • when the edge-to-edge distance is less than or equal to a predetermined distance, the likelihood is increased as the edge-to-edge distance decreases. Accordingly, since the likelihoods of the particles with a smaller shift amount of the position and orientation parameters with respect to the true position and orientation of the host vehicle are increased, the region close to the true position and orientation in the likelihood distribution of the parameter space can be densified. Therefore, a more adequate likelihood can be obtained, and the accuracy and stability of the position and orientation estimation can be improved.
  • the predetermined distance is the predetermined number of pixels. Accordingly, the range in which to increase the evaluation value can be set to a number of pixels that the designer sets in advance. Therefore, the range of the number of pixels from the edge in which the likelihood of the particle is evaluated higher can be set in accordance with the problem.
  • the predetermined distance is the three dimensional distance in real space. Accordingly, the range in which to increase the evaluation value can be set to distance in real space that the designer sets in advance. Therefore, the range of the distance in real space from the edge in which the likelihood of the particle is evaluated higher can be set in accordance with the problem.
  • the first embodiment exerts the effects listed below.
  • the device comprises: a camera 2 which captures forward of the host vehicle to acquire a captured image, a feature extraction unit 11 which extracts an edge from the captured image to generate an edge image, a storage unit 4 for storing a map data comprising edge information and position information of structures existing in the host vehicle surroundings, a virtual image generating unit 14 that sets multiple particles which are assumed positions and orientations of the host vehicle, and converts the edge information of the map data for each particle to a virtual image captured from the assumed position and orientation, an evaluation value calculation unit 15 which compares the edge image and virtual image for each particle, and assigns a higher evaluation value if there are more overlapping edges between the images, and, assigns a higher evaluation value if there are more edges that are not overlapping edges and for which an edge-to-edge distance, which is the distance between an edge in the edge image and an edge in the virtual image, is less than or equal to a predetermined value, and a position and orientation estimation unit 16 which estimates the position of the host vehicle based on the evaluation value for each particle. Therefore, the position and orientation of the host vehicle can be stably estimated.
  • the evaluation value calculation unit 15 , when the edge-to-edge distance is less than or equal to a predetermined distance, increases the evaluation value as the edge-to-edge distance decreases. Therefore, a more adequate likelihood can be obtained, and the accuracy and stability of the position and orientation estimation of the host vehicle can be improved.
  • the predetermined distance is the predetermined number of pixels. Therefore, the range of the number of pixels from the edge in which the likelihood of the particle is evaluated higher can be set in accordance with the problem.
  • the predetermined distance is the three dimensional distance in real space. Therefore, the range of the distance in real space from the edge in which the likelihood of the particle is evaluated higher can be set in accordance with the problem.
  • the method captures forward of the host vehicle to acquire a captured image, extracts an edge from the captured image to generate an edge image, sets multiple particles which are assumed positions and orientations of the host vehicle, converts the edge information of a map data comprising edge information and position information of structures existing in the host vehicle surroundings for each particle to a virtual image captured from the assumed position and orientation, compares the edge image and virtual image for each particle, assigns a higher evaluation value if there are more overlapping edges between the images, and, assigns a higher evaluation value if there are more edges that are not overlapping edges and for which an edge-to-edge distance, which is the distance between an edge in the edge image and an edge in the virtual image, is less than or equal to a predetermined value, and estimates the position of the host vehicle based on the evaluation value for each particle. Therefore, the position and orientation of the host vehicle can be stably estimated.
  • FIG. 9 is an explanatory view illustrating the calculation method of the evaluation correction value corresponding to the edge density.
  • the other configurations are the same as the first embodiment, thus, are given the same codes, and the descriptions thereof are omitted.
  • FIG. 9A is an edge image generated by extracting an edge from an image captured by the camera 2 , in the feature extraction unit (edge image generating means) 11 .
  • in the edge image shown in FIG. 9A , since a shoulder 9 b exists in the vicinity parallel to a lane 9 a, depending on the position and orientation of the particle, the edge of the lane in the virtual image and the edge of the shoulder 9 b in the edge image come into a close state (a state in which the edge density is high), and the evaluation value is erroneously partially increased.
  • therefore, in a region where the edge density is high, the predetermined distance used for the determination of the distance between the closest edges (the edge-to-edge distance) is decreased.
  • the evaluation correction value ep(x i , y i ) corresponding to the number of pixels of the pixel (x i , y i ) in the edge image that is shifted one or two pixels is set to 0.5
  • the evaluation correction value ep(x i , y i ) corresponding to the number of pixels of the pixel (x i , y i ) that is shifted three or four pixels is set to 0.25
  • the evaluation correction value ep(x i , y i ) corresponding to the number of pixels of the pixel (x i , y i ) that is shifted five or more pixels is set to 0.0.
  • the evaluation correction value ep(x i , y i ) corresponding to the number of pixels of the pixel (x i , y i ) in the edge image that is shifted one pixel is set to 0.5
  • the evaluation correction value ep(x i , y i ) corresponding to the number of pixels of the pixel (x i , y i ) that is shifted two pixels is set to 0.25
  • the evaluation correction value ep(x i , y i ) corresponding to the number of pixels of the pixel (x i , y i ) that is shifted 3 or more pixels is set to 0.0.
  • an edge vicinity region of two pixels on both sides of the edge of the lane 9 a which is a region where the edge density of the edge image is high, as illustrated in FIG. 9B , the edge vicinity region can be narrowed compared to other regions, and the increasing of the evaluation correction value due to the edge of the shoulder 9 b can be suppressed.
  • the second embodiment exerts the following effects, in addition to the effects 1-5 of the first embodiment.
  • the evaluation value calculation unit 15 decreases the predetermined distance, as the edge density of the edge image is increased. Therefore, the erroneous increasing of the evaluation correction value in regions where the edge density is high can be suppressed, and the accuracy and stability of the position and orientation estimation of the host vehicle can be improved.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Traffic Control Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An evaluation value calculation unit is configured to compare an edge image and virtual image for each particle, assigns a higher evaluation value if there are more overlapping edges between the images, and assigns a higher evaluation value if there are more edges that are not overlapping edges and for which an edge-to-edge distance, which is the distance between an edge in the edge image and an edge in the virtual image, is less than or equal to a prescribed value. A position and orientation estimation unit is configured to estimate the position of a vehicle based on the evaluation value for each particle.

Description

  • This application is a U.S. national stage application of International Application No. PCT/JP2014/073432, filed Sep. 5, 2014, which claims priority to Japanese Patent Application No. 2013-234516, filed in the Japan Patent Office on Nov. 13, 2013, the contents of each of which are hereby incorporated herein by reference.
  • BACKGROUND
  • 1. Field of the Invention
  • The present invention relates to a moving body position estimation device and a moving body position estimation method.
  • 2. Background Information
  • In Japanese Laid-Open Patent Application No. 2010-60451, matching of the edges between an edge image generated from an image captured by a camera and a virtual image generated from a known three dimensional map is carried out for each particle using a particle filter, and the position of a moving body is stochastically estimated from a likelihood distribution in which the likelihood increases as the amount of overlapping edges increases and decreases as the amount of overlapping edges decreases.
  • SUMMARY
  • However, in the prior art described above, since the likelihood is calculated based only on the degree of overlap of the edges, even if the parameters of a particle are close to the true position of the moving body, a slight shift of the particle's position parameter from the true position causes the likelihood of the particle to drop sharply, and the estimation becomes unstable.
  • The object of the present invention is to provide a moving body position estimation device and a moving body position estimation method that can stably estimate the position of a moving body.
  • In the present invention, an edge image and a virtual image for each particle are compared, a higher evaluation value is assigned if there are more overlapping edges between the images, and a higher evaluation value is assigned if there are more edges that are not overlapping edges and for which an edge-to-edge distance, which is the distance between an edge in the edge image and an edge in the virtual image, is less than or equal to a predetermined value.
  • Therefore, the position of a moving body can be stably estimated.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Referring now to the attached drawings which form a part of this original disclosure.
  • FIG. 1 is a block diagram of a moving body position estimation device in accordance with a first embodiment.
  • FIG. 2 is a control block diagram of the self-position estimation in the computer 3 of the first embodiment.
  • FIG. 3 is an explanatory view illustrating the dispersal method of particles by the moving body position estimation device of the first embodiment.
  • FIGS. 4A and 4B are explanatory views illustrating the calculation method for the evaluation correction value corresponding to the number of pixels of the first embodiment.
  • FIGS. 5A and 5B are explanatory views illustrating the calculation method for the evaluation correction value corresponding to the actual distance of the first embodiment.
  • FIG. 6 is a flowchart illustrating the flow of the steps of the computer 3 of the first embodiment.
  • FIG. 7 is a flowchart illustrating the flow of the self-position estimation steps of the first embodiment.
  • FIGS. 8A and 8B are diagrams illustrating the self-position estimation effect of the first embodiment.
  • FIGS. 9A and 9B are explanatory views illustrating the calculation method of the evaluation correction value corresponding to the edge density of the second embodiment.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • Selected embodiments will now be explained with reference to the drawings.
  • First Embodiment
  • FIG. 1 is a block diagram of the moving body position estimation device of the first embodiment. A vehicle 1 comprises a camera (image capturing means) 2, a computer 3, and a storage unit (storing means) 4. The camera 2 is attached at a front end portion of the vehicle 1, at a height h and θ degrees downward from horizontal, and captures images of the region on the front side of the vehicle 1. The computer 3 carries out a matching step between the map data stored in the storage unit 4 and the images captured by the camera 2, and estimates the position and orientation of the vehicle 1. The storage unit 4 stores three dimensional map data comprising edge information and position information of structures existing in the surroundings of the vehicle 1.
  • FIG. 2 is a control block diagram of the self-position estimation in the computer 3 of the first embodiment. The lens of the camera 2 is a common lens with no distortion, but can be, for example, a fish-eye lens to capture a wide range. In the latter case, the self-position of the vehicle can be estimated by the same framework by setting the camera model used in the virtual image generating unit 14, described below, to be a fish-eye camera. Additionally, besides mounting a fish-eye lens in order to capture a wider range around the vehicle, multiple cameras may be mounted to the vehicle.
  • A feature extraction unit (edge image generating means) 11 extracts edges from the image captured by the camera 2 and generates an edge image. A known edge detection method, such as the Canny method, can be used for the extraction. Of course, depending on the problem, detecting the edges with a method such as the Sobel filter is also adequate; in short, it is sufficient to be able to observe edges in the image accurately enough for matching against a virtual image generated from the map data and the model of the camera 2 for a given position and orientation parameter.
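As a minimal sketch of this edge extraction step (using the simpler Sobel filter the text mentions rather than the Canny method; the function name, image representation as lists of rows, and the threshold value are illustrative assumptions, not from the patent):

```python
def sobel_edges(gray, thresh=100.0):
    """Generate a binary edge image from a grayscale image (list of rows)
    using a 3x3 Sobel filter; border pixels are left as non-edges."""
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    h, w = len(gray), len(gray[0])
    edges = [[0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            gx = sum(kx[u][v] * gray[i - 1 + u][j - 1 + v]
                     for u in range(3) for v in range(3))
            gy = sum(ky[u][v] * gray[i - 1 + u][j - 1 + v]
                     for u in range(3) for v in range(3))
            # mark the pixel as an edge when the gradient magnitude is large
            if (gx * gx + gy * gy) ** 0.5 >= thresh:
                edges[i][j] = 1
    return edges
```

A vertical intensity step in the input produces a column of edge pixels, which is the kind of edge (lane lines, building outlines) the matching step relies on.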
  • An initialization processing unit 12 carries out the initialization step of the particle filter used to estimate the position and orientation of the vehicle. In the first embodiment, the (position and orientation) parameters of the particles are initialized within the expected range of the position and orientation of the vehicle. Here, since the problem is to estimate the position and orientation of a vehicle within a limited section, the total number of degrees of freedom is six: the three degrees of freedom (x, y, z) representing the vehicle position and the three degrees of freedom (yaw, pitch, roll) representing the vehicle orientation (refer to FIG. 3). In the case of the first embodiment, in which the self-position is estimated using map data, a rough position range may be set using a GPS in order to set the range for initialization. Additionally, the number of particles needs to be set at this time, and may be set to an appropriate number according to the problem. Since the particle filter is a known method, details are omitted.
  • A position and orientation candidate generating unit 13 sets the parameters of the particles of the current moment from the position and orientation parameters of the particles of a single moment prior, using a vehicle system model set in advance (immediately after the initialization step, the parameters of the current moment are set from the initialized values). The above framework is in the category of particle filters, so the details are omitted, but in the first embodiment the system model is a random walk (the vehicle movement is randomly assumed within a predetermined range). Of course, according to the problem, the system model may be a constant-velocity linear motion model, or the like. For example, as illustrated in FIG. 3, the particle P and the surrounding particles P1-P5 of the position and orientation angle of the vehicle V (t1) estimated one loop prior are moved by the amount of the odometry, and the existence distribution range of the particles is set and amended. Then, particles P10-P15 are set to estimate a new position and orientation angle of the vehicle V (t2).
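The prediction step above (shift each particle by the odometry, then scatter it with random-walk noise) can be sketched as follows; the function name, the list-of-lists particle representation, and the Gaussian form of the noise are assumptions for illustration:

```python
import random

random.seed(0)  # deterministic scatter for this sketch

def predict_particles(particles, odometry, noise_std):
    """Move each 6-DoF particle (x, y, z, yaw, pitch, roll) by the odometry
    increment, then add random-walk noise per the system model."""
    predicted = []
    for p in particles:
        predicted.append([v + dv + random.gauss(0.0, s)
                          for v, dv, s in zip(p, odometry, noise_std)])
    return predicted
```

With the noise standard deviations set to zero, every particle is shifted exactly by the odometry; nonzero values spread the particles over the assumed existence distribution range.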
  • A virtual image generating unit (virtual image generating means) 14 references the storage unit 4, and generates a virtual image using the position and orientation parameters set for each particle by the previously described position and orientation candidate generating unit 13 and the camera model of the camera 2. In general, if the position and orientation parameters of the vehicle are given, the camera model of the camera 2 is known (measuring in advance suffices; alternatively, a design value suffices), and the three dimensional data of the storage unit 4 is known, the three dimensional map data can be converted to a two dimensional image (referred to as a virtual image).
  • In the first embodiment, matching is carried out, in an evaluation value calculation unit 15 described below, between the edges of the virtual image generated by the virtual image generating unit 14 and the edges extracted by the feature extraction unit 11 from the image captured by the camera 2. Therefore, in the virtual image generating unit 14, projecting only the edge information of the three dimensional map corresponding to the components that are extracted as edges from the captured image by the feature extraction unit 11 is sufficient. Specifically, projecting only the edge portions of buildings in the map, white lines on the road surface, and the like, is sufficient; projecting information such as the texture or color of the buildings and the road surface is not required. As has been described above, the number of particles may be set according to the problem in the previously described position and orientation candidate generating unit 13; if 100 particles were generated, this step would be repeated 100 times.
  • The evaluation value calculation unit (evaluation value calculation means) 15 carries out the matching between the edge image outputted from the feature extraction unit 11 and the edge component of the virtual image outputted from the virtual image generating unit 14 to evaluate the degree of overlap between the two, and calculates a higher evaluation value (likelihood) e as the degree of overlap increases. In the conventional evaluation value calculation method, the two images are scanned, and if both pixels of interest (xi, yi) have an edge, an evaluation value is added (Eval(xi, yi)=1); if not, the value is not added (Eval(xi, yi)=0), so that the evaluation value increases as the overlap increases. In contrast, in the first embodiment, the aim is to improve the accuracy and stability of the position and orientation estimation method that uses a particle filter: when the evaluation value is calculated, even if the edges do not overlap, if the distance between the closest edges (edge-to-edge distance) is close (less than or equal to a predetermined distance), an evaluation correction value corresponding to the closeness of the two edges is added. The calculation of the evaluation correction value is carried out by an evaluation correction value calculation unit 15 a in the evaluation value calculation unit 15.
  • The evaluation correction value can be set in accordance with the number of pixels between the edge in the virtual image and the edge in the edge image. For example, with respect to the edge in the virtual image, the evaluation correction value ep(xi, yi) of a pixel (xi, yi) in the edge image that is shifted one or two pixels is set to 0.5, the evaluation correction value ep(xi, yi) of a pixel (xi, yi) that is shifted three or four pixels is set to 0.25, and the evaluation correction value ep(xi, yi) of a pixel (xi, yi) that is shifted five or more pixels is set to 0.0. Namely, in a virtual image such as that of FIG. 4A, by providing an edge vicinity region of four pixels on both sides of the edge, as illustrated in FIG. 4B, the edge width in the virtual image is virtually thickened, and when the edge in the edge image overlaps with the edge vicinity region, an evaluation correction value ep(xi, yi) corresponding to the edge-to-edge distance is given.
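The stepwise pixel-count correction described above can be written as a small lookup; the function name is illustrative, and the step widths (1-2 px → 0.5, 3-4 px → 0.25) are the ones given in the text:

```python
def ep(pixel_shift):
    """Evaluation correction value from the pixel distance between an edge
    pixel in the edge image and the nearest edge in the virtual image."""
    if pixel_shift in (1, 2):
        return 0.5
    if pixel_shift in (3, 4):
        return 0.25
    # overlapping pixels (shift 0) score via Eval instead;
    # five or more pixels away contributes nothing
    return 0.0
```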
  • As another example, the evaluation correction value may be calculated using the three dimensional distance in real space, and not simply the number of pixels. When the predetermined distance is defined based on the number of pixels as described above, the distance in real space corresponding to one pixel varies in accordance with the depth from the vehicle; namely, the vertical and horizontal resolutions per pixel differ between a three-dimensional object positioned in the front and a three-dimensional object positioned in the rear. For example, in the virtual image illustrated in FIG. 5A, the three-dimensional object enclosed with the dashed line is the same size on the front surface and the back surface in real space, but its size projected on the virtual image changes in accordance with the depth. Therefore, to reflect real space, in the same manner as the method in the virtual image generating unit 14, using map data whose three dimensional information is known, as illustrated in FIG. 5B, an edge vicinity region is provided taking into account the real distance, and an evaluation correction value ed(xi, yi) corresponding to the real distance is obtained. The evaluation correction value ed(xi, yi) corresponding to the real distance is obtained from the following formula.

  • ed(xi, yi) = 1/(1 + alpha × d(xi, yi))
  • Here, alpha is an adjustment parameter, and d(xi, yi) is the shortest distance to an edge when the coordinate (xi, yi) is three dimensionally projected. When this distance is equal to or greater than a predetermined value (the edge is far), ed(xi, yi) may be set to zero. In the evaluation correction value calculation unit 15 a, the above steps are repeated as many times as the number of particles set in the position and orientation candidate generating unit 13.
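The real-distance correction formula, with the far-edge cutoff, is one line of code; the default alpha and cutoff values here are tuning assumptions, not figures from the patent:

```python
def ed(d, alpha=1.0, cutoff=3.0):
    """Evaluation correction value from the real-space shortest distance d
    to an edge: ed = 1/(1 + alpha*d), set to zero beyond a cutoff distance."""
    if d >= cutoff:
        return 0.0  # the edge is far: no correction
    return 1.0 / (1.0 + alpha * d)
```

Because the reciprocal of the distance enters the formula, the correction decays smoothly with distance rather than in pixel-count steps.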
  • The evaluation value calculation unit 15 calculates the evaluation value for each pixel based on the following formula.
    • (i) When an evaluation correction value ep(xi, yi) corresponding to the number of pixels is set

  • e = Σ(Eval(xi, yi) + ep(xi, yi))

  • Eval(xi, yi) = 1 (edges overlap)

  • Eval(xi, yi) = 0 (edges do not overlap (otherwise))

    • (ii) When an evaluation correction value ed(xi, yi) corresponding to the real distance is set

  • e = Σ(Eval(xi, yi) + ed(xi, yi))

  • Eval(xi, yi) = 1 (edges overlap)

  • Eval(xi, yi) = 0 (edges do not overlap (otherwise))
  • Accordingly, when an evaluation correction value ep(xi, yi) corresponding to the number of pixels is set, for the portions where the number of pixels between the edge in the virtual image and the edge in the edge image is up to two pixels, one half of the original points is added, and for the portions up to four pixels, one quarter of the original points is added. On the other hand, when an evaluation correction value ed(xi, yi) corresponding to the real distance is set, since the reciprocal of the distance is added, a higher value is added as the distance between the edge in the virtual image and the edge in the edge image decreases. In the evaluation value calculation unit 15 as well, the steps are repeated as many times as the number of particles set in the position and orientation candidate generating unit 13. In the above example, the evaluation correction value is calculated in accordance with the number of pixels or the real distance from the edge in the virtual image to the edge in the edge image, with the edge in the virtual image as a reference, but the same results are obtained even if the evaluation correction value is calculated with the edge in the edge image as a reference.
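Putting the pieces together, the per-particle evaluation value e = Σ(Eval + ep) for the pixel-count variant can be sketched as below. This is an illustrative brute-force version: the images are 0/1 lists of rows, and the chessboard pixel metric for the nearest-edge search is an assumption (the patent does not fix a metric):

```python
def evaluate(edge_img, virt_img):
    """Evaluation value e = sum(Eval + ep) over all edge pixels of the edge
    image: Eval is 1 where the edges overlap, and ep rewards edge pixels
    lying a few pixels from the nearest edge in the virtual image."""
    virt_pts = [(i, j) for i, row in enumerate(virt_img)
                for j, v in enumerate(row) if v]

    def nearest(i, j):
        # chessboard distance (in pixels) to the closest virtual-image edge pixel
        return min((max(abs(i - a), abs(j - b)) for a, b in virt_pts),
                   default=float("inf"))

    e = 0.0
    for i, row in enumerate(edge_img):
        for j, v in enumerate(row):
            if not v:
                continue
            d = nearest(i, j)
            if d == 0:
                e += 1.0    # Eval = 1: the edges overlap
            elif d <= 2:
                e += 0.5    # shifted one or two pixels
            elif d <= 4:
                e += 0.25   # shifted three or four pixels
    return e
```

A particle whose virtual image coincides with the edge image scores one point per edge pixel; a particle shifted by one pixel still scores half points instead of dropping to zero, which is exactly the stabilizing effect the embodiment aims for.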
  • A position and orientation estimation unit (position estimation means) 16 estimates the position and orientation of the vehicle based on the evaluation value for each particle set in the evaluation value calculation unit 15. The step itself is the framework of a particle filter, and a larger weighting is set as the evaluation value increases. For example, the predicted position and orientation angle candidate with the highest likelihood may be calculated as the actual position and orientation of the vehicle. Alternatively, the likelihood of each predicted position and orientation angle candidate may be used to obtain the weighted average of the predicted positions and orientation angles, and the obtained value set as the final position and orientation angle of the vehicle. In the position and orientation estimation unit 16, the position and orientation are estimated in the framework of a particle filter, and an evaluation value for the estimated parameters (position and orientation) is calculated (as a step, the position and orientation candidate generating unit 13 and the evaluation value calculation unit 15 are executed; in the steps described so far this flow is executed as many times as the number of particles, but here it is executed only for the one parameter set of the estimation result). This evaluation value is outputted to an initialization determination unit 17 in a subsequent step, and is used for the initialization step when the estimation is wrong.
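The weighted-average variant of the estimate described above can be sketched in a few lines; the function name and the list-of-lists particle representation are illustrative assumptions:

```python
def estimate_pose(particles, evals):
    """Weighted average of the particles' 6-DoF parameters
    (x, y, z, yaw, pitch, roll), using each particle's evaluation
    value (likelihood) as its weight."""
    total = float(sum(evals))
    return [sum(w * p[k] for w, p in zip(evals, particles)) / total
            for k in range(len(particles[0]))]
```

A particle with a higher evaluation value pulls the final estimate toward itself; with equal evaluation values the result is the plain mean of the particles. (Averaging angles this way is only valid away from the ±180° wrap-around.)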
  • The initialization determination unit 17 determines whether or not the estimated position and orientation is an erroneous detection. If a determination of an erroneous detection is made, an initialization order is sent to the initialization processing unit 12, and the steps are executed again. As a determination method, when the evaluation value based on the estimation result outputted from the position and orientation estimation unit 16 is lower than a threshold set in advance, a determination of an erroneous detection is made.
  • Overall Steps
  • FIG. 6 is a flowchart illustrating the flow of the entire steps of the computer 3 of the first embodiment. In Step S1, the initialization step of the particle filter is carried out in the initialization processing unit 12. This step needs to be executed only once, at the time of activating the system. However, if a reinitialization is determined to be required in the initialization determination unit 17 in Step S5, the particles are initialized again. In Step S2, an image forward of the vehicle is captured by the camera 2. In Step S3, edges are extracted from the image acquired with the camera 2 in the feature extraction unit 11.
  • In Step S4, a self-position estimation step is carried out. Details will be described below. In Step S5, whether or not the estimated position and orientation is an erroneous detection, such that an initialization is required again, is determined in the initialization determination unit 17. If YES, the steps proceed to Step S1, and if NO, the steps proceed to Step S6. In Step S6, whether or not the user has turned the system off is determined; if YES, the steps are ended, and if NO, the steps proceed to Step S2.
  • Self-Position Estimation Steps
  • FIG. 7 is a flowchart illustrating the flow of the self-position estimation steps of the first embodiment. In Step S41, in the position and orientation candidate generating unit 13, a parameter of the particles of the current moment (the six degrees of freedom of position and orientation) is set from the position and orientation parameter of the particles of a single moment prior, using a vehicle system model set in advance. In Step S42, in the virtual image generating unit 14, a virtual image is generated based on the position and orientation parameters set to the particle. In Step S43, in the evaluation correction value calculation unit 15 a of the evaluation value calculation unit 15, an evaluation correction value (ep or ed) corresponding to the shift amount (number of pixels or real distance) between the edge in the edge image and the edge in the virtual image is calculated.
  • In Step S44, in the evaluation value calculation unit 15, the degree of overlap between the edge image and the virtual image is evaluated. At this time, the evaluation value is calculated taking into account the evaluation correction value (ep or ed). In Step S45, in the evaluation value calculation unit 15, whether or not the evaluation value calculation is completed for all the particles generated at the initialization processing unit 12 is determined; if YES, the steps proceed to Step S46, and if NO, the steps proceed to Step S41. In Step S46, in the position and orientation estimation unit 16, a weighting for the evaluation value of each particle is calculated, and the position and orientation of the current moment is estimated.
  • Next, the effects will be described. In the conventional moving body position estimation device, even if the parameters of a particle are close to the true position of the host vehicle, a slight shift of the position parameters from the true position and orientation sharply reduces the likelihood of the particle, causing the estimation of the position and orientation of the host vehicle to become unstable, as illustrated in FIG. 8A.
  • In contrast, in the first embodiment, the likelihood (evaluation value) is increased as the number of pixels for which an edge-to-edge distance, which is the distance between an edge in the edge image and an edge in the virtual image, is less than or equal to a prescribed value is increased. Accordingly, as illustrated in FIG. 8B, an extreme decrease of the likelihood can be suppressed for particles with a small shift amount of the position and orientation parameters with respect to the true position and orientation of the host vehicle. In other words, even if a small shift occurs between the position and orientation parameters of the particles and the true position and orientation of the host vehicle, a likelihood is given, and therefore a position and orientation close to the true position and orientation of the host vehicle can be estimated. Therefore, the position and orientation of the host vehicle can be stably estimated.
  • In the first embodiment, when the edge-to-edge distance is less than or equal to a predetermined distance, the likelihood is increased as the edge-to-edge distance decreases. Accordingly, since the likelihood of particles with a smaller shift amount of the position and orientation parameters with respect to the true position and orientation of the host vehicle is increased, the region close to the true position and orientation in the likelihood distribution of the parameter space can be densified. Therefore, a more adequate likelihood can be obtained, and the accuracy and stability of the position and orientation estimation can be improved.
  • In the first embodiment, the predetermined distance is the predetermined number of pixels. Accordingly, the range in which to increase the evaluation value can be set to a number of pixels that the designer sets in advance. Therefore, the range of the number of pixels from the edge in which the likelihood of the particle is evaluated higher can be set in accordance with the problem.
  • In the first embodiment, the predetermined distance is a three dimensional distance in real space. Accordingly, the range in which to increase the evaluation value can be set to a distance in real space that the designer sets in advance. Therefore, the range of the distance in real space from the edge in which the likelihood of the particle is evaluated higher can be set in accordance with the problem.
  • The first embodiment exerts the effects listed below.
  • (1) The device comprises: a camera 2 which captures forward of the host vehicle to acquire a captured image, a feature extraction unit 11 which extracts an edge from the captured image to generate an edge image, a storage unit 4 for storing a map data comprising edge information and position information of structures existing in the host vehicle surroundings, a virtual image generating unit 14 that sets multiple particles which are assumed positions and orientations of the host vehicle, and converts the edge information of the map data for each particle to a virtual image captured from the assumed position and orientation, an evaluation value calculation unit 15 which compares the edge image and virtual image for each particle, and assigns a higher evaluation value if there are more overlapping edges between the images, and, assigns a higher evaluation value if there are more edges that are not overlapping edges and for which an edge-to-edge distance, which is the distance between an edge in the edge image and an edge in the virtual image, is less than or equal to a predetermined value, and a position and orientation estimation unit 16 which estimates the position of the host vehicle based on the evaluation value for each particle. Therefore, the position and orientation of the host vehicle can be stably estimated.
  • (2) The evaluation value calculation unit 15, when the edge-to-edge distance is less than or equal to a predetermined distance, increases the evaluation value as the edge-to-edge distance decreases. Therefore, a more adequate likelihood can be obtained, and the accuracy and stability of the position and orientation estimation of the host vehicle can be improved.
  • (3) The predetermined distance is the predetermined number of pixels. Therefore, the range of the number of pixels from the edge in which the likelihood of the particle is evaluated higher can be set in accordance with the problem.
  • (4) The predetermined distance is the three dimensional distance in real space. Therefore, the range of the distance in real space from the edge in which the likelihood of the particle is evaluated higher can be set in accordance with the problem.
  • (5) The device captures forward of the host vehicle to acquire a captured image, extracts an edge from the captured image to generate an edge image, sets multiple particles which are assumed positions and orientations of the host vehicle, converts the edge information of a map data comprising edge information and position information of structures existing in the host vehicle surroundings for each particle to a virtual image captured from the assumed position and orientation, compares the edge image and virtual image for each particle, assigns a higher evaluation value if there are more overlapping edges between the images, and, assigns a higher evaluation value if there are more edges that are not overlapping edges and for which an edge-to-edge distance, which is the distance between an edge in the edge image and an edge in the virtual image, is less than or equal to a predetermined value, and estimates the position of the host vehicle based on the evaluation value for each particle. Therefore, the position and orientation of the host vehicle can be stably estimated.
  • Second Embodiment
  • Next, the moving body position estimation device according to the second embodiment will be described with reference to the drawings. FIGS. 9A and 9B are explanatory views illustrating the calculation method of the evaluation correction value corresponding to the edge density. The other configurations are the same as in the first embodiment and are thus given the same reference symbols, and the descriptions thereof are omitted.
  • FIG. 9A is an edge image generated by extracting edges from an image captured by the camera 2 in the feature extraction unit (edge image generating means) 11. In the edge image shown in FIG. 9A, since a shoulder 9 b exists in the vicinity of and parallel to a lane 9 a, depending on the position and orientation of the particle, the edge of the lane in the virtual image and the edge of the shoulder 9 b in the edge image come close to each other (a state in which the edge density is high), and the evaluation value is partially increased. Thus, in the second embodiment, when determining whether the distance between the closest edges (edge-to-edge distance) is close (less than or equal to a predetermined distance), the predetermined distance used for the determination is decreased in regions where the edge density of the edge image is high.
  • For example, in a region where the edge density of the edge image is low, as in the first embodiment, the evaluation correction value ep(xi, yi) is set to 0.5 for a pixel (xi, yi) in the edge image that is shifted by one or two pixels from an edge in the virtual image, to 0.25 for a pixel shifted by three or four pixels, and to 0.0 for a pixel shifted by five or more pixels. On the other hand, in a region where the edge density of the edge image is high, the evaluation correction value ep(xi, yi) is set to 0.5 for a pixel (xi, yi) shifted by one pixel, to 0.25 for a pixel shifted by two pixels, and to 0.0 for a pixel shifted by three or more pixels. Namely, in an edge image such as that of FIG. 9A, by providing an edge vicinity region of two pixels on both sides of the edge of the lane 9 a, which lies in a region where the edge density of the edge image is high, as illustrated in FIG. 9B, the edge vicinity region is narrowed compared to other regions, and the inflation of the evaluation correction value due to the edge of the shoulder 9 b can be suppressed.
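The correction values above reduce to a small lookup that switches between two tables. The sketch below is a hedged illustration: the two tables come from the text, but the local-window density test (`is_high_density`, its window size, and its threshold) is an assumption, since the embodiment does not specify how a high-density region is detected:

```python
import numpy as np

# Correction tables from the text: the edge vicinity region is narrowed
# from four pixels to two where the edge density is high.
LOW_DENSITY_TABLE = {1: 0.5, 2: 0.5, 3: 0.25, 4: 0.25}
HIGH_DENSITY_TABLE = {1: 0.5, 2: 0.25}

def correction_value(shift, high_density):
    """Evaluation correction value ep for a non-overlapping edge pixel,
    given its pixel shift from the nearest edge in the virtual image."""
    table = HIGH_DENSITY_TABLE if high_density else LOW_DENSITY_TABLE
    return table.get(shift, 0.0)

def is_high_density(edge_image, y, x, half=4, thresh=8):
    """Assumed density test: count edge pixels in a local window around
    (y, x) and compare against a threshold (both values hypothetical)."""
    window = edge_image[max(0, y - half):y + half + 1,
                        max(0, x - half):x + half + 1]
    return int(np.count_nonzero(window)) >= thresh
```

With this narrowing, an edge of the shoulder 9 b two pixels away from a lane edge contributes 0.25 instead of 0.5, and anything three or more pixels away contributes nothing.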
  • The second embodiment provides the following effect, in addition to effects 1-5 of the first embodiment.
  • (6) The evaluation value calculation unit 15 decreases the predetermined distance as the edge density of the edge image increases. Therefore, an erroneous increase of the evaluation correction value in regions where the edge density is high can be suppressed, and the accuracy and stability of the position and orientation estimation of the host vehicle can be improved.
  • Other Embodiments
  • Preferred embodiments of the present invention were described above, but specific configurations of the present invention are not limited to those embodiments, and design changes made without departing from the scope of the invention are also included in the present invention. For example, in the embodiments, the evaluation function is set so that the evaluation value increases as the number of edges overlapping between the edge image and the virtual image increases (a maximization problem), but the evaluation function may instead be set so that an error decreases as the number of overlapping edges increases (a minimization problem).
  • The present invention is also applicable to estimating the relative position (lateral position) of a vehicle with respect to the left and right white lines of the traveling path. In this case, the position and orientation parameters are one degree of freedom for the position parameter (the lateral position) and two degrees of freedom for the orientation parameters (pitch and yaw), for a total of three degrees of freedom. Meanwhile, since the width of the white lines varies from place to place, the white-line width may be added as an estimation parameter.
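The reduced parameterization described here can be illustrated with a small state structure; the class name, field names, and default line width are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class LaneRelativeState:
    """Three-degree-of-freedom state for lane-relative estimation:
    one position parameter (lateral offset) and two orientation
    parameters (pitch and yaw).  The white-line width is included as
    an optional fourth estimation parameter, since it varies from
    place to place."""
    lateral: float            # lateral position within the lane [m]
    pitch: float              # orientation [rad]
    yaw: float                # orientation [rad]
    line_width: float = 0.15  # assumed nominal white-line width [m]
```

Each particle then perturbs only these few parameters rather than a full six-degree-of-freedom pose, which shrinks the search space considerably.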

Claims (7)

1. A moving body position estimation device comprising:
a camera configured to capture surroundings of a moving body to acquire a captured image;
a storage configured to store map data comprising edge information and position information of structures existing in the surroundings of the moving body;
a computer programmed to include
an edge image generating unit which extracts an edge from the captured image to generate an edge image;
a virtual image generating unit that sets multiple particles which are assumed positions and orientations of the moving body and converts the edge information of the map data for each of the particles to a virtual image captured from the assumed position and orientation;
an evaluation value calculation unit which compares the edge image and the virtual image for each of the particles, and assigns a higher evaluation value upon determining there are more overlapping edges between the edge and virtual images, and assigns a higher evaluation value upon determining there are more edges for which an edge-to-edge distance of when an edge in the edge image and an edge in the virtual image are three-dimensionally projected onto real space is less than or equal to a predetermined distance on an image that reflects a three-dimensional distance in real space with respect to edges that are not overlapping;
a position and orientation estimation unit which estimates a position of the moving body based on the evaluation value for each of the particles.
2. A moving body position estimation device comprising:
a camera configured to capture surroundings of a moving body to acquire a captured image;
a storage configured to store map data comprising edge information and position information of structures existing in the surroundings of the moving body;
a computer programmed to include
an edge image generating unit which extracts an edge from the captured image to generate an edge image;
a virtual image generating unit that sets multiple particles which are assumed positions and orientations of the moving body and converts the edge information of the map data for each of the particles to a virtual image captured from the assumed positions and orientations;
an evaluation value calculation unit which compares the edge image and the virtual image for each of the particles, and assigns a higher evaluation value upon determining there are more overlapping edges between the images, and assigns a higher evaluation value upon determining there are more edges that are not overlapping edges and for which an edge-to-edge distance is less than or equal to a predetermined distance, where the edge-to-edge distance is a distance between an edge in the edge image and an edge in the virtual image;
a position and orientation estimation unit which estimates a position of the moving body based on the evaluation value for each of the particles,
the evaluation value calculation unit reducing the predetermined distance as an edge density of the edge image increases.
3. The moving body position estimation device according to claim 2, wherein
the evaluation value calculation unit increases the evaluation value as the edge-to-edge distance decreases, when the edge-to-edge distance is less than or equal to the predetermined distance.
4. The moving body position estimation device according to claim 2, wherein
the predetermined distance is a predetermined number of pixels.
5. The moving body position estimation device according to claim 3, wherein
the predetermined distance is a predetermined number of pixels.
6. A moving body position estimation method comprising:
capturing surroundings of a moving body to acquire a captured image;
extracting an edge from the captured image to generate an edge image;
setting multiple particles, which are assumed positions and orientations of the moving body, and converting edge information of map data for each of the particles to a virtual image captured from the assumed position and orientation;
comparing the edge image and the virtual image for each of the particles, and assigning a higher evaluation value upon determining there are more overlapping edges between the images, and assigning a higher evaluation value upon determining there are more edges that are not overlapping edges and for which an edge-to-edge distance is less than or equal to a predetermined distance, the distance being between an edge in the edge image and an edge in the virtual image;
the position of the moving body being estimated based on the evaluation value for each particle; and
reducing the predetermined distance as an edge density of the edge image is increased.
7. A moving body position estimation method comprising:
capturing surroundings of a moving body to acquire a captured image;
extracting an edge from the captured image to generate an edge image;
storing a map data comprising edge information and position information of structures existing in the surroundings of the moving body;
setting multiple particles which are assumed positions and orientations of the moving body and converting edge information of map data for each of the particles to a virtual image captured from the assumed position and orientation;
comparing the edge image and the virtual image for each particle, and assigning a higher evaluation value upon determining there are more overlapping edges between the images, and assigning a higher evaluation value upon determining there are more edges for which an edge-to-edge distance of when an edge in the edge image and an edge in the virtual image are three-dimensionally projected onto real space is less than a predetermined distance in real space, with respect to edges that are not overlapping; and
the position of the moving body being estimated based on the evaluation value for each of the particles.
US15/031,295 2013-11-13 2014-09-05 Moving body position estimation device and moving body position estimation method Expired - Fee Related US9424649B1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2013-234516 2013-11-13
JP2013234516 2013-11-13
PCT/JP2014/073432 WO2015072217A1 (en) 2013-11-13 2014-09-05 Moving body position estimation device and moving body position estimation method

Publications (2)

Publication Number Publication Date
US20160239973A1 true US20160239973A1 (en) 2016-08-18
US9424649B1 US9424649B1 (en) 2016-08-23

Family

ID=53057162

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/031,295 Expired - Fee Related US9424649B1 (en) 2013-11-13 2014-09-05 Moving body position estimation device and moving body position estimation method

Country Status (8)

Country Link
US (1) US9424649B1 (en)
EP (1) EP3070430B1 (en)
JP (1) JP6112221B2 (en)
CN (1) CN105723180B (en)
BR (1) BR112016010089B1 (en)
MX (1) MX355313B (en)
RU (1) RU2621480C1 (en)
WO (1) WO2015072217A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170123425A1 (en) * 2015-10-09 2017-05-04 SZ DJI Technology Co., Ltd Salient feature based vehicle positioning
US20170161571A1 (en) * 2015-12-03 2017-06-08 GM Global Technology Operations LLC Snow covered path of travel surface condition detection
US9849591B2 (en) * 2015-10-02 2017-12-26 X Development Llc Localization of a robot in an environment using detected edges of a camera image from a camera of the robot and detected edges derived from a three-dimensional model of the environment
GB2568286A (en) * 2017-11-10 2019-05-15 Horiba Mira Ltd Method of computer vision based localisation and navigation and system for performing the same

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2760166B2 (en) 1991-03-15 1998-05-28 住友金属工業株式会社 Hot eddy current flaw detection method for wires
JP2606043Y2 (en) 1993-11-15 2000-09-11 三菱重工業株式会社 Eddy current flaw detector
US9727793B2 (en) * 2015-12-15 2017-08-08 Honda Motor Co., Ltd. System and method for image based vehicle localization
JP6782903B2 (en) * 2015-12-25 2020-11-11 学校法人千葉工業大学 Self-motion estimation system, control method and program of self-motion estimation system
JP6795379B2 (en) * 2016-03-10 2020-12-02 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Operation control device, operation control method and operation control program
JP6552448B2 (en) * 2016-03-31 2019-07-31 株式会社デンソーアイティーラボラトリ Vehicle position detection device, vehicle position detection method, and computer program for vehicle position detection
JP6929183B2 (en) * 2017-09-29 2021-09-01 株式会社デンソーテン Radar device and target detection method
WO2019111702A1 (en) * 2017-12-05 2019-06-13 ソニー株式会社 Information processing device, information processing method, and program
CN109935108A (en) * 2017-12-18 2019-06-25 姜鹏飞 A kind of traffic security early warning method of traffic control and device based on accurate location
US20200311455A1 (en) * 2019-03-27 2020-10-01 GM Global Technology Operations LLC Methods and systems for correcting sensor information
CN115004246B (en) * 2020-02-04 2025-06-03 发那科株式会社 Image processing device
RU2769918C1 (en) 2021-05-18 2022-04-08 Общество с ограниченной ответственностью "ЭвоКарго" Ground transport vehicle positioning method
WO2024185134A1 (en) * 2023-03-09 2024-09-12 日本電気株式会社 Camera orientation estimating device, camera orientation estimating method, and computer-readable recording medium

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8350850B2 (en) * 2008-03-31 2013-01-08 Microsoft Corporation Using photo collections for three dimensional modeling
JP5111210B2 (en) * 2008-04-09 2013-01-09 キヤノン株式会社 Image processing apparatus and image processing method
JP5297727B2 (en) 2008-09-04 2013-09-25 トヨタ自動車株式会社 Robot apparatus and object position / orientation estimation method
US8385591B1 (en) * 2009-04-28 2013-02-26 Google Inc. System and method of using images to determine correspondence between locations
US8164543B2 (en) * 2009-05-18 2012-04-24 GM Global Technology Operations LLC Night vision on full windshield head-up display
JP5512258B2 (en) * 2009-12-25 2014-06-04 本田技研工業株式会社 Orientation measuring apparatus, orientation measuring system, orientation measuring method, and orientation measuring program
WO2011153624A2 (en) * 2010-06-11 2011-12-15 Ambercore Software Inc. System and method for manipulating data having spatial coordinates
JP5703801B2 (en) * 2011-02-04 2015-04-22 富士通株式会社 Robot, position estimation method and program
JP2012243051A (en) * 2011-05-19 2012-12-10 Fuji Heavy Ind Ltd Environment recognition device and environment recognition method
WO2012172870A1 (en) * 2011-06-14 2012-12-20 日産自動車株式会社 Distance measurement device and environment map generation apparatus
US8704882B2 (en) * 2011-11-18 2014-04-22 L-3 Communications Corporation Simulated head mounted display system and method
WO2013133129A1 (en) * 2012-03-06 2013-09-12 日産自動車株式会社 Moving-object position/attitude estimation apparatus and method for estimating position/attitude of moving object
JP5867176B2 (en) * 2012-03-06 2016-02-24 日産自動車株式会社 Moving object position and orientation estimation apparatus and method
EP2662828B1 (en) * 2012-05-11 2020-05-06 Veoneer Sweden AB A vision system and method for a motor vehicle
JP6079076B2 (en) * 2012-09-14 2017-02-15 沖電気工業株式会社 Object tracking device and object tracking method
JP6197388B2 (en) * 2013-06-11 2017-09-20 富士通株式会社 Distance measuring device, distance measuring method, and program
WO2015159547A1 (en) * 2014-04-18 2015-10-22 日本電気株式会社 Information processing system, control method, and program recording medium

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9849591B2 (en) * 2015-10-02 2017-12-26 X Development Llc Localization of a robot in an environment using detected edges of a camera image from a camera of the robot and detected edges derived from a three-dimensional model of the environment
US20170123425A1 (en) * 2015-10-09 2017-05-04 SZ DJI Technology Co., Ltd Salient feature based vehicle positioning
US10599149B2 (en) * 2015-10-09 2020-03-24 SZ DJI Technology Co., Ltd. Salient feature based vehicle positioning
US20170161571A1 (en) * 2015-12-03 2017-06-08 GM Global Technology Operations LLC Snow covered path of travel surface condition detection
US10013617B2 (en) * 2015-12-03 2018-07-03 Gm Global Technology Operations Snow covered path of travel surface condition detection
GB2568286A (en) * 2017-11-10 2019-05-15 Horiba Mira Ltd Method of computer vision based localisation and navigation and system for performing the same
WO2019092418A1 (en) * 2017-11-10 2019-05-16 Horiba Mira Limited Method of computer vision based localisation and navigation and system for performing the same
GB2568286B (en) * 2017-11-10 2020-06-10 Horiba Mira Ltd Method of computer vision based localisation and navigation and system for performing the same
US11393216B2 (en) 2017-11-10 2022-07-19 Horiba Mira Limited Method of computer vision based localisation and navigation and system for performing the same

Also Published As

Publication number Publication date
US9424649B1 (en) 2016-08-23
JP6112221B2 (en) 2017-04-12
JPWO2015072217A1 (en) 2017-03-16
BR112016010089A2 (en) 2020-11-10
RU2621480C1 (en) 2017-06-06
EP3070430B1 (en) 2019-08-14
CN105723180B (en) 2017-08-15
MX355313B (en) 2018-04-16
EP3070430A4 (en) 2017-01-11
CN105723180A (en) 2016-06-29
BR112016010089B1 (en) 2021-06-08
MX2016005904A (en) 2016-07-13
EP3070430A1 (en) 2016-09-21
WO2015072217A1 (en) 2015-05-21

Similar Documents

Publication Publication Date Title
US9424649B1 (en) Moving body position estimation device and moving body position estimation method
WO2021093240A1 (en) Method and system for camera-lidar calibration
US9741130B2 (en) Method and apparatus for detecting object
CN113156421A (en) Obstacle detection method based on information fusion of millimeter wave radar and camera
CN112292711A (en) Correlating LIDAR data and image data
US20100315505A1 (en) Object motion detection system based on combining 3d warping techniques and a proper object motion detection
KR101551026B1 (en) Method of tracking vehicle
JP5834933B2 (en) Vehicle position calculation device
WO2020154990A1 (en) Target object motion state detection method and device, and storage medium
CN105981086B (en) Self-position computing device and self-position computational methods
KR102167835B1 (en) Apparatus and method of processing image
Erbs et al. Moving vehicle detection by optimal segmentation of the dynamic stixel world
JP2022045947A5 (en)
CN111553342B (en) A visual positioning method, device, computer equipment and storage medium
CN115494856B (en) Obstacle avoidance method, device, drone and electronic equipment
CN111260709B (en) Ground-assisted visual odometer method for dynamic environment
JP7315216B2 (en) Corrected Distance Calculation Device, Corrected Distance Calculation Program, and Corrected Distance Calculation Method
Ibisch et al. Towards highly automated driving in a parking garage: General object localization and tracking using an environment-embedded camera system
CN117408935A (en) Obstacle detection method, electronic device, and storage medium
Catalin et al. Object tracking from stereo sequences using particle filter
KR101980899B1 (en) Apparatus for detecting of inside wall frame in single image using orthogonal vanishing points and method thereof
JP5891802B2 (en) Vehicle position calculation device
JP5903901B2 (en) Vehicle position calculation device
CN110488320B (en) A Method of Using Stereo Vision to Detect Vehicle Distance
Pfeiffer et al. Ground truth evaluation of the Stixel representation using laser scanners

Legal Events

Date Code Title Description
AS Assignment

Owner name: NISSAN MOTOR CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TANAKA, SHINYA;REEL/FRAME:038348/0919

Effective date: 20160331

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20240823