WO2012086029A1 - Autonomous movement system - Google Patents
- Publication number
- WO2012086029A1 (PCT/JP2010/073131)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- prediction error
- unit
- autonomous mobile
- self
- shape model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
Definitions
- The present invention relates to an autonomous mobile system that moves an autonomous mobile body to a destination while estimating its self-position based on data from a measuring device.
- Prior-art autonomous mobile systems are provided with a map storing the shapes of real-environment objects that can be used as landmarks for self-position estimation.
- Self-position estimation is realized by matching (aligning) this map with the shapes of surrounding objects in the real environment, measured with a device such as a laser scanner.
- Even when the autonomous mobile system believes it is following the target route, it may actually deviate from that route.
- As a result, the system may collide with obstacles (static obstacles such as buildings, roadside trees, and utility poles, or dynamic obstacles such as pedestrians and other vehicles) and be damaged, or may take a wrong turn at an intersection.
- In the worst case, the autonomous mobile system loses its self-position entirely, making it difficult to continue the subsequent movement.
- An object of the present invention is to provide an autonomous mobile system that reaches its destination by planning a route along which it can move appropriately without losing track of its self-position.
- The above object is achieved, in an autonomous mobile system in which an autonomous mobile body travels to a destination while estimating its self-position by referring to a map based on measurement data from a peripheral object shape measurement unit, by providing: a travel environment shape model input unit for inputting a shape model of the travel environment; a self-position error prediction unit that calculates a prediction error of self-position estimation based on information from the travel environment shape model input unit; a prediction error map generation storage unit that generates and stores a prediction error map associating the prediction error with the area in which the autonomous mobile body can travel; and a route planning unit that plans a route based on the prediction error, obtained by referring to the prediction error map, corresponding to the route the autonomous mobile body travels.
- Preferably, the travel environment shape model input unit, the self-position error prediction unit, and the prediction error map generation storage unit are installed in a management system, the route planning unit is installed in a terminal system, and the management system and the terminal system are connected by a wireless network.
- Preferably, the prediction error map generated and stored by the prediction error map generation storage unit divides the travelable area into a plurality of small areas and holds the prediction error corresponding to each small area.
- Preferably, each small area holds the prediction error corresponding to each of a plurality of azimuths.
- Preferably, the prediction error map generation storage unit extracts shapes of a predetermined height from the shape model input to the travel environment shape model input unit, and the self-position error prediction unit calculates the prediction error using the shapes extracted by the prediction error map generation storage unit.
- According to the present invention, it is possible to provide an autonomous mobile system that reaches its destination by planning a route along which it can move appropriately without losing track of its self-position.
- Road structures such as buildings, roadside trees, and utility poles, and a travelable area (road), according to one example.
- FIG. 6 is a plan view of coordinate-transformed landmark data and the environmental shape model according to the present embodiment.
- A plan view of the measurement range from which landmark data is extracted at a certain node, together with the environmental shape model, according to the present embodiment.
- FIG. 1 is a block diagram of an autonomous mobile system according to an embodiment of the present invention.
- an autonomous mobile system 1 is mounted on a vehicle v such as an autonomous mobile robot or an autonomous mobile vehicle and provides an autonomous mobile function.
- The autonomous mobile system 1 comprises a traveling environment shape model input unit 2, a prediction error map generation storage unit 3, a self-position error prediction unit 4, a landmark map storage unit 5, a peripheral object shape measurement unit 6, a self-position estimation unit 7, a destination input unit 8, a route planning unit 9, and a traveling unit 10.
- The traveling environment shape model input unit 2 inputs a shape model of the traveling environment, such as buildings in a city and road structures such as roadside trees and utility poles.
- The environmental shape model m to be input may be of the kind used in recent car navigation systems, or may use CityGML (http://www.citygml.org/), which is standardized worldwide by the Open Geospatial Consortium (OGC).
- These environmental shape models m contain information such as the shape and texture of road structures represented by polygons, together with attribute information such as sidewalk/roadway classification.
- the prediction error map generation storage unit 3 is a part that generates and stores a prediction error map in the travelable region using the environment shape model m input from the travel environment shape model input unit 2.
- processing is performed in cooperation with the self-position error prediction unit 4 (specific processing contents will be described later).
- the self-position error prediction unit 4 is a part that predicts in advance a self-position estimation error that occurs at each position in the travelable area on the map by using a geometric calculation method. Data necessary for the calculation is acquired from the prediction error map generation storage unit 3, and the calculated prediction error of the self-position estimation is returned to the prediction error map generation storage unit 3 (specific processing contents will be described later).
- the landmark map storage unit 5 is a part that extracts a portion that can be used as a landmark (mark) from the environmental shape model m input from the traveling environment shape model input unit 2 and stores it as a landmark map. If a landmark map can be acquired separately in advance, the separately acquired landmark map may be stored without extracting the landmark from the environmental shape model m.
- The peripheral object shape measurement unit 6 measures the shapes of objects existing around the vehicle v (buildings, roadside trees, utility poles, people, other vehicles).
- As the peripheral object shape measurement unit 6, a laser scanner, a stereo camera, a Time-of-Flight (ToF) range-image camera, or the like can be used.
- The self-position estimation unit 7 detects portions that can serve as landmarks (marks) from the shapes of surrounding objects measured at the current position by the peripheral object shape measurement unit 6, and estimates the current self-position on the map by matching (aligning) them against the landmark map stored in the landmark map storage unit 5.
- For this matching, the method described in the book "Probabilistic Robotics" (authors: Sebastian Thrun, Wolfram Burgard, and Dieter Fox; publisher: The MIT Press; publication year: 2005) can be used.
- the destination input unit 8 is a part where the user of the vehicle v inputs a destination to the autonomous mobile system 1. For example, as in the car navigation system, the user inputs a destination by operating a touch panel displaying a map.
- The route planning unit 9 plans a route from the current self-position estimated by the self-position estimation unit 7 to the destination input from the destination input unit 8, using the prediction error of self-position estimation as the cost.
- the prediction error map stored in the prediction error map generation storage unit 3 is used (specific processing contents will be described later).
- The traveling unit 10 drives the wheels of the vehicle v on which the autonomous mobile system 1 is mounted, and controls the vehicle v so that it travels along the route planned by the route planning unit 9. Instead of wheels, movement by crawlers or legs may also be used.
- FIG. 2 is a block diagram of another form of the autonomous mobile system according to one embodiment.
- In this form, the traveling environment shape model input unit 2, the prediction error map generation storage unit 3, the self-position error prediction unit 4, and the landmark map storage unit 5 of the autonomous mobile system 1 are installed not on the vehicle v but in a management system 1a.
- A terminal system 1b is installed in the vehicle v, and the autonomous mobile system 1 is configured by combining the management system 1a and the terminal system 1b.
- Each component of the autonomous mobile system 1 is the same as in the embodiment of FIG. 1, except that communication between the prediction error map generation storage unit 3 and the route planning unit 9, and between the landmark map storage unit 5 and the self-position estimation unit 7, is performed via a wireless network.
- FIG. 3 is a plan view in which a travelable area is extracted from an environmental shape model according to an embodiment and divided by a grid.
- First, a travelable area (road) is extracted from the environmental shape model m input from the traveling environment shape model input unit 2.
- the top view of FIG. 3 shows a plan view of the result of extracting the travelable area from the environmental shape model m.
- Since the environmental shape model m has not only shape and texture but also attribute information such as sidewalk/roadway classification, the travelable area can be extracted from this attribute information; for example, for an autonomous mobile robot that travels on sidewalks, the travelable area is obtained by extracting the sidewalk portions.
- Next, the travelable area is divided by a grid 11 or the like so that it can be represented discretely.
- The lower part of FIG. 3 shows an example in which the travelable area is divided by a grid 11 of uniform size; in practice, for example, 1 m square cells are used.
- The division of the travelable area is not limited to a uniform grid 11; the methods described in the book "Introduction to Intelligent Robots: Solution of Motion Planning Problems" (authors: Junjun Ohta, Daisuke Kurabayashi, Tomio Arai; publisher: Corona; publication year: 2001) may also be used.
- FIG. 4 is a plan view showing a result of constructing a graph structure with nodes and links by dividing a travelable area according to an embodiment by a grid.
- a node 12 is generated at the center of each divided grid 11.
- For the azimuth θ, one full turn of 360 deg is divided at regular intervals, and a plurality of nodes 12 are generated in each grid cell.
- In FIG. 4, 360 deg is divided into eight sectors of 45 deg each, generating eight nodes 12 per cell. Each node 12 is therefore a node in the three-dimensional (x, y, θ) space.
- This is because the measurement range 14 of the peripheral object shape measurement unit 6 generally cannot cover the entire circumference, so the measurable region changes depending on the azimuth θ.
- Some peripheral object shape measurement units 6, such as omnidirectional stereo cameras, can cover the entire circumference, but many, such as laser scanners, cannot; it is therefore effective for each node 12 to carry the azimuth information θ.
- Further, a graph structure is constructed by connecting each node to its 8 neighbors, as seen in the two-dimensional (x, y) plane, with links 13.
- This graph structure is the basic data structure used by the route planning unit 9.
- There is also a form of the route planning unit 9 that does not use the links 13 (details are given in the description of the route planning unit 9).
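The grid discretization and 8-neighbor graph described above can be sketched as follows (a minimal illustration; the cell size, the eight 45 deg azimuth sectors, and the container types are assumptions, not taken from the patent text):

```python
import itertools

def build_graph(travelable_cells, cell_size=1.0, n_azimuths=8):
    """Build (x, y, theta) nodes and 8-neighbor links over a travelable area.

    travelable_cells: set of (i, j) grid indices marked travelable.
    Returns (nodes, links): nodes are (x, y, theta_deg) tuples at cell
    centers, links connect cells that are 8-neighbors in the (x, y) plane.
    """
    azimuths = [k * 360.0 / n_azimuths for k in range(n_azimuths)]  # e.g. every 45 deg
    nodes = [((i + 0.5) * cell_size, (j + 0.5) * cell_size, th)
             for (i, j) in travelable_cells for th in azimuths]
    links = []
    for (i, j) in travelable_cells:
        for di, dj in itertools.product((-1, 0, 1), repeat=2):
            if (di, dj) != (0, 0) and (i + di, j + dj) in travelable_cells:
                links.append(((i, j), (i + di, j + dj)))
    return nodes, links

# a 2x2 block of travelable 1 m cells
cells = {(0, 0), (0, 1), (1, 0), (1, 1)}
nodes, links = build_graph(cells)
# 4 cells x 8 azimuths = 32 nodes; each cell has 3 neighbors -> 12 directed links
```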
- the prediction error map generation storage unit 3 generates data for passing to the self-position error prediction unit 4.
- the self-position error prediction unit 4 generates data necessary for calculation for each node 12 in order to predict an error of self-position estimation for each node 12.
- FIG. 5 is a plan view of an environmental shape model having information on road structures such as buildings, roadside trees, utility poles, and travelable areas (roads) according to an embodiment.
- FIG. 6 is a diagram illustrating an example of extracting landmark data from measurement data in a region higher than a person according to the present embodiment.
- the node 12a in FIG. 5 will be described as an example.
- First, measurement data is generated by simulating what the peripheral object shape measurement unit 6 would measure of the objects existing within its measurement range 14 if placed at the position (x, y, θ) of the node 12a.
- When a laser scanner is used as the peripheral object shape measurement unit 6, the measurement data can be simulated by ray casting, a computer graphics technique, to obtain the reflection points of the lasers emitted from the scanner. The simulated measurement data may be thinned out evenly; for example, points are thinned to 0.1 m intervals.
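A simplified 2D version of this ray-casting simulation might look like the following (a sketch under assumed geometry: the environment is reduced to wall line segments, and the ray count and maximum range are illustrative, not taken from the patent):

```python
import math

def ray_segment_hit(ox, oy, dx, dy, ax, ay, bx, by):
    """Return distance t along ray (origin o, unit direction d) to segment a-b, or None."""
    ex, ey = bx - ax, by - ay
    denom = dx * ey - dy * ex
    if abs(denom) < 1e-12:          # ray parallel to segment
        return None
    t = ((ax - ox) * ey - (ay - oy) * ex) / denom   # distance along the ray
    s = ((ax - ox) * dy - (ay - oy) * dx) / denom   # parameter along the segment
    return t if t > 0 and 0.0 <= s <= 1.0 else None

def simulate_scan(pose, segments, n_rays=360, max_range=30.0):
    """Simulate laser reflection points from pose (x, y, theta_rad) against wall segments."""
    x, y, th = pose
    points = []
    for k in range(n_rays):
        ang = th + 2.0 * math.pi * k / n_rays
        dx, dy = math.cos(ang), math.sin(ang)
        hits = [t for seg in segments
                if (t := ray_segment_hit(x, y, dx, dy, *seg)) is not None and t <= max_range]
        if hits:
            t = min(hits)           # keep the nearest reflection point
            points.append((x + t * dx, y + t * dy))
    return points

# one wall 5 m ahead along +x, spanning y = -10 .. +10
pts = simulate_scan((0.0, 0.0, 0.0), [(5.0, -10.0, 5.0, 10.0)])
```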
- Next, landmark data z to be used as landmarks is extracted from the simulated measurement data.
- Specifically, the landmark data z is extracted by selecting points at a predetermined height using the height information of each measurement point (for example, a region higher than a person, as in FIG. 6).
- data used as a landmark may be extracted from the result of clustering measurement data or extracting features.
- the above processing is performed for each node 12 and the landmark data z is extracted for each node 12.
- the landmark data z extracted here and the environment shape model m input from the traveling environment shape model input unit 2 are data used by the self-position error prediction unit 4 for calculation.
- FIG. 7 is a flowchart illustrating a processing procedure of the self-position error prediction unit according to the embodiment.
- For each node 12, the self-position error prediction unit 4 obtains the prediction error of self-position estimation as a covariance matrix Σ by geometric calculation using the landmark data z and the environmental shape model m.
- The basic idea is to move the landmark data z by small amounts and evaluate the degree of overlap with the environmental shape model m at each displacement, thereby obtaining the nature of the matching error (deviation), from which the covariance matrix Σ is calculated.
- If the degree of overlap with the environmental shape model m changes greatly when the landmark data z is moved, the shape of the landmark data z is highly distinctive, and the matching error is consequently small. Conversely, if the degree of overlap changes little when the landmark data z is moved, the distinctiveness is low, and the matching error is consequently large. Moreover, because how the overlap changes depends on the direction of movement, the nature of the error can be obtained not as a single value but as a distribution, represented by the covariance matrix Σ.
- First, a movement amount (x_T, y_T, θ_T) by which to move the landmark data z is determined (S11).
- The maximum and minimum movement amounts are set in advance, and the range between them is divided at regular intervals.
- The maximum and minimum values may be determined based on the magnitude of the error of the self-position estimation unit 7. For example, for x_T and y_T the range from -2.5 m to +2.5 m is divided into 10 steps of 0.5 m each, and for θ_T the range from -60 deg to +60 deg is divided into 12 steps of 10 deg each. In this case there are 10 × 10 × 12 = 1200 combinations of movement amounts.
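The enumeration of movement amounts in this example can be written directly (the sampling convention, half-open ranges of 10, 10, and 12 samples, is an assumption chosen to reproduce the 1200 combinations stated above):

```python
import itertools

# x_T, y_T: -2.5 m to +2.5 m in 0.5 m steps (10 samples each, half-open range)
xs = [-2.5 + 0.5 * i for i in range(10)]
ys = [-2.5 + 0.5 * i for i in range(10)]
# theta_T: -60 deg to +60 deg in 10 deg steps (12 samples, half-open range)
thetas = [-60 + 10 * i for i in range(12)]

displacements = list(itertools.product(xs, ys, thetas))
print(len(displacements))  # 10 * 10 * 12 = 1200 combinations
```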
- Next, the landmark data z is coordinate-transformed by the movement amount (x_T, y_T, θ_T) (S12).
- Let T be the homogeneous transformation matrix representing the coordinate transformation by the movement amount (x_T, y_T, θ_T); the landmark data after the coordinate transformation is then written Tz.
- Next, the degree of overlap between the coordinate-transformed landmark data Tz and the environmental shape model m is calculated as the matching evaluation value D_T (S13).
- For this calculation, Equation (1) and Equation (2) are used.
- First, the correspondence c between the point cloud of the landmark data Tz and the polygon vertices of the environmental shape model m is obtained by the nearest-neighbor search of Equation (1).
- Here, N is the number of points in the landmark data Tz.
- Next, using the obtained correspondence c, the matching evaluation value D_T is obtained as the root mean square of the distances between corresponding points by Equation (2).
- Here, n is the unit normal vector of the polygon surface of the environmental shape model m.
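Equations (1) and (2) are not reproduced in this text, but a nearest-neighbor correspondence followed by a point-to-plane root mean square, matching the description above, could be sketched as follows (a hedged reconstruction; the brute-force search and the data layout are our assumptions):

```python
import numpy as np

def matching_evaluation(Tz, vertices, normals):
    """Matching evaluation value D_T per the description of Eqs. (1)-(2).

    Tz:       (N, 2) transformed landmark points.
    vertices: (M, 2) polygon vertices of the environmental shape model m.
    normals:  (M, 2) unit normals n of the polygon surface at each vertex.
    """
    # Eq. (1): nearest-neighbor search for the correspondence c
    d2 = ((Tz[:, None, :] - vertices[None, :, :]) ** 2).sum(axis=2)
    c = d2.argmin(axis=1)
    # Eq. (2): root mean square of the point-to-plane distances n . (p - q)
    residuals = ((Tz - vertices[c]) * normals[c]).sum(axis=1)
    return float(np.sqrt((residuals ** 2).mean()))

# points 0.1 m off a wall along the y-axis whose normals point in +x
Tz = np.array([[0.1, 0.0], [0.1, 1.0]])
wall = np.array([[0.0, 0.0], [0.0, 1.0]])
n = np.array([[1.0, 0.0], [1.0, 0.0]])
D_T = matching_evaluation(Tz, wall, n)  # 0.1: every point is 0.1 m off the wall plane
```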
- Next, the likelihood L_T is calculated from the matching evaluation value D_T (S14).
- The likelihood L_T here approximates, by a normal distribution, the likelihood that the landmark data z coordinate-transformed by the movement amount (x_T, y_T, θ_T) overlaps the environmental shape model m.
- The likelihood L_T is calculated using, for example, Equation (3).
- Here, σ is a constant representing the nature of the error of the peripheral object shape measurement unit 6.
- A specific value of σ may be determined based on the magnitude of the error of the peripheral object shape measurement unit 6.
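Equation (3) itself is not shown in this text; a Gaussian of the form below is one natural reading of "approximating by a normal distribution" (the exact normalization, the exponent, and the value of σ are assumptions):

```python
import math

def likelihood(D_T, sigma=0.05):
    """Likelihood L_T from the matching evaluation value D_T (assumed Gaussian form).

    sigma is a constant reflecting the error of the peripheral object shape
    measurement unit 6; the value 0.05 m is purely illustrative.
    """
    return math.exp(-(D_T ** 2) / (2.0 * sigma ** 2))

L_small = likelihood(0.01)  # small D_T (good overlap): likelihood near 1
L_large = likelihood(0.5)   # large D_T (poor overlap): likelihood near 0
```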
- The calculation of the likelihood L_T is performed for all combinations of movement amounts (x_T, y_T, θ_T) (S15); in the above example there are 1200 combinations.
- Next, the covariance matrix Σ is calculated by the weighted least-squares method of Equation (4), weighting each movement amount (x_T, y_T, θ_T) by its likelihood L_T over all combinations.
- Here, Σ_T is the mathematical symbol indicating a sum over the movement amounts.
- With this, the processing of the self-position error prediction unit 4 is finished (S17, S18). The prediction error of self-position estimation at each node 12 in the travelable area is thereby obtained as a covariance matrix Σ, and the obtained prediction errors are returned to the prediction error map generation storage unit 3.
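Equation (4) is described only as a likelihood-weighted least-squares fit over all displacements; one consistent reconstruction is the weighted second-moment estimate below (the zero-mean assumption about the unmoved, matched pose and the normalization are ours):

```python
import numpy as np

def covariance_from_likelihoods(displacements, likelihoods):
    """Covariance matrix Sigma from likelihood-weighted displacements (assumed Eq. (4) form).

    displacements: (K, 3) array of (x_T, y_T, theta_T).
    likelihoods:   (K,) array of L_T.
    Computes the weighted second moment about zero displacement.
    """
    d = np.asarray(displacements, dtype=float)
    w = np.asarray(likelihoods, dtype=float)
    return (w[:, None, None] * d[:, :, None] * d[:, None, :]).sum(axis=0) / w.sum()

# overlap degrades fast in x but slowly in y -> small x-variance, large y-variance
disp = [(0.5, 0.0, 0.0), (-0.5, 0.0, 0.0), (0.0, 0.5, 0.0), (0.0, -0.5, 0.0), (0.0, 0.0, 0.0)]
like = [0.01, 0.01, 0.9, 0.9, 1.0]
Sigma = covariance_from_likelihoods(disp, like)
# Sigma[1, 1] (y) exceeds Sigma[0, 0] (x): matching is ambiguous along y
```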
- The prediction error map generation storage unit 3 can thus generate and store a prediction error map whose data structure associates each node 12 in the travelable area with the prediction error of self-position estimation at that node.
- FIG. 8 is a plan view of a measurement range and an environmental shape model obtained by extracting landmark data at a certain node according to one embodiment.
- the processing from S11 to S14 in the flowchart of the self-position error prediction unit 4 shown in FIG. 7 in such a situation will be described as an example.
- First, suppose the movement amount (x_T, y_T, θ_T) is determined to be (+1.0 m, +2.0 m, 0 deg) (S11).
- the landmark data z1 extracted from the measurement range 14 is coordinate-converted with the movement amount (S12).
- FIG. 9 is a plan view of the landmark data subjected to coordinate conversion and the environmental shape model according to the present embodiment.
- the landmark data Tz1 subjected to coordinate conversion and the environmental shape model m do not overlap (the deviation is large).
- That is, moving the landmark data z1 greatly changes its degree of overlap with the environmental shape model m. When the degree of overlap with the environmental shape model m changes greatly as the landmark data z1 is moved, the shape of the landmark data z1 is highly distinctive, and the matching error in that direction of movement is accordingly small.
- a matching evaluation value DT is calculated in order to quantitatively determine the degree of overlap between the coordinate-converted landmark data Tz1 and the environmental shape model m (S13).
- In this case, since the landmark data Tz1 and the environmental shape model m do not overlap, the matching evaluation value D_T, the root mean square of the distances between corresponding points, takes a large value.
- Since the likelihood L_T approximates the likelihood of overlap by a normal distribution, a large matching evaluation value D_T (low degree of overlap) yields a small likelihood L_T.
- FIG. 10 shows a plan view of the measurement range 14b from which the landmark data z is extracted and the environmental shape model m, as seen from above, in a certain node 12c in a situation different from that in FIG.
- FIG. 11 is a plan view of the landmark data subjected to coordinate conversion and the environmental shape model according to one embodiment.
- In this case, the coordinate-transformed landmark data Tz2 and the environmental shape model m largely overlap (the deviation is small): there is a deviation in the x direction, but almost none in the y direction.
- That is, moving the landmark data z2 changes its degree of overlap with the environmental shape model m only slightly. When the degree of overlap with the environmental shape model m changes little as the landmark data z2 is moved, the shape of the landmark data z2 has low distinctiveness, and the matching error in that direction of movement is accordingly large.
- a matching evaluation value DT is calculated in order to quantitatively determine the degree of overlap between the landmark data Tz2 subjected to coordinate conversion and the environmental shape model m (S13).
- In this case, since the landmark data Tz2 and the environmental shape model m largely overlap, the matching evaluation value D_T, the root mean square of the distances between corresponding points, takes a small value.
- Since the likelihood L_T approximates the likelihood of overlap by a normal distribution, a small matching evaluation value D_T (high degree of overlap) yields a large likelihood L_T.
- the route plan unit 9 plans a route using the prediction error map stored in the prediction error map generation storage unit 3.
- the prediction error map has a data structure of each node 12 obtained by dividing the travelable area shown in FIG. 4 by the grid 11 and a prediction error of self-position estimation associated with each node 12.
- Each node 12 is a node in the three-dimensional (x, y, θ) space, so a plurality of nodes 12 with different azimuths θ exist at the same location as seen in the two-dimensional (x, y) plane.
- At least two forms of the route planning unit 9 are conceivable.
- The first is a form in which route planning is performed with a shortest-path search algorithm from graph theory on the graph structure constructed by connecting the nodes with links 13 as described above.
- the shortest path search is performed using the prediction error of the self-position estimation of each node 12 as a cost.
- Because the travelable area is represented discretely in the three-dimensional (x, y, θ) configuration space, a target route within the area can be obtained.
- The current self-position estimated by the self-position estimation unit 7 is set as the start position, the destination input from the destination input unit 8 is set as the goal position, and route planning is performed using the prediction error of self-position estimation at each node 12 as the cost.
- When the route planning unit 9 calculates this cost, the covariance matrices Σ held by the plurality of nodes 12 through which the route passes are fused by maximum likelihood estimation to obtain the covariance at the time of passing each node 12.
- Alternatively, the covariance when traveling along the route may be simulated using the motion model described in the book "Probabilistic Robotics" (authors: Sebastian Thrun, Wolfram Burgard, and Dieter Fox; publisher: The MIT Press; publication year: 2005), and the cost obtained by fusing this with the covariance matrix Σ of each node 12 by maximum likelihood estimation.
- By using the prediction error represented by the fused covariance as the cost of the route plan and determining whether the covariance at the time of passing each node 12 exceeds a predetermined threshold, a route to the destination can be planned so as not to pass through places where the prediction error becomes large. To determine whether the covariance exceeds the predetermined threshold, the elements of the matrix may be compared, or the eigenvalues of the matrix may be compared.
- That is, instead of each node 12 holding its cost as a scalar value and a route being evaluated by the sum of the costs of the nodes 12 it passes through, each node 12 represents its cost as a covariance matrix Σ, and the route is evaluated by the covariance at the time it passes through each node 12. This makes it possible to plan a route that takes into account differences in the nature (magnitude and direction of deviation) of the self-position estimation error at each position.
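The route evaluation described above, rejecting nodes whose prediction-error covariance exceeds a threshold, might be sketched as follows (a minimal illustration: the toy graph, the unit link cost, and the use of the stored per-node covariance Σ in place of the fused covariance are our simplifications; the maximum-eigenvalue test is one of the two threshold comparisons the text allows):

```python
import heapq
import numpy as np

def plan_route(start, goal, neighbors, node_cov, eig_threshold):
    """Shortest route that avoids nodes whose prediction-error covariance is too large.

    neighbors: dict node -> iterable of adjacent nodes (links 13).
    node_cov:  dict node -> prediction-error covariance matrix Sigma.
    A node is admissible if the largest eigenvalue of its Sigma is below the threshold.
    """
    def ok(nd):
        return np.linalg.eigvalsh(node_cov[nd]).max() <= eig_threshold

    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return path[::-1]
        if d > dist.get(u, float("inf")):
            continue
        for v in neighbors[u]:
            if not ok(v):
                continue  # prediction error too large: do not pass through this node
            nd = d + 1.0  # unit cost per link between neighboring cells
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return None  # no admissible route exists

# corridor A - B - C where B has a large prediction error, plus a detour D
small = np.diag([0.01, 0.01])
large = np.diag([4.0, 0.01])
nbrs = {"A": ["B", "D"], "B": ["A", "C"], "C": ["B", "D"], "D": ["A", "C"]}
cov = {"A": small, "B": large, "C": small, "D": small}
route = plan_route("A", "C", nbrs, cov, eig_threshold=1.0)
# the planned route avoids B: ["A", "D", "C"]
```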
- As described above, the self-position estimation error at each position in the travelable area is predicted in advance, and a route to the destination can be planned so as not to pass through places where the prediction error becomes large.
- In this way, the present invention achieves the effect of providing an autonomous mobile system that reaches its destination by planning a route along which it can move appropriately without losing track of its self-position.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
The present invention relates to an autonomous mobile system that moves an autonomous mobile body to a destination while estimating its self-position based on data from a measuring device.

Conventionally, autonomous mobile systems are known that reach a destination by moving autonomously through the real environment along a planned route, while estimating their self-position on a map corresponding to the real environment based on data from measurement devices (external and internal sensors) mounted on the vehicle (see, for example, Patent Documents 1 to 3).

Prior-art autonomous mobile systems are provided with a map storing the shapes of real-environment objects that can be used as landmarks for self-position estimation. Self-position estimation is realized by matching (aligning) this map with the shapes of surrounding objects in the real environment, measured with a device such as a laser scanner.

However, since the measurement range of a device such as a laser scanner is limited, depending on the travel route it may be impossible to detect landmarks sufficient to estimate the self-position accurately, or only landmark shapes whose geometry leads to large matching errors (deviations) may be measurable; a large error can then arise between the self-position estimated by the autonomous mobile system and its actual position.

As a result, even when the autonomous mobile system recognizes that it is moving along the target route, it may actually deviate from that route. Consequently, the system may collide with obstacles (static obstacles such as buildings, roadside trees, and utility poles, or dynamic obstacles such as pedestrians and other vehicles) and be damaged, or may take a wrong turn at an intersection. There is also the problem that the system may lose its self-position entirely, making it difficult to continue moving.
本発明の目的は、自律移動システムが自己位置を見失うことなく、適切に移動することができる経路を計画して目的地に到達する自律移動システムを提供することにある。 An object of the present invention is to provide an autonomous mobile system that plans a route that allows the autonomous mobile system to move appropriately without losing sight of its own position and reaches the destination.
上記目的は、周辺物体形状計測部による計測データに基づいて地図を参照することで自己位置を推定しながら自律移動体が目的地まで走行する自律移動システムにおいて、走行環境の形状モデルを入力する走行環境形状モデル入力部と、この走行環境形状モデル入力部からの情報に基づき自己位置推定の予測誤差を計算する自己位置誤差予測部と、前記自律移動体が走行可能領域において前記予測誤差を対応付けた予測誤差地図を生成して記憶する予測誤差地図生成記憶部と、前記予測誤差地図を参照して前記自律移動体が走行する経路に対応した前記予測誤差に基づいて経路計画を行う経路計画部とを備えたことにより達成される。 The above-mentioned purpose is to input a travel environment shape model in an autonomous mobile system in which an autonomous mobile body travels to a destination while estimating its own position by referring to a map based on measurement data from a peripheral object shape measurement unit. An environmental shape model input unit, a self-position error prediction unit that calculates a prediction error of self-position estimation based on information from the driving environment shape model input unit, and the prediction error in an area where the autonomous mobile body can travel A prediction error map generation storage unit that generates and stores a predicted error map, and a route plan unit that performs route planning based on the prediction error corresponding to the route traveled by the autonomous mobile body with reference to the prediction error map This is achieved by providing
It is also preferable that the travel environment shape model input unit, the self-position error prediction unit, and the prediction error map generation and storage unit be installed in a management system, that the route planning unit be installed in a terminal system, and that the management system and the terminal system be connected by a wireless network.
It is also preferable that the prediction error map generated and stored by the prediction error map generation and storage unit divide the travelable area into a plurality of small areas and hold the prediction error corresponding to each small area.
It is also preferable that each small area hold the prediction error corresponding to each of a plurality of directions.
It is also preferable that the prediction error map generation and storage unit extract shapes of a predetermined height from the shape model supplied to the travel environment shape model input unit, and that the self-position error prediction unit calculate the prediction error using the shapes extracted by the prediction error map generation and storage unit.
According to the present invention, it is possible to provide an autonomous mobile system that reaches its destination by planning a route along which it can move appropriately without losing track of its own position.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
First, the configuration of an autonomous mobile system according to an embodiment of the present invention will be described with reference to FIG. 1.
FIG. 1 is a block diagram of an autonomous mobile system according to an embodiment of the present invention.
In FIG. 1, an autonomous mobile system 1 is mounted on a vehicle v, such as an autonomous mobile robot or an autonomous vehicle, and provides an autonomous movement function. The autonomous mobile system 1 comprises a travel environment shape model input unit 2, a prediction error map generation and storage unit 3, a self-position error prediction unit 4, a landmark map storage unit 5, a surrounding-object shape measurement unit 6, a self-position estimation unit 7, a destination input unit 8, a route planning unit 9, and a travel unit 10.
Describing these components individually: the travel environment shape model input unit 2 receives a shape model of the travel environment, such as buildings in a city and road structures such as roadside trees and utility poles. The environment shape model m may be of the kind used in recent car navigation systems, or a format such as CityGML (http://www.citygml.org/), standardized internationally by the Open Geospatial Consortium (OGC). In addition to information such as the shapes and textures of road structures expressed as polygons, these environment shape models also carry attribute information such as whether an area is a sidewalk or a roadway.
The prediction error map generation and storage unit 3 uses the environment shape model m supplied by the travel environment shape model input unit 2 to generate and store a prediction error map over the travelable area. When generating the prediction error map, it works in cooperation with the self-position error prediction unit 4 (the specific processing is described later).
The self-position error prediction unit 4 uses a geometric calculation method to predict in advance the self-position estimation error that will arise at each position in the travelable area on the map. It obtains the data needed for the calculation from the prediction error map generation and storage unit 3 and returns the calculated prediction error of self-position estimation to that unit (the specific processing is described later).
The landmark map storage unit 5 extracts the portions of the environment shape model m supplied by the travel environment shape model input unit 2 that can be used as landmarks, and stores them as a landmark map. If a landmark map is separately available in advance, that map may be stored instead, without extracting landmarks from the environment shape model m.
The surrounding-object shape measurement unit 6 measures the shapes of objects around the vehicle v (road structures such as buildings, roadside trees, and utility poles, as well as people and other vehicles). For example, a laser scanner, a stereo camera, or a time-of-flight range image camera can be used.
The self-position estimation unit 7 detects landmark portions in the shapes of surrounding objects measured at the current position by the surrounding-object shape measurement unit 6, and estimates the current self-position on the map by matching (aligning) them against the landmark map stored in the landmark map storage unit 5.
For example, the methods described in the book "Probabilistic Robotics" (Sebastian Thrun, Wolfram Burgard, and Dieter Fox, The MIT Press, 2005) can be used.
The destination input unit 8 is where the user of the vehicle v inputs a destination into the autonomous mobile system 1. For example, as in a car navigation system, the user inputs the destination by operating a touch panel displaying a map.
The route planning unit 9 plans a route from the current self-position estimated by the self-position estimation unit 7 to the destination input via the destination input unit 8, using the prediction error of self-position estimation as a cost. It uses the prediction error map stored in the prediction error map generation and storage unit 3 (the specific processing is described later).
The travel unit 10 drives the wheels or other locomotion mechanism of the vehicle v on which the autonomous mobile system 1 is mounted, and controls the vehicle v so that it travels along the route planned by the route planning unit 9. Locomotion by crawlers or legs instead of wheels is also possible.
Next, another configuration of the autonomous mobile system 1 will be described with reference to FIG. 2.
FIG. 2 is a block diagram of another form of the autonomous mobile system according to an embodiment.
In FIG. 2, the travel environment shape model input unit 2, the prediction error map generation and storage unit 3, the self-position error prediction unit 4, and the landmark map storage unit 5 of the autonomous mobile system 1 are installed not on the vehicle v but in a management system 1a connected via a wireless network. A terminal system 1b is installed on the vehicle v, and the management system 1a and the terminal system 1b together constitute the autonomous mobile system 1. The components are the same as in the embodiment of FIG. 1, except that communication between the prediction error map generation and storage unit 3 and the route planning unit 9, and between the landmark map storage unit 5 and the self-position estimation unit 7, takes place over the wireless network.
Next, the specific processing of the prediction error map generation and storage unit 3 will be described with reference to FIGS. 3 to 6.
FIG. 3 is a plan view in which the travelable area is extracted from an environment shape model according to an embodiment and divided by a grid.
In FIG. 3, the travelable area (road) is first extracted from the environment shape model m supplied by the travel environment shape model input unit 2. The upper part of FIG. 3 shows a plan view of the travelable area extracted from the environment shape model m. Because the environment shape model m carries not only shapes and textures but also attribute information such as sidewalk/roadway, the travelable area can be extracted, for example, by selecting the sidewalk portions when the target is an autonomous mobile robot that travels on sidewalks.
The travelable area is then divided, for example by a grid 11, so that it can be represented discretely. The lower part of FIG. 3 shows an example divided by a grid 11 of uniform size; in practice, the travelable area may be divided by a grid 11 of, say, 1 m squares. The division is not limited to a uniform grid 11: hierarchical grid division using quadtrees, as described in the book "知能ロボット入門―動作計画問題の解法―" (Introduction to Intelligent Robots: Solving Motion Planning Problems; Jun Ota, Daisuke Kurabayashi, and Tamio Arai, Corona Publishing, 2001), may also be used.
FIG. 4 is a plan view showing the result of dividing the travelable area by a grid and constructing a graph structure of nodes and links, according to an embodiment.
In FIG. 4, a node 12 is generated at the center of each divided grid cell 11. However, multiple nodes 12 are generated per cell by dividing the full 360 deg of heading; for example, dividing 360 deg into 8 parts at 45 deg intervals yields 8 nodes 12 per cell. Each node 12 is therefore a node in the three-dimensional space (x, y, θ). Carrying the heading θ makes it possible to account for the fact that the measurement range 14 of the surrounding-object shape measurement unit 6 does not cover the full circumference, so the measurable region changes with θ. Except for some sensors such as omnidirectional stereo cameras, many surrounding-object shape measurement units 6 such as laser scanners cannot cover the full circumference, so having each node 12 carry the heading θ is effective.
Furthermore, a graph structure is constructed by connecting, with links 13, nodes that are 8-neighbors when viewed in the two-dimensional (x, y) plane. This graph structure is the underlying data structure used by the route planning unit 9. However, there is also an embodiment of the route planning unit 9 that does not use the links 13 (details are given in the description of the route planning unit 9).
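This node-and-link construction can be sketched as follows, assuming the travelable area is given as a set of free grid cells. The policy of linking every heading at one cell to every heading at a neighbouring cell is an illustrative assumption; the text does not specify how headings are connected across cells.

```python
from itertools import product

def build_pose_graph(free_cells, n_headings=8):
    """Build (x, y, theta) nodes over free grid cells and link 8-neighbours.

    free_cells: set of (ix, iy) grid indices inside the travelable area.
    Returns (nodes, links): nodes are (ix, iy, k) with heading index k,
    links are ordered pairs of nodes whose cells are 8-neighbours.
    """
    nodes = [(ix, iy, k) for (ix, iy) in free_cells for k in range(n_headings)]
    links = []
    # the 8 neighbouring offsets in the (x, y) plane
    neigh = [(dx, dy) for dx, dy in product((-1, 0, 1), repeat=2) if (dx, dy) != (0, 0)]
    for (ix, iy) in free_cells:
        for dx, dy in neigh:
            if (ix + dx, iy + dy) in free_cells:
                # assumed linking policy: every heading at the source cell
                # connects to every heading at the neighbouring cell
                for k1 in range(n_headings):
                    for k2 in range(n_headings):
                        links.append(((ix, iy, k1), (ix + dx, iy + dy, k2)))
    return nodes, links

cells = {(0, 0), (1, 0)}
nodes, links = build_pose_graph(cells)
print(len(nodes))  # 16 nodes: 2 cells x 8 headings
```

With a 1 m grid, the (ix, iy) indices map directly to the cell centers where the nodes 12 are placed.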
Next, the prediction error map generation and storage unit 3 generates the data to pass to the self-position error prediction unit 4. Since the self-position error prediction unit 4 predicts the self-position estimation error for each node 12, the data needed for the calculation is generated per node 12.
FIG. 5 is a plan view of an environment shape model carrying information on road structures such as buildings, roadside trees, and utility poles, and on the travelable area (road), according to an embodiment.
FIG. 6 is a diagram showing an example, according to this embodiment, of extracting landmark data from measurement data in the region higher than a person. In the following, the node 12a in FIG. 5 is used as an example.
In FIG. 5, measurement data is first generated by simulation for the objects within the measurement range 14 of the surrounding-object shape measurement unit 6 at the position (x, y, θ) of the node 12a. For example, when a laser scanner is used as the surrounding-object shape measurement unit 6, measurement data can be simulated by performing ray casting, a computer graphics technique, and finding the reflection points of the laser beams emitted by the scanner.
The simulated measurement data may be thinned out uniformly at this point, for example at 0.1 m intervals.
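The ray-casting simulation can be sketched in 2-D, assuming for simplicity that walls are given as line segments rather than the polygon meshes of the actual environment shape model; the sensor parameters (field of view, ray count, maximum range) are illustrative, not taken from the text.

```python
import math

def ray_cast(origin, heading, segments, fov=math.pi, n_rays=181, max_range=30.0):
    """Simulate a 2-D laser scan: for each ray from `origin`, return the
    nearest intersection with the wall `segments`, if within range."""
    ox, oy = origin
    hits = []
    for i in range(n_rays):
        ang = heading - fov / 2 + fov * i / (n_rays - 1)
        dx, dy = math.cos(ang), math.sin(ang)
        best = max_range
        for (x1, y1), (x2, y2) in segments:
            # solve origin + t*(dx, dy) == p1 + u*(p2 - p1), 0 <= u <= 1, t > 0
            ex, ey = x2 - x1, y2 - y1
            denom = dx * ey - dy * ex
            if abs(denom) < 1e-12:
                continue  # ray parallel to segment
            t = ((x1 - ox) * ey - (y1 - oy) * ex) / denom
            u = ((x1 - ox) * dy - (y1 - oy) * dx) / denom
            if 0.0 <= u <= 1.0 and 0.0 < t < best:
                best = t
        if best < max_range:
            hits.append((ox + best * dx, oy + best * dy))
    return hits
```

Running this at each node pose (x, y, θ) against the model yields the simulated reflection points, which can then be thinned at a fixed spacing.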
Next, the landmark data z to be used as landmarks is extracted from the simulated measurement data. For example, as shown in FIG. 6, when measurement data in the region d higher than a person h is used as landmarks, the landmark data z is extracted by selecting points above a predetermined height using the height information of each measurement point. Alternatively, the data to be used as landmarks may be extracted from the results of clustering the measurement data or of feature extraction.
The above processing is performed for each node 12, extracting landmark data z per node 12. The extracted landmark data z, together with the environment shape model m supplied by the travel environment shape model input unit 2, constitute the data used by the self-position error prediction unit 4 in its calculations.
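The height-based extraction amounts to a simple filter over the simulated points; the 2.0 m threshold below is an illustrative stand-in for "higher than a person".

```python
def extract_landmark_data(points, h_min=2.0):
    """Keep only measurement points (x, y, z) whose height z is at or above
    a predetermined threshold (here 2.0 m, an assumed value)."""
    return [p for p in points if p[2] >= h_min]

scan = [(1.0, 0.5, 0.8), (1.2, 0.4, 2.5), (3.0, 1.0, 4.1)]
landmarks = extract_landmark_data(scan)
print(landmarks)  # [(1.2, 0.4, 2.5), (3.0, 1.0, 4.1)]
```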
Next, the specific processing of the self-position error prediction unit 4 will be described with reference to FIG. 7.
FIG. 7 is a flowchart showing the processing procedure of the self-position error prediction unit according to an embodiment.
In FIG. 7, the self-position error prediction unit 4 uses the landmark data z and the environment shape model m for each node 12 to obtain, by geometric calculation, the prediction error of self-position estimation as a covariance matrix Σ. The basic idea is to shift the landmark data z by small amounts, evaluate the degree of overlap with the environment shape model m at each shift, thereby determine the nature of the matching error (misalignment), and compute the covariance matrix Σ from it.
If the degree of overlap with the environment shape model m changes greatly when the landmark data z is shifted, the shape of that landmark data z is highly unique, and consequently the matching error tends to be small. Conversely, if the degree of overlap changes little when the landmark data z is shifted, the uniqueness is low, and the matching error tends to be large. Moreover, since it is known how the error changes for each direction of shift, the nature of the error can be obtained not as a single value but as a distribution expressed by the covariance matrix Σ.
First, the displacement (xT, yT, θT) by which to shift the landmark data z is determined (S11).
The maximum and minimum displacements are set in advance, and the range between them is divided equally at fixed intervals. The maximum and minimum may be chosen based on the magnitude of the error of the self-position estimation unit 7. For example, xT and yT may each be divided into 10 steps of 0.5 m over the range −2.5 m to +2.5 m, and θT into 12 steps of 10 deg over the range −60 deg to +60 deg. In this case there are 10 × 10 × 12 = 1200 combinations of displacements.
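Enumerating these displacement combinations is straightforward; a sketch, taking the left edge of each interval so that the example counts (10, 10, and 12 values) come out as stated:

```python
import itertools

def displacement_grid(xy_max=2.5, xy_step=0.5, th_max=60.0, th_step=10.0):
    """Enumerate candidate displacements (xT, yT, thetaT) over a uniform grid."""
    n_xy = int(round(2 * xy_max / xy_step))   # 10 values per translation axis
    n_th = int(round(2 * th_max / th_step))   # 12 heading values
    xs = [-xy_max + i * xy_step for i in range(n_xy)]
    ths = [-th_max + i * th_step for i in range(n_th)]
    return list(itertools.product(xs, xs, ths))

moves = displacement_grid()
print(len(moves))  # 1200
```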
Next, the landmark data z is coordinate-transformed by the displacement (xT, yT, θT) (S12). Letting T be the homogeneous transformation matrix representing this displacement, the transformed landmark data is written Tz.
Using the transformed landmark data Tz and the environment shape model m, the degree of overlap is computed as the matching evaluation value DT (S13).
For this calculation, equations (1) and (2), for example, are used.
First, the correspondence c between the points of the landmark data Tz and the polygon vertices of the environment shape model m is found by the nearest-neighbor search of equation (1), where N is the number of points in the landmark data Tz. Using the resulting correspondence c, the matching evaluation value DT is then obtained as the mean squared distance between corresponding points, equation (2), where n is the unit normal vector of the polygon face of the environment shape model m. By taking the inner product with the unit normal vector n, the distance between corresponding points is computed as a point-to-plane distance. Alternatively, the distance between corresponding points may be computed as an ordinary point-to-point distance, without the inner product with n.
Next, the likelihood LT is computed from the matching evaluation value DT (S14). Here the likelihood LT approximates, by a normal distribution, how plausible the overlap with the environment shape model m is when the landmark data z is transformed by the displacement (xT, yT, θT). Equation (3), for example, is used to compute LT, where σ is a constant expressing the error characteristics of the surrounding-object shape measurement unit 6; its specific value may be chosen based on the magnitude of that unit's error.
The computation of the likelihood LT for the displacement (xT, yT, θT) is carried out for all combinations of displacements (S15); in the example above, there are 1200 combinations.
Once the likelihoods LT have been computed for all displacement combinations for a node 12, the covariance matrix Σ is computed, for example using equation (4) (S16). Equation (4) is a weighted least-squares computation of the covariance matrix Σ, with the likelihood LT of each displacement combination (xT, yT, θT) as the weight. Here ΣT is the mathematical symbol denoting summation over the displacements.
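Equations (1) to (4) themselves did not survive reproduction here; the following is a hypothetical reconstruction from the verbal descriptions above (nearest-neighbor correspondence, mean squared point-to-plane distance, a Gaussian-form likelihood, and a likelihood-weighted second moment), so the exact forms and symbols are assumptions.

```latex
% Hypothetical reconstruction of equations (1)-(4)
\begin{align}
c(i)   &= \operatorname*{arg\,min}_{j}\,\bigl\lVert (Tz)_i - m_j \bigr\rVert
          && \text{(1) nearest-neighbour correspondence}\\
D_T    &= \frac{1}{N}\sum_{i=1}^{N}
          \Bigl( n_{c(i)}^{\top}\bigl((Tz)_i - m_{c(i)}\bigr) \Bigr)^{2}
          && \text{(2) mean squared point-to-plane distance}\\
L_T    &= \exp\!\left(-\frac{D_T}{2\sigma^{2}}\right)
          && \text{(3) likelihood of the overlap}\\
\Sigma &= \frac{\sum_{T} L_T\,(x_T, y_T, \theta_T)^{\top}(x_T, y_T, \theta_T)}
               {\sum_{T} L_T}
          && \text{(4) weighted least-squares covariance}
\end{align}
```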
When the covariance matrix Σ has been computed for all nodes 12, the processing of the self-position error prediction unit 4 ends (S17, S18). The prediction error of self-position estimation at each node 12 in the travelable area is thereby obtained as a covariance matrix Σ, and the prediction errors obtained for the nodes 12 are returned to the prediction error map generation and storage unit 3.
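Steps S13 to S16 for one node can be sketched as below, assuming a Gaussian-form likelihood and a zero-mean weighted second moment for Σ; the evaluation function standing in for equations (1) and (2) is supplied by the caller.

```python
import math

def predict_covariance(evaluate, displacements, sigma=0.1):
    """Given evaluate(x, y, th) -> matching evaluation value D_T, compute the
    likelihood-weighted 3x3 covariance Sigma over the displacement grid."""
    weights, vecs = [], []
    for x, y, th in displacements:
        d = evaluate(x, y, th)                    # S13: matching evaluation
        l = math.exp(-d / (2.0 * sigma ** 2))     # S14: assumed Gaussian form
        weights.append(l)
        vecs.append((x, y, th))
    total = sum(weights)
    cov = [[0.0] * 3 for _ in range(3)]
    for w, v in zip(weights, vecs):               # S16: weighted least squares
        for i in range(3):
            for j in range(3):
                cov[i][j] += w * v[i] * v[j]
    return [[cov[i][j] / total for j in range(3)] for i in range(3)]
```

With an evaluation function that penalises displacement only along x, the resulting Σ has a small x variance and large y and θ variances, matching the behaviour described for FIGS. 9 and 11.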
The prediction error map generation and storage unit 3 can thereby generate and store a prediction error map with a data structure consisting of the nodes 12 in the travelable area and the prediction error of self-position estimation associated with each node 12.
Next, examples of the specific processing of the self-position error prediction unit 4 will be described with reference to FIGS. 8 to 11.
FIG. 8 is a plan view of the measurement range from which landmark data was extracted at a certain node, and of the environment shape model, according to an embodiment.
The processing of steps S11 to S14 in the flowchart of the self-position error prediction unit 4 shown in FIG. 7 is described below for this situation, as an example.
First, suppose for example that the displacement (xT, yT, θT) is determined to be (+1.0 m, +2.0 m, 0 deg) (S11).
Next, the landmark data z1 extracted from the measurement range 14 is coordinate-transformed by this displacement (S12).
FIG. 9 is a plan view of the coordinate-transformed landmark data and the environment shape model, according to this embodiment.
In FIG. 9, the transformed landmark data Tz1 and the environment shape model m do not overlap (the misalignment is large). Compared with before the transformation, the degree of overlap between the landmark data z1 and the environment shape model m has changed greatly. When shifting the landmark data z1 changes the degree of overlap with the environment shape model m greatly, the shape of the landmark data z1 is highly unique, and consequently the matching error in the direction of this displacement tends to be small.
Next, to quantify the degree of overlap between the transformed landmark data Tz1 and the environment shape model m, the matching evaluation value DT is computed (S13). In FIG. 9 the degree of overlap is low, so DT, the mean squared distance between corresponding points, takes a large value.
Furthermore, the likelihood LT is computed from the matching evaluation value DT (S14). Since LT approximates the plausibility of the overlap by a normal distribution, a large DT (low overlap) yields a small LT.
FIG. 10 shows a plan view, seen from above, of the measurement range 14b from which the landmark data z was extracted at a certain node 12c, and of the environment shape model m, in a situation different from that of FIG. 8.
The processing of steps S11 to S14 in the flowchart of the self-position error prediction unit 4 shown in FIG. 7 is described below for this situation, as an example.
First, suppose that the displacement (xT, yT, θT) is determined to be (+1.0 m, +2.0 m, 0 deg), the same as in the example of FIG. 9 (S11).
Next, the landmark data z2 extracted from the measurement range 14 is coordinate-transformed by this displacement (S12).
FIG. 11 is a plan view of the coordinate-transformed landmark data and the environment shape model, according to an embodiment.
In FIG. 11, the transformed landmark data Tz2 and the environment shape model m overlap relatively well (the misalignment is small). That is, a misalignment in the x direction is visible, but there appears to be none in the y direction. Compared with before the transformation, the degree of overlap between the landmark data z2 and the environment shape model m has changed little. When shifting the landmark data z2 changes the degree of overlap with the environment shape model m only slightly, the shape of the landmark data z2 has low uniqueness, and consequently the matching error in the direction of this displacement tends to be large.
Next, to quantify the degree of overlap between the transformed landmark data Tz2 and the environment shape model m, the matching evaluation value DT is computed (S13). In FIG. 11 the degree of overlap is high, so DT, the mean squared distance between corresponding points, takes a small value.
Furthermore, the likelihood LT is computed from the matching evaluation value DT (S14). Since LT approximates the plausibility of the overlap by a normal distribution, a small DT (high overlap) yields a large LT.
Next, the specific processing of the route planning unit 9 will be described.
The route planning unit 9 plans a route using the prediction error map stored in the prediction error map generation and storage unit 3. As described above, the prediction error map has a data structure consisting of the nodes 12 obtained by dividing the travelable area by the grid 11 as shown in FIG. 4, and the prediction error of self-position estimation associated with each node 12. Each node 12 is a node in the three-dimensional space (x, y, θ), so multiple nodes 12 with different headings θ exist at the same location in the two-dimensional (x, y) plane.
At least two embodiments of the route planning unit 9 are conceivable. In the first, route planning is performed with a shortest-path search algorithm from graph theory on the graph structure constructed by connecting the nodes with the links 13 as described above, using the prediction error of self-position estimation held by each node 12 as the cost. In the second, the prediction error of each node 12 is used as the cost of its grid cell 11, and route planning is performed with the global potential method described in the paper "経路探索問題―ロボットの動作計画―" (Path Search Problems: Robot Motion Planning; Hirohisa Hirukawa, Journal of the Information Processing Society of Japan, 1994). When the global potential method is used, the links 13 need not be generated.
In either embodiment, since every node 12 lies in the travelable area, the travelable area is represented discretely in the three-dimensional (x, y, θ) configuration space and a target route can be found within it. Route planning is performed with the current self-position estimated by the self-position estimation unit 7 as the start position, the destination input via the destination input unit 8 as the goal position, and the prediction error of self-position estimation held by each node 12 as the cost.
Because the prediction error held by each node 12 is obtained as a distribution expressed by a covariance matrix Σ, when the route planning unit 9 computes costs, it fuses the covariance matrices Σ of the nodes 12 the route passes through by maximum-likelihood estimation, obtaining the covariance at the moment each node 12 is passed. Alternatively, the covariance while traveling the route can be simulated with the motion model described in the book "Probabilistic Robotics" (Sebastian Thrun, Wolfram Burgard, and Dieter Fox, The MIT Press, 2005) and fused with the covariance matrix Σ of each node 12 by maximum-likelihood estimation to obtain the cost.
Using the prediction error expressed by the fused covariance as the route planning cost, and judging whether the covariance at the moment each node 12 is passed exceeds a predetermined threshold, it becomes possible to plan a route to the destination that avoids routes along which the prediction error grows large. Whether the covariance exceeds the threshold may be judged by comparing matrix elements or by comparing matrix eigenvalues.
Rather than each node 12 holding its cost as a scalar value and a route being evaluated by the sum of the costs of the nodes 12 it passes through, each node 12 holds its cost as a distribution of prediction error expressed by the covariance matrix Σ, and a route is evaluated by the covariance at the moment it passes each node 12. This enables route planning that accounts for the fact that the nature (magnitude and direction) of the error arising in self-position estimation differs from position to position. Based on this cost evaluation method, route planning that takes the prediction error of self-position estimation into account (avoiding routes where the prediction error grows large and planning a route to the destination) can be performed using the route planning methods (shortest-path search from graph theory or the global potential method) described in the book "知能ロボット入門―動作計画問題の解法―" (Introduction to Intelligent Robots: Solving Motion Planning Problems; Jun Ota, Daisuke Kurabayashi, and Tamio Arai, Corona Publishing, 2001).
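A minimal sketch of the graph-based variant, with two simplifying assumptions: each node's prediction error is summarized as a single scalar (e.g. the trace of Σ, one of several possible summaries), and the threshold test simply refuses to expand nodes whose error exceeds the limit; the maximum-likelihood fusion and motion-model simulation are omitted.

```python
import heapq

def plan_route(nodes, links, variance, start, goal, var_max=1.0):
    """Dijkstra over the (x, y, theta) node graph, using each node's
    predicted self-localisation error as its traversal cost and skipping
    nodes whose predicted error exceeds a threshold."""
    adj = {n: [] for n in nodes}
    for a, b in links:
        adj[a].append(b)
        adj[b].append(a)
    dist = {start: variance[start]}
    prev = {}
    heap = [(dist[start], start)]
    while heap:
        d, n = heapq.heappop(heap)
        if n == goal:
            path = [n]
            while n in prev:
                n = prev[n]
                path.append(n)
            return path[::-1]
        if d > dist.get(n, float("inf")):
            continue  # stale heap entry
        for m in adj[n]:
            if variance[m] > var_max:
                continue  # avoid regions where the prediction error grows large
            nd = d + variance[m]
            if nd < dist.get(m, float("inf")):
                dist[m] = nd
                prev[m] = n
                heapq.heappush(heap, (nd, m))
    return None  # no admissible route
```

On a small graph where one intermediate node has a large predicted error, the planner detours through the node with the smaller error even when both routes have the same length.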
As described above, according to the present invention, the autonomous mobile system configured as above predicts in advance the self-position estimation error at each position in the travelable area, making it possible to plan a route to the destination that avoids routes along which the prediction error grows large.
The present invention thus achieves the excellent effect of realizing an autonomous mobile system that can move appropriately to its destination without losing track of its own position.
1: autonomous mobile system; 2: travel environment shape model input unit; 3: prediction error map generation and storage unit; 4: self-position error prediction unit; 5: landmark map storage unit; 6: surrounding-object shape measurement unit; 7: self-position estimation unit; 8: destination input unit; 9: route planning unit; 10: travel unit; 11: grid; 12: node; 13: link; 14: measurement range.
Claims (5)
1. An autonomous mobile system in which an autonomous mobile body travels to a destination while estimating its own position by referring to a map based on measurement data from a surrounding-object shape measurement unit, the system comprising:
a travel environment shape model input unit that receives a shape model of the travel environment;
a self-position error prediction unit that calculates a prediction error of self-position estimation based on information from the travel environment shape model input unit;
a prediction error map generation and storage unit that generates and stores a prediction error map in which the prediction error is associated with positions in the area where the autonomous mobile body can travel; and
a route planning unit that performs route planning based on the prediction error corresponding to a route traveled by the autonomous mobile body, with reference to the prediction error map.

2. The autonomous mobile system according to claim 1, wherein the travel environment shape model input unit, the self-position error prediction unit, and the prediction error map generation and storage unit are installed in a management system, the route planning unit is installed in a terminal system, and the management system and the terminal system are connected by a wireless network.

3. The autonomous mobile system according to claim 1, wherein the prediction error map generated and stored by the prediction error map generation and storage unit divides the travelable area into a plurality of small areas and holds the prediction error corresponding to each small area.

4. The autonomous mobile system according to claim 3, wherein each small area holds the prediction error corresponding to each of a plurality of directions.

5. The autonomous mobile system according to claim 1 or 2, wherein the prediction error map generation and storage unit extracts shapes of a predetermined height from the shape model supplied to the travel environment shape model input unit, and the self-position error prediction unit calculates the prediction error using the shapes extracted by the prediction error map generation and storage unit.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2010/073131 WO2012086029A1 (en) | 2010-12-22 | 2010-12-22 | Autonomous movement system |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2012086029A1 true WO2012086029A1 (en) | 2012-06-28 |
Family
ID=46313333
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2010/073131 Ceased WO2012086029A1 (en) | 2010-12-22 | 2010-12-22 | Autonomous movement system |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2012086029A1 (en) |
Cited By (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2014139821A1 (en) * | 2013-03-15 | 2014-09-18 | Volkswagen Aktiengesellschaft | Automatic driving route planning application |
| WO2015025599A1 (en) * | 2013-08-21 | 2015-02-26 | Sharp Corporation | Autonomous mobile body |
| JP2016162013A (en) * | 2015-02-27 | 2016-09-05 | Hitachi, Ltd. | Self position estimation device and mobile entity |
| CN106444756A (en) * | 2016-09-22 | 2017-02-22 | Ninebot (Beijing) Tech Co., Ltd. | Information processing method and electronic equipment |
| WO2017060947A1 (en) * | 2015-10-05 | 2017-04-13 | Pioneer Corporation | Estimation apparatus, control method, program, and storage medium |
| JP2017101944A (en) * | 2015-11-30 | 2017-06-08 | Pioneer Corporation | Speed calculation device, control method, program, and storage medium |
| WO2017199333A1 (en) * | 2016-05-17 | 2017-11-23 | Pioneer Corporation | Information output device, terminal device, control method, program, and storage medium |
| JP2018504650A (en) * | 2014-12-26 | 2018-02-15 | HERE Global B.V. | Geometric fingerprinting for device location |
| EP3106836B1 (en) * | 2015-06-16 | 2018-06-06 | Volvo Car Corporation | A unit and method for adjusting a road boundary |
| JP2019192206A (en) * | 2018-04-27 | 2019-10-31 | Ubtech Robotics Corp Ltd | Charging stand recognition method, apparatus therefor, and robot |
| JP2020098196A (en) * | 2019-10-23 | 2020-06-25 | Pioneer Corporation | Estimation device, control method, program, and storage medium |
| JP2021044008A (en) * | 2020-12-01 | 2021-03-18 | Pioneer Corporation | Information output device, terminal, control method, program and storage medium |
| JP2022025118A (en) * | 2019-10-23 | 2022-02-09 | Pioneer Corporation | Estimation device, control method, program, and storage medium |
| WO2022070324A1 (en) * | 2020-09-30 | 2022-04-07 | NEC Corporation | Moving body control device, moving body control method, moving body control system, and computer-readable non-transitory storage medium with moving body control program recorded therein |
| JP2023105152A (en) * | 2021-11-01 | 2023-07-28 | Pioneer Corporation | Estimation device, control method, program, and storage medium |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH02259913A (en) * | 1989-03-31 | 1990-10-22 | Glory Ltd | Self normal positioning method for mobile object |
| JP2003015739A (en) * | 2001-07-02 | 2003-01-17 | Yaskawa Electric Corp | Outside environment map, self-position identification device and guidance control device |
| WO2007069726A1 (en) * | 2005-12-16 | 2007-06-21 | Ihi Corporation | Self-position identifying method and device, and three-dimensional shape measurement method and device |
| JP2007322138A (en) * | 2006-05-30 | 2007-12-13 | Toyota Motor Corp | Mobile device and self-position estimation method of mobile device |
| JP2010152787A (en) * | 2008-12-26 | 2010-07-08 | Fujitsu Ltd | Environment map creation program, environment map creation method, and mobile robot |
| JP2010238217A (en) * | 2009-03-09 | 2010-10-21 | Yaskawa Electric Corp | Mobile robot self-position identification method and mobile robot |
Cited By (32)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105229422A (en) * | 2013-03-15 | 2016-01-06 | 大众汽车有限公司 | Autonomous driving route planning application |
| US10451428B2 (en) | 2013-03-15 | 2019-10-22 | Volkswagen Aktiengesellschaft | Automatic driving route planning application |
| WO2014139821A1 (en) * | 2013-03-15 | 2014-09-18 | Volkswagen Aktiengesellschaft | Automatic driving route planning application |
| WO2015025599A1 (en) * | 2013-08-21 | 2015-02-26 | Sharp Corporation | Autonomous mobile body |
| JP2015041203A (en) * | 2013-08-21 | 2015-03-02 | Sharp Corporation | Autonomous moving body |
| CN105247431A (en) * | 2013-08-21 | 2016-01-13 | 夏普株式会社 | Autonomous mobile body |
| CN105247431B (en) * | 2013-08-21 | 2017-09-19 | 夏普株式会社 | Autonomous body |
| US9804598B2 (en) | 2013-08-21 | 2017-10-31 | Sharp Kabushiki Kaisha | Autonomous mobile body |
| JP2020101833A (en) * | 2014-12-26 | 2020-07-02 | HERE Global B.V. | Geometric fingerprinting for localization of device |
| JP2021060602A (en) * | 2014-12-26 | 2021-04-15 | HERE Global B.V. | Geometric fingerprinting for localization of device |
| JP7111794B2 (en) | 2014-12-26 | 2022-08-02 | HERE Global B.V. | Geometric fingerprinting for device localization |
| JP2018504650A (en) * | 2014-12-26 | 2018-02-15 | HERE Global B.V. | Geometric fingerprinting for device location |
| US9881379B2 (en) | 2015-02-27 | 2018-01-30 | Hitachi, Ltd. | Self-localization device and movable body |
| JP2016162013A (en) * | 2015-02-27 | 2016-09-05 | Hitachi, Ltd. | Self position estimation device and mobile entity |
| EP3106836B1 (en) * | 2015-06-16 | 2018-06-06 | Volvo Car Corporation | A unit and method for adjusting a road boundary |
| US11199850B2 (en) | 2015-10-05 | 2021-12-14 | Pioneer Corporation | Estimation device, control method, program and storage medium |
| JPWO2017060947A1 (en) * | 2015-10-05 | 2018-08-02 | Pioneer Corporation | Estimation apparatus, control method, program, and storage medium |
| WO2017060947A1 (en) * | 2015-10-05 | 2017-04-13 | Pioneer Corporation | Estimation apparatus, control method, program, and storage medium |
| JP2017101944A (en) * | 2015-11-30 | 2017-06-08 | Pioneer Corporation | Speed calculation device, control method, program, and storage medium |
| JPWO2017199333A1 (en) * | 2016-05-17 | 2019-03-14 | Pioneer Corporation | Information output device, terminal device, control method, program, and storage medium |
| WO2017199333A1 (en) * | 2016-05-17 | 2017-11-23 | Pioneer Corporation | Information output device, terminal device, control method, program, and storage medium |
| CN106444756A (en) * | 2016-09-22 | 2017-02-22 | Ninebot (Beijing) Tech Co., Ltd. | Information processing method and electronic equipment |
| JP2019192206A (en) * | 2018-04-27 | 2019-10-31 | Ubtech Robotics Corp Ltd | Charging stand recognition method, apparatus therefor, and robot |
| JP2020098196A (en) * | 2019-10-23 | 2020-06-25 | Pioneer Corporation | Estimation device, control method, program, and storage medium |
| JP2022025118A (en) * | 2019-10-23 | 2022-02-09 | Pioneer Corporation | Estimation device, control method, program, and storage medium |
| WO2022070324A1 (en) * | 2020-09-30 | 2022-04-07 | NEC Corporation | Moving body control device, moving body control method, moving body control system, and computer-readable non-transitory storage medium with moving body control program recorded therein |
| JPWO2022070324A1 (en) * | 2020-09-30 | | | |
| JP7601105B2 | 2020-09-30 | 2024-12-17 | NEC Corporation | MOBILE BODY CONTROL DEVICE, MOBILE BODY CONTROL METHOD, MOBILE BODY CONTROL SYSTEM, AND MOBILE BODY CONTROL PROGRAM |
| US12332647B2 (en) | 2020-09-30 | 2025-06-17 | Nec Corporation | Mobile body control apparatus, mobile body control method, mobile body control system, and non-transitory computer-readable storage medium storing mobile body control program |
| JP2021044008A (en) * | 2020-12-01 | 2021-03-18 | Pioneer Corporation | Information output device, terminal, control method, program and storage medium |
| JP2023105152A (en) * | 2021-11-01 | 2023-07-28 | Pioneer Corporation | Estimation device, control method, program, and storage medium |
| JP2024161108A (en) * | 2021-11-01 | 2024-11-15 | Pioneer Corporation | Estimation device, control method, program, and storage medium |
Similar Documents
| Publication | Title |
|---|---|
| WO2012086029A1 (en) | Autonomous movement system |
| JP6813703B2 (en) | Geometric fingerprint method for positioning the device |
| Gwon et al. | Generation of a precise and efficient lane-level road map for intelligent vehicle systems |
| EP3237923B1 (en) | Localization of a device using multilateration |
| CN104914865B (en) | Intelligent Mobile Robot Position Fixing Navigation System and method |
| EP3238494B1 (en) | Selecting feature geometries for localization of a device |
| Kim et al. | Updating point cloud layer of high definition (hd) map based on crowd-sourcing of multiple vehicles installed lidar |
| CN108089572A (en) | For the algorithm and infrastructure of steady and effective vehicle location |
| Huang et al. | Task-specific performance evaluation of UGVs: Case studies at the IVFC |
| CN109059942A (en) | A kind of high-precision underground navigation map building system and construction method |
| Singh et al. | Evaluating the performance of map matching algorithms for navigation systems: an empirical study |
| US20210003415A1 (en) | Submap geographic projections |
| EP3237922A2 (en) | Extracting feature geometries for localization of a device |
| CN113741503B (en) | Autonomous positioning unmanned aerial vehicle and indoor path autonomous planning method thereof |
| JP2016149090A (en) | Autonomous mobile device, autonomous mobile system, autonomous mobile method and program |
| Li et al. | Hybrid filtering framework based robust localization for industrial vehicles |
| CN120178933B (en) | Unmanned aerial vehicle route planning method and system based on deep reinforcement learning |
| JP2023126893A (en) | How to generate universally usable feature maps |
| JP2011059043A (en) | Route searching device and moving system |
| JPWO2012160630A1 (en) | Trajectory correction method, trajectory correction device, and moving body device |
| Zhou et al. | An autonomous navigation approach for unmanned vehicle in off-road environment with self-supervised traversal cost prediction |
| Basiri | Open area path finding to improve wheelchair navigation |
| Zakaria et al. | Autonomous shuttle development at Universiti Malaysia Pahang: LiDAR point cloud data stitching and mapping using iterative closest point cloud algorithm |
| Ugur et al. | Fast and efficient terrain-aware motion planning for exploration rovers |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10861072; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 10861072; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: JP |