WO2019188745A1 - Information processing device, control method, program and storage medium - Google Patents
Information processing device, control method, program and storage medium
- Publication number
- WO2019188745A1 (PCT/JP2019/011974)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- measurement
- unit
- predicted
- distance
- measurement unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0968—Systems involving transmission of navigation instructions to the vehicle
- G08G1/0969—Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
Definitions
- The present invention relates to a technique for detecting a positional deviation of a measurement unit.
- Patent Document 1 discloses a technique for estimating a self-position by collating the output of a measurement sensor with the position information of a feature registered in advance on a map.
- Patent Document 2 discloses a vehicle position estimation technique using a Kalman filter.
- Data obtained from measurement units such as radars and cameras are values in a coordinate system based on the measurement unit, and depend on the attitude of the measurement unit with respect to the vehicle; they therefore need to be converted into a coordinate system based on the vehicle. Consequently, when a deviation occurs in the posture of the measurement unit, an error may occur in the measurement values after conversion into the vehicle-based coordinate system.
- The present invention has been made to solve the above-described problem, and has as its main object to provide an information processing apparatus that can suitably detect a positional deviation, with respect to a moving body, of a measurement unit that measures the distance to an object.
- The invention according to the claims is an information processing apparatus comprising: a measurement distance acquisition unit that acquires, for each of at least three measurement units provided on the moving body, the measurement distance measured by the measurement unit to an object included in its measurement range; a predicted distance acquisition unit that acquires the predicted distance from the moving body to the object, predicted based on position information of the object; and a detection unit that detects, from the at least three measurement units, at least one measurement unit in which a displacement of the attachment position with respect to the moving body has occurred, based on a plurality of difference values between the measurement distance and the predicted distance acquired for each measurement unit.
- The invention described in the claims is also a control method executed by an information processing apparatus, comprising a step of acquiring, for each of at least three measurement units provided on the moving body, the measurement distance measured by the measurement unit to an object included in its measurement range, together with corresponding prediction and detection steps.
- The invention described in the claims is also a program executed by a computer, causing the computer to function as: a measurement distance acquisition unit that acquires, for each of at least three measurement units provided on the moving body, the measurement distance measured by the measurement unit to an object included in its measurement range; a predicted distance acquisition unit that acquires the predicted distance from the moving body to the object, predicted based on position information of the object; and a detection unit that detects, from the at least three measurement units, at least one measurement unit in which a displacement of the attachment position with respect to the moving body has occurred, based on the difference values between the measurement distances and the predicted distances.
- In one embodiment, the information processing apparatus acquires, for each of at least three measurement units provided on the moving body, the measurement distance measured by the measurement unit to an object included in its measurement range, and includes a detection unit that detects, from the at least three measurement units, at least one measurement unit in which a displacement of the attachment position with respect to the moving body has occurred, based on the plurality of difference values.
- Here, the "positional deviation" is not limited to a deviation of the center-of-gravity position of the measurement unit with respect to the moving body, and also includes a deviation of the orientation (posture) that does not involve a deviation of the center-of-gravity position.
- The information processing apparatus can suitably detect a measurement unit in which a positional deviation has occurred by comparing difference values calculated from the measurement distances of three or more measurement units.
- In one mode of the information processing apparatus, the detection unit detects, from the at least three measurement units, a measurement unit in which a positional deviation has occurred, based on the average value of the difference values for each measurement unit over a predetermined time. According to this aspect, the information processing apparatus can accurately detect a measurement unit whose difference value steadily deviates as a measurement unit in which a positional deviation has occurred.
- In another mode, the information processing apparatus includes a predicted position acquisition unit that acquires a predicted position of the moving body for each measurement unit, and a correction unit that corrects the predicted position for each measurement unit using a value obtained by multiplying the difference value by a predetermined gain. The predicted position acquisition unit acquires, as the predicted position corresponding to each measurement unit, the predicted position corrected based on the difference value corresponding to another one of the measurement units. According to this aspect, the information processing apparatus can perform highly accurate position estimation using the measurement distances of the three or more measurement units.
- In another mode, the correction unit corrects the predicted position based on the difference values of the measurement units other than the measurement unit detected by the detection unit. According to this aspect, it is possible to suitably suppress a decrease in position estimation accuracy caused by using the measurement result of the measurement unit in which the positional deviation has occurred.
- In another mode, the predicted distance acquisition unit acquires the predicted distance based on the predicted position corrected using the difference values of the measurement units other than the measurement unit detected by the detection unit, and on the position information of the object measured by the detected measurement unit. The information processing apparatus further includes an estimation unit that estimates the position of the measurement unit detected by the detection unit, based on that predicted distance and the measurement distance measured by the detected measurement unit.
- The "position" estimated by the estimation unit is not limited to the center-of-gravity position of the measurement unit, and may include the orientation (posture) of the measurement unit.
- According to this aspect, the information processing apparatus can suitably estimate the position (including the posture) of the measurement unit in which the positional deviation has occurred, such that the predicted distance to the object, computed from the predicted position of the moving body estimated with the measurement units free of positional deviation and the position of the object on the map, matches or approximates the measurement distance to the object obtained by the deviated measurement unit.
- In another mode, the information processing apparatus includes an estimation unit that estimates the position of the measurement unit detected by the detection unit, based on the predicted position corrected using the difference value of the detected measurement unit and on the predicted position corrected using the difference values of the measurement units other than the detected one. According to this aspect as well, the information processing apparatus can suitably estimate the position (including the posture) of the measurement unit in which the positional deviation has occurred.
- In another mode, the estimation unit estimates the amount of the positional deviation based on the estimated position of the measurement unit and the position of the measurement unit stored in the storage unit. According to this aspect, the information processing apparatus can perform processing such as correcting the measurement results of the measurement unit in which the positional deviation has occurred, and can suitably suppress a decrease in the accuracy of processing that uses the measurement unit.
- Preferably, the estimation unit calculates, as the deviation amount, at least one of the deviation amount in the pitch, yaw, or roll direction of the measurement unit detected by the detection unit, and the deviation amount of its center-of-gravity position in three-dimensional space.
- Another aspect of the invention is a control method executed by the information processing apparatus, comprising: a step of acquiring, for each of at least three measurement units provided on the moving body, the measurement distance measured by the measurement unit to an object included in its measurement range; a step of acquiring the predicted distance from the moving body to the object; and a detection step of detecting, from the at least three measurement units, at least one measurement unit in which a displacement of the attachment position with respect to the moving body has occurred, based on a plurality of difference values between the measured distance and the predicted distance. By executing this control method, the information processing apparatus can suitably detect the measurement unit in which the positional deviation has occurred.
- Another aspect of the invention is a program executed by a computer, causing the computer to function as: a measurement distance acquisition unit that acquires, for each of at least three measurement units provided on the moving body, the measurement distance measured by the measurement unit to an object included in its measurement range; a predicted distance acquisition unit that acquires the predicted distance from the moving body to the object, predicted based on position information of the object; and a detection unit that detects, from the at least three measurement units, at least one measurement unit in which a displacement of the attachment position with respect to the moving body has occurred, based on the difference values between the acquired measurement distances and predicted distances. By executing this program, the computer can suitably detect the measurement unit in which the positional deviation has occurred.
- The program is stored in a storage medium.
- FIG. 1A is a schematic configuration diagram of a driving support system according to the present embodiment.
- The driving support system shown in FIG. 1(A) is mounted on a vehicle and includes an in-vehicle device 1 that performs control related to driving support of the vehicle, lidars (Light Detection and Ranging, or Laser Illuminated Detection and Ranging) 2 (2A to 2C), a gyro sensor 3, a vehicle speed sensor 4, and a GPS receiver 5.
- FIG. 1B is an overhead view of the vehicle showing an example of the arrangement of the lidar 2.
- The in-vehicle device 1 is electrically connected to the lidars 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5, and estimates, based on their outputs, the position of the vehicle on which it is mounted (also called the "own vehicle position"). The in-vehicle device 1 then performs driving support including automatic driving based on this estimate.
- The in-vehicle device 1 stores a map database (DB) 10 that contains road data and feature information, which is information on landmarks provided in the vicinity of roads.
- The above-mentioned landmarks are, for example, features periodically lined up along the roadside, such as kilometer posts, 100-meter posts, delineators, traffic infrastructure facilities (e.g., signs, direction signs, and signals), utility poles, and street lights.
- The feature information is information in which at least an index assigned to each feature, the position information of the feature, and information on the direction of the feature are associated with one another.
- The in-vehicle device 1 estimates the own vehicle position by collating the outputs of the lidars 2 and the like with this feature information.
- In the present embodiment, the in-vehicle device 1 detects a lidar 2 in which a positional deviation (including an attitude deviation) has occurred (also referred to as a "deviation-generating lidar") based on the vehicle position estimation results obtained with each lidar 2, and estimates the position and attitude angle of the deviation-generating lidar. The in-vehicle device 1 then performs processing such as correcting the point cloud data output by the deviation-generating lidar based on this estimation result.
- The in-vehicle device 1 is an example of the "information processing device" in the present invention.
- The lidars 2 (2A to 2C) emit pulsed laser light over a predetermined angle range in the horizontal and vertical directions, thereby discretely measuring the distance to objects existing in the outside world and generating three-dimensional point cloud information indicating the positions of those objects.
- Each lidar 2 includes an irradiation unit that emits laser light while changing the irradiation direction, a light receiving unit that receives the reflected light (scattered light) of the irradiated laser light, and an output unit that outputs scan data based on the light reception signal produced by the light receiving unit.
- The scan data is generated based on the irradiation direction corresponding to the laser light received by the light receiving unit and the distance to the object in that irradiation direction, specified based on the light reception signal, and is supplied to the in-vehicle device 1.
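- As a rough illustration of this step (a minimal sketch, not from the patent: the axis convention, angle names, and function name are assumptions), converting one return's irradiation direction and measured distance into a point of the three-dimensional point cloud in the lidar coordinate system can be written as:

```python
import math

def scan_to_point(distance_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert one lidar return (measured distance + irradiation direction)
    into a 3D point in the lidar coordinate system
    (x: lidar front, y: lidar left, z: up)."""
    horizontal = distance_m * math.cos(elevation_rad)  # projection onto the x-y plane
    x = horizontal * math.cos(azimuth_rad)
    y = horizontal * math.sin(azimuth_rad)
    z = distance_m * math.sin(elevation_rad)
    return (x, y, z)
```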
- The lidar 2 is an example of the "measurement unit" in the present invention.
- The lidars 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5 each supply their output data to the in-vehicle device 1.
- FIG. 2 is a block diagram showing a functional configuration of the in-vehicle device 1.
- The in-vehicle device 1 mainly includes an interface 11, a storage unit 12, an input unit 14, a control unit 15, and an information output unit 16. These elements are connected to one another via a bus line.
- The interface 11 acquires output data from the sensors such as the lidars 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5, and supplies it to the control unit 15. In addition, the interface 11 supplies signals related to the traveling control of the vehicle, generated by the control unit 15, to the electronic control unit (ECU) of the vehicle.
- The storage unit 12 stores the program executed by the control unit 15 and information necessary for the control unit 15 to execute predetermined processing.
- In the present embodiment, the storage unit 12 stores the map DB 10 and lidar installation information IL.
- The lidar installation information IL is information on the relative three-dimensional position and posture angle of each lidar 2 measured at a certain reference time (for example, a time at which no positional deviation exists, such as immediately after alignment adjustment of the lidars 2).
- In the present embodiment, the attitude angle of each lidar 2 is represented by a roll angle, a pitch angle, and a yaw angle (that is, Euler angles).
- The input unit 14 is, for example, a button, touch panel, remote controller, or voice input device operated by the user, and receives inputs such as the designation of a destination for route search and switching automatic driving on or off.
- The information output unit 16 is, for example, a display or a speaker that produces output under the control of the control unit 15.
- The control unit 15 includes a CPU that executes programs, and controls the entire in-vehicle device 1.
- The control unit 15 estimates the vehicle position based on the output signals of the sensors supplied from the interface 11 and the map DB 10, and performs driving support control of the vehicle, including automatic driving control, based on the estimation result.
- The control unit 15 converts the measurement data output by each lidar 2 from the coordinate system based on the attitude angle and position of that lidar 2 recorded in the lidar installation information IL (also referred to as the "lidar coordinate system") into a coordinate system based on the vehicle (also referred to as the "vehicle coordinate system").
- Further, in the present embodiment, the control unit 15 detects a lidar 2 in which a positional deviation has occurred, and estimates the position and posture angle of that lidar 2.
- The control unit 15 is an example of the "measurement distance acquisition unit", "predicted distance acquisition unit", "detection unit", "predicted position acquisition unit", "correction unit", "estimation unit", and the "computer" that executes the program in the present invention.
- The control unit 15 corrects the vehicle position estimated from the output data of the gyro sensor 3, the vehicle speed sensor 4, and/or the GPS receiver 5, based on the distance and angle to a landmark measured by the lidars 2 and the position information of that landmark extracted from the map DB 10.
- In the present embodiment, based on a state estimation method founded on Bayesian estimation, the control unit 15 alternately executes a prediction step of predicting the vehicle position from the output data of the gyro sensor 3 and the vehicle speed sensor 4, and a measurement update step of correcting the predicted value of the vehicle position calculated in the immediately preceding prediction step.
- Various filters developed to perform Bayesian estimation can be used as the state estimation filter in these steps; examples include the extended Kalman filter, the unscented Kalman filter, and the particle filter. Various methods have thus been proposed for position estimation based on Bayesian estimation.
- FIG. 3 is a diagram showing the state variable vector x in two-dimensional orthogonal coordinates.
- In FIG. 3, the vehicle position on the plane defined by the two-dimensional orthogonal x-y coordinates is represented by the coordinates "(x, y)" and the direction (yaw angle) "ψ" of the vehicle.
- The yaw angle ψ is defined as the angle formed by the traveling direction of the vehicle and the x-axis.
- In the description here, the vehicle position is estimated using a state variable of four variables (x, y, z, ψ), taking into account the z-axis coordinate perpendicular to the x- and y-axes in addition to the coordinates (x, y) and the yaw angle ψ. This is because a general road has only a gentle slope, so the pitch angle and roll angle of the vehicle can be ignored. An example of vehicle position estimation using a state variable of six variables, including the pitch angle and roll angle of the vehicle, will also be described later.
- FIG. 4 is a diagram illustrating a schematic relationship between the prediction step and the measurement update step.
- FIG. 5 shows an example of the functional blocks of the control unit 15. As shown in FIG. 4, by repeating the prediction step and the measurement update step, the calculation and update of the estimated value of the state variable vector "X" indicating the vehicle position are executed sequentially. As shown in FIG. 5, the control unit 15 has a position prediction unit 21 that performs the prediction step and a position estimation unit 22 that performs the measurement update step.
- The position prediction unit 21 includes a dead reckoning block 23 and a position prediction block 24, and the position estimation unit 22 includes a landmark search/extraction block 25 and a position correction block 26.
- In the following, the provisional estimated value (predicted value) computed in the prediction step is denoted by appending a superscript "−" to the character representing the value, and the more accurate estimated value updated in the measurement update step is denoted by appending a superscript "^".
- The dead reckoning block 23 of the control unit 15 uses the moving speed and angular velocity of the vehicle, obtained from the outputs of the gyro sensor 3, the vehicle speed sensor 4, and the like, to obtain the movement distance and azimuth change from the previous time.
- The position prediction block 24 of the control unit 15 adds the obtained movement distance and azimuth change to the state variable vector X^(k−1) at time k−1, calculated in the immediately preceding measurement update step, thereby calculating the predicted value (also referred to as the "predicted position") X⁻(k) at time k.
- The landmark search/extraction block 25 of the control unit 15 associates the landmark position vectors registered in the map DB 10 with the scan data of the lidars 2. When this association is possible, it acquires both the measurement value "Z(k)" of the associated landmark obtained by the lidar 2, and the landmark measurement value (referred to as the "measurement prediction value") "Z⁻(k)" obtained by modeling the measurement process of the lidar 2 using the predicted position X⁻(k) and the landmark position vector registered in the map DB 10.
- The measurement value Z(k) is a vector value in the vehicle coordinate system, converted from the landmark distance and scan angle measured by the lidar 2 at time k into components along the vehicle traveling direction and the lateral direction. The position correction block 26 of the control unit 15 then calculates the difference value between the measurement value Z(k) and the measurement prediction value Z⁻(k).
- As shown in the following equation (1), the position correction block 26 of the control unit 15 multiplies the difference value between the measurement value Z(k) and the measurement prediction value Z⁻(k) by the Kalman gain "K(k)" and adds the result to the predicted position X⁻(k), thereby calculating the updated state variable vector (also referred to as the "estimated position") X^(k):

  X^(k) = X⁻(k) + K(k){Z(k) − Z⁻(k)}  … (1)

- Further, as in the prediction step, the position correction block 26 of the control unit 15 obtains the covariance matrix P^(k) (also simply written P(k)) corresponding to the error distribution of the estimated position X^(k) from the covariance matrix P⁻(k). Parameters such as the Kalman gain K(k) can be calculated in the same manner as in known self-position estimation techniques using, for example, the extended Kalman filter.
- In this way, the prediction step and the measurement update step are performed repeatedly, and the predicted position X⁻(k) and the estimated position X^(k) are calculated sequentially, so that the most likely vehicle position is obtained.
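- As a sketch of this predict/update cycle (illustrative only: the motion model f, the measurement model h, their Jacobians, and all names below are assumptions, since the patent does not fix a specific implementation), one Kalman cycle can be written as:

```python
import numpy as np

def predict(x_est, P_est, u, f, F, Q):
    """Prediction step: compute X-(k) and P-(k) from X^(k-1), P^(k-1)
    and the odometry input u (movement distance, azimuth change)."""
    x_pred = f(x_est, u)
    F_k = F(x_est, u)                         # Jacobian of the motion model
    return x_pred, F_k @ P_est @ F_k.T + Q

def update(x_pred, P_pred, z, h, H, R):
    """Measurement update step, equation (1):
    X^(k) = X-(k) + K(k){Z(k) - Z-(k)}."""
    z_pred = h(x_pred)                        # measurement prediction Z-(k)
    H_k = H(x_pred)                           # Jacobian of the measurement model
    S = H_k @ P_pred @ H_k.T + R
    K = P_pred @ H_k.T @ np.linalg.inv(S)     # Kalman gain K(k)
    x_new = x_pred + K @ (z - z_pred)
    P_new = (np.eye(len(x_pred)) - K @ H_k) @ P_pred
    return x_new, P_new
```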
- Next, vehicle position estimation using a plurality of lidars 2 will be described.
- In the present embodiment, the vehicle position is estimated with high accuracy by applying the Kalman filter update equation corresponding to equation (1) to the measurement values of the lidars 2A to 2C in turn.
- The landmarks detected by the lidars 2A to 2C may be the same or different.
- Let the measurement values for the landmark at time "k" in the point cloud data of the lidars 2A to 2C be "Z_1(k)", "Z_2(k)", and "Z_3(k)", respectively, and let the corresponding measurement prediction values be "Z⁻_1(k)", "Z⁻_2(k)", and "Z⁻_3(k)".
- The Kalman filter update formulas for the lidars 2A to 2C are then expressed by the following equations (2-1) to (2-3):

  X^(k) = X⁻(k) + K_1(k){Z_1(k) − Z⁻_1(k)}  … (2-1)
  X^(k) = X⁻(k) + K_2(k){Z_2(k) − Z⁻_2(k)}  … (2-2)
  X^(k) = X⁻(k) + K_3(k){Z_3(k) − Z⁻_3(k)}  … (2-3)
- The control unit 15 substitutes the estimated position X^(k) obtained from equation (2-1), based on the measurement value Z_1(k) of the lidar 2A, as the predicted position X⁻(k) for the lidar 2B, and calculates an estimated position X^(k) from equation (2-2) based on the measurement value Z_2(k) of the lidar 2B. The control unit 15 further substitutes this estimated position X^(k) as the predicted position X⁻(k) for the lidar 2C, and calculates the estimated position X^(k) from equation (2-3) based on the measurement value Z_3(k) of the lidar 2C. In this way, the estimated position X^(k) can be calculated with high accuracy using the measurement values of the lidars 2A to 2C.
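- A minimal sketch of this sequential update, reusing the illustrative update helper sketched above (the chaining itself follows equations (2-1) to (2-3); the data layout is an assumption):

```python
def sequential_update(x_pred, P_pred, lidar_measurements):
    """Apply equations (2-1) to (2-3) in turn: the estimated position obtained
    with one lidar's measurement is substituted as the predicted position
    for the next lidar.
    lidar_measurements: iterable of (z_i, h_i, H_i, R_i), one per lidar."""
    x, P = x_pred, P_pred
    for z, h, H, R in lidar_measurements:
        x, P = update(x, P, z, h, H, R)   # X^(k) feeds the next update as X-(k)
    return x, P
```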
- Here, "x_mi(j)", "y_mi(j)", and "z_mi(j)" denote the position (also called the "landmark coordinates") of the landmark of index number j extracted from the map DB 10.
- The landmark coordinates are converted into the vehicle coordinate system using the predicted positions x⁻(k), y⁻(k), z⁻(k) and a rotation matrix based on the vehicle yaw angle ψ.
- In the following, the directions along the respective coordinate axes of the vehicle coordinate system are also simply referred to as the "x direction", "y direction", and "z direction".
- The difference value ΔZ_i(k) between the measurement value Z_i(k) and the measurement prediction value Z⁻_i(k) is used for detecting a deviation-generating lidar, as described later.
- Specifically, the control unit 15 calculates, for each lidar 2, the average over a predetermined time of the difference value between the measurement prediction value Z⁻_i(k) and the measurement value Z_i(k), and detects a lidar 2 whose average deviates from 0 as a deviation-generating lidar.
- The predetermined time is set to a length such that the average of the difference values ΔZ calculated from the measurement values of a lidar 2 with no positional deviation becomes substantially zero.
- FIG. 6A is a graph showing the temporal change of the coordinate values Δx_L1, Δy_L1, Δz_L1 of the difference value ΔZ_1(k) of the lidar 2A over the predetermined time.
- FIG. 6B is a graph showing the temporal change of the coordinate values Δx_L2, Δy_L2, Δz_L2 of the difference value ΔZ_2(k) of the lidar 2B over the predetermined time.
- FIG. 6C is a graph showing the temporal change of the coordinate values Δx_L3, Δy_L3, Δz_L3 of the difference value ΔZ_3(k) of the lidar 2C over the predetermined time.
- As shown in FIG. 6A, the magnitude of the time average of the coordinate values Δx_L1, Δy_L1, Δz_L1 of the difference value ΔZ_1(k) over the predetermined time is close to zero.
- Likewise, the magnitude of the time average of the coordinate values Δx_L2, Δy_L2, Δz_L2 of the difference value ΔZ_2(k) shown in FIG. 6B is close to zero. On the other hand, a displacement has occurred in the lidar 2C, and its measurement values in the z coordinate are affected by the displacement; therefore, in this case, as shown in FIG. 6C, the time average of Δz_L3 clearly deviates from zero.
- In general, when a positional deviation has occurred in a lidar 2, at least one of the coordinate values x_L, y_L, z_L of its measurement value Z becomes abnormal due to the deviation, so at least one of the coordinate values Δx_L, Δy_L, Δz_L of the difference value ΔZ deviates from 0.
- Therefore, the control unit 15 calculates, for each lidar 2, the time average of the coordinate values Δx_L, Δy_L, Δz_L of its difference values ΔZ over the predetermined time, and when the magnitude of the time average exceeds a predetermined value, regards the corresponding lidar 2 as a deviation-generating lidar.
- When the magnitudes of the time averages of the coordinate values of the difference values ΔZ are larger than the predetermined value for all the lidars 2, the control unit 15 determines that the deviation-generating lidar cannot be identified, and may cause the information output unit 16 to output a warning or the like indicating that a positional deviation has occurred in at least one of the lidars 2.
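- A minimal sketch of this detection rule, assuming a fixed-length buffer of difference values per lidar (the threshold value, names, and the None convention for the unidentifiable case are assumptions):

```python
import numpy as np

def detect_deviated_lidars(dz_history, threshold=0.1):
    """dz_history: dict mapping a lidar id to an array of shape (T, 3) holding
    the difference values (dx_L, dy_L, dz_L) over the predetermined time.
    Returns the ids whose time-averaged difference deviates from zero, or
    None when every lidar exceeds the threshold and no culprit can be named."""
    deviated = []
    for lidar_id, dz in dz_history.items():
        mean_abs = np.abs(dz.mean(axis=0))    # time average per coordinate
        if np.any(mean_abs >= threshold):     # any coordinate far from 0
            deviated.append(lidar_id)
    if len(deviated) == len(dz_history):      # all deviate: output a warning instead
        return None
    return deviated
```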
- In the first estimation method, the control unit 15 calculates the measurement prediction value of a landmark, computed from the vehicle position estimated using the lidars 2 other than the deviation-generating lidar and the landmark coordinates of the landmark measured by the deviation-generating lidar, as the expected value (also referred to as the "expected measurement value") of the measurement value of the deviation-generating lidar.
- The control unit 15 then performs the least squares method using the Newton method on sets of the expected measurement value and the actual measurement value of the deviation-generating lidar, and estimates the attitude angle and position of the deviation-generating lidar.
- For example, the expected measurement value [x_b(k), y_b(k), z_b(k)]^T, that is, the measurement prediction value Z⁻_3(k) for the lidar 2C (the deviation-generating lidar) based on the vehicle position estimated using the lidars 2A and 2B, in which no positional deviation has occurred, is expressed by the following equation (5).
- Here, each estimated value x^, y^, z^, ψ^ indicates the position estimated based on the measurement values of the lidars 2A and 2B, in which no positional deviation has occurred, as shown in the following equation (7).
- That is, the expected measurement value [x_b(k), y_b(k), z_b(k)]^T is the value obtained by converting the landmark coordinates [x_m3(j), y_m3(j), z_m3(j)]^T of the landmark of index number j measured by the lidar 2C, the deviation-generating lidar, into the vehicle coordinate system using the vehicle position estimation result (x^, y^, z^, ψ^) obtained with the lidars 2A and 2B, in which no positional deviation has occurred.
- The control unit 15 applies the least squares method using the Newton method to the expected measurement values shown in equation (5), and estimates the roll angle "Lα", pitch angle "Lβ", yaw angle "Lγ", position "Lx" in the x direction, position "Ly" in the y direction, and position "Lz" in the z direction of the deviation-generating lidar.
- The relationship between the vehicle coordinate system and the lidar coordinate system is determined by these posture angles Lα, Lβ, Lγ and positions Lx, Ly, Lz.
- The relationship between the vehicle coordinate system and the lidar coordinate system will be described in detail in the section "(3) Coordinate system conversion".
- The measurement value of the landmark at time k by the deviation-generating lidar is expressed as [x_L(k), y_L(k), z_L(k)]^T. This measurement value is represented, using the posture angles Lα, Lβ, Lγ and the positions Lx, Ly, Lz, by the following equation (8).
- The solution for these six variables is obtained by substituting at least two sets of measurement values and expected measurement values into equation (9).
- Since the accuracy of the least squares solution improves as the number of measurement values and expected measurement values increases, it is preferable to use all the obtained measurement values and expected measurement values.
- Assuming that the initial values of the posture angles and positions stored in the lidar installation information IL are "Lα0", "Lβ0", "Lγ0", "Lx0", "Ly0", and "Lz0", the following equation (14), similar to equation (9), is established.
- In equation (15), let "z" be the 9×1 matrix (that is, the left side) of the measurement values for three measurements, "C" the 9×6 Jacobian matrix, "δx" the 6×1 matrix indicating the difference between the initial values of the posture angles and positions and their current values, and "d" the 9×1 matrix of the initial measurement values x_L0, y_L0, z_L0 obtained from equation (14). Equation (15) can then be written compactly as

  z = C·δx + d.

- Solving this in the least squares sense, δx is expressed by the following equation (20):

  δx = (C^T C)^(−1) C^T (z − d)  … (20)
- The control unit 15 can suitably estimate the attitude angle and position of the deviation-generating lidar by performing the following first to fifth steps.
- The control unit 15 calculates δx based on equation (20).
- The control unit 15 then updates the solution x, holding the positions Lx0, Ly0, Lz0 and the posture angles Lα0, Lβ0, Lγ0, by δx, and performs the first step again, repeating until δx becomes sufficiently small.
- Finally, the control unit 15 regards the x obtained in the fourth step as the final solution, and takes the positions Lx, Ly, Lz and the posture angles Lα, Lβ, Lγ indicated by x as the position and posture angle of the deviation-generating lidar.
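- A compact sketch of this iterative estimation (a generic Gauss-Newton loop with the same structure; the residual model, tolerance, and function names are assumptions, not the patent's exact formulation):

```python
import numpy as np

def estimate_lidar_pose(x0, jacobian, expected_meas, z, tol=1e-8, max_iter=50):
    """Newton-method least squares for the 6-vector x = (Lx, Ly, Lz, La, Lb, Lg).
    x0: initial pose from the lidar installation information IL
    jacobian(x): stacked Jacobian C (3n x 6) of the measurement model
    expected_meas(x): stacked expected measurement values d for the pose x
    z: stacked actual measurements (3n,) of the deviation-generating lidar."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        C = jacobian(x)
        d = expected_meas(x)
        # delta_x = (C^T C)^-1 C^T (z - d), as in equation (20)
        delta_x, *_ = np.linalg.lstsq(C, z - d, rcond=None)
        x = x + delta_x
        if np.linalg.norm(delta_x) < tol:     # stop once the update is small
            break
    return x
```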
- FIG. 7 is an example of a flowchart showing the execution procedure of the position/posture angle estimation process based on the detection of the deviation-generating lidar and the first estimation method.
- The control unit 15 repeatedly executes the process of this flowchart.
- First, the control unit 15 performs vehicle position estimation using a Kalman filter with three or more lidars (here, the lidars 2A to 2C) while the vehicle is traveling (step S101).
- Specifically, the control unit 15 calculates the estimated position X^ reflecting the measurement values of each lidar 2 by sequentially executing the Kalman filter update formulas shown in equations (2-1) to (2-3).
- Next, the control unit 15 averages, over the predetermined time up to the current processing reference time, the difference values ΔZ_i between the measurement prediction value Z⁻_i and the measurement value Z_i of each lidar 2 (step S102).
- For example, the control unit 15 stores the past difference values ΔZ_i for the predetermined time in a buffer such as the storage unit 12, and averages each coordinate value Δx_Li, Δy_Li, Δz_Li of the stored difference values.
- Next, the control unit 15 determines whether there is a lidar 2 whose averaged difference value ΔZ_i deviates from 0 (step S103). For example, the control unit 15 determines whether the absolute value of any of the coordinate values Δx_Li, Δy_Li, Δz_Li of the averaged difference value ΔZ_i is equal to or greater than a predetermined value.
- The above-mentioned predetermined value is set larger than the average absolute values of Δx_Li, Δy_Li, Δz_Li obtained from the measurement values of a lidar 2 in which no deviation has occurred, taking into account the length of the predetermined time used for averaging and the like.
- When there is a lidar 2 whose averaged difference value ΔZ_i deviates from 0 (step S103; Yes), the control unit 15 regards the lidar 2 corresponding to that difference value ΔZ_i as the deviation-generating lidar, and executes the own vehicle position estimation process using the lidars 2 other than the deviation-generating lidar (step S104). On the other hand, when there is no lidar 2 whose averaged difference value ΔZ_i deviates from 0 (step S103; No), the control unit 15 determines that there is no deviation-generating lidar and returns the process to step S101.
- After executing the own vehicle position estimation process using the lidars 2 other than the deviation-generating lidar in step S104, the control unit 15 calculates the measurement prediction value Z⁻ from the own vehicle position estimation result and the landmark coordinates of the landmark registered in the map DB 10, and sets it as the expected measurement value [x_b(k), y_b(k), z_b(k)]^T of the deviation-generating lidar (step S105).
- Here, the control unit 15 selects, as the above-described landmark, a landmark that can be measured by the deviation-generating lidar, and acquires a plurality of sets of measurement values and expected measurement values of the deviation-generating lidar for such landmarks.
- The control unit 15 then performs the least squares method using the Newton method on the plurality of sets of expected measurement values and measurement values, and estimates the attitude angle and position of the deviation-generating lidar (step S106).
- The use of the estimated attitude angle and position of the deviation-generating lidar will be described in the section "(3) Application Example".
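- Tying the flowchart steps together (reusing the illustrative helpers sketched above; the orchestration and all names are assumptions):

```python
def deviation_check_cycle(dz_history, pose0, jacobians, expected_meas, z_meas):
    """One pass of FIG. 7: detect a deviation-generating lidar from the
    time-averaged difference values (steps S102-S103), then estimate its
    attitude angle and position by least squares (step S106)."""
    deviated = detect_deviated_lidars(dz_history)
    if not deviated:                  # S103: No -> continue normal estimation
        return None
    # S104-S105 (re-estimating the vehicle position without the deviated lidar
    # and building expected measurement values) are folded into expected_meas.
    return {lid: estimate_lidar_pose(pose0[lid], jacobians[lid],
                                     expected_meas[lid], z_meas[lid])
            for lid in deviated}
```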
- In the second estimation method, the control unit 15 updates the position and posture angle of the deviation-generating lidar based on the difference value between the own vehicle position estimated using the lidars 2 other than the deviation-generating lidar (also referred to as the "reference position") and the own vehicle position estimated using the deviation-generating lidar (also referred to as the "temporary position").
- Here, the state variables of the vehicle position estimated by the Kalman filter are assumed to be the six variables of position x, y, z, yaw angle ψ, roll angle φ, and pitch angle θ.
- In this case, the measurement prediction value Z⁻_i(k) for the landmark at time k of each lidar 2 is expressed by the following equation (21).
- The control unit 15 detects the deviation-generating lidar based on the average of the difference values ΔZ_i(k), as described in [Detection of the deviation-generating lidar].
- Here, it is assumed that the control unit 15 detects the lidar 2C as the deviation-generating lidar.
- In this case, the control unit 15 uses the reference position [x^_r(k), y^_r(k), z^_r(k), ψ^_r(k), φ^_r(k), θ^_r(k)]^T estimated using the lidars 2A and 2B, which are not the deviation-generating lidar.
- The control unit 15 adds the difference value shown in equation (22) to the estimated position and posture angle of the deviation-generating lidar, thereby updating them.
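- A minimal sketch of this feedback-style update (the gain, the stopping tolerance, and the direct mapping of a vehicle-pose difference onto the lidar pose are simplifying assumptions for illustration):

```python
import numpy as np

def update_lidar_pose(lidar_pose, reference_pos, temporary_pos, gain=0.5):
    """Second estimation method: nudge the deviated lidar's estimated pose
    (x, y, z, yaw, roll, pitch) by the difference between the vehicle pose
    estimated without it (reference) and with it (temporary)."""
    diff = np.asarray(reference_pos) - np.asarray(temporary_pos)  # cf. equation (22)
    return np.asarray(lidar_pose) + gain * diff

def pose_converged(reference_pos, temporary_pos, tol=1e-3):
    """Step S207: stop when the reference/temporary difference is close to 0."""
    return np.linalg.norm(np.asarray(reference_pos)
                          - np.asarray(temporary_pos)) < tol
```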
- FIG. 8 is an example of a flowchart showing the execution procedure of the position and posture angle estimation process based on the detection of the deviation-generating lidar and the second estimation method.
- The control unit 15 repeatedly executes the process of this flowchart.
- First, the control unit 15 performs vehicle position estimation using a Kalman filter with three or more lidars (here, the lidars 2A to 2C) while the vehicle is traveling (step S201).
- Next, the control unit 15 averages, over the predetermined time up to the current processing reference time, the difference values ΔZ_i between the measurement prediction value Z⁻_i and the measurement value Z_i of each lidar 2 (step S202).
- When there is a lidar 2 whose averaged difference value ΔZ_i deviates from 0 (step S203; Yes), the control unit 15 regards the lidar 2 corresponding to that difference value as the deviation-generating lidar, and sets the own vehicle position estimated from the measurement values of the lidars 2 other than the deviation-generating lidar as the reference position (step S204). Further, the control unit 15 sets the own vehicle position estimated from the measurement values of the deviation-generating lidar as the temporary position (step S205). Then, the control unit 15 updates the estimated position and posture angle of the deviation-generating lidar based on the difference value between the reference position and the temporary position (step S206).
- When the difference value between the reference position and the temporary position is close to 0 (step S207; Yes), the control unit 15 determines that the estimated position and posture angle of the deviation-generating lidar are sufficiently accurate, and the process of the flowchart ends. On the other hand, when the difference value is not close to 0 (step S207; No), the control unit 15 returns the process to step S204 and recalculates the reference position.
- FIG. 9 is a diagram showing the relationship between the vehicle coordinate system and the lidar coordinate system represented by two-dimensional coordinates.
- The vehicle coordinate system has, with the vehicle center as the origin, a coordinate axis "x_v" along the traveling direction of the vehicle and a coordinate axis "y_v" along the lateral direction of the vehicle.
- The lidar coordinate system has a coordinate axis "x_L" along the front direction of the lidar 2 (see arrow A2) and a coordinate axis "y_L" along the lateral direction of the lidar 2.
- A measurement point [x_v(k), y_v(k)]^T at time "k" viewed from the vehicle coordinate system is converted into the coordinates [x_L(k), y_L(k)]^T of the lidar coordinate system by the following equation (23), using the rotation matrix "Cψ".
- Conversely, the transformation from the lidar coordinate system to the vehicle coordinate system can be performed using the inverse (that is, the transpose) of the rotation matrix. Therefore, a measurement point [x_L(k), y_L(k)]^T obtained at time k in the lidar coordinate system can be converted into the coordinates [x_v(k), y_v(k)]^T of the vehicle coordinate system by the following equation (24).
- FIG. 10 is a diagram illustrating the relationship between the vehicle coordinate system and the lidar coordinate system represented by three-dimensional coordinates.
- In FIG. 10, the coordinate axis perpendicular to the coordinate axes x_v and y_v is "z_v", and the coordinate axis perpendicular to the coordinate axes x_L and y_L is "z_L".
- The roll angle of the lidar 2 with respect to the vehicle coordinate system is "Lα", the pitch angle is "Lβ", the yaw angle is "Lγ", and the positions of the lidar 2 on the coordinate axes x_v, y_v, and z_v are "Lx", "Ly", and "Lz", respectively.
- A measurement point [x_v(k), y_v(k), z_v(k)]^T is converted into the coordinates [x_L(k), y_L(k), z_L(k)]^T of the lidar coordinate system by the following equation (25), using the direction cosine matrix "C" composed of the rotation matrices "Cα", "Cβ", and "Cγ" corresponding to roll, pitch, and yaw.
- Conversely, the transformation from the lidar coordinate system to the vehicle coordinate system can be performed using the inverse (that is, the transpose) of the direction cosine matrix. Therefore, a measurement point [x_L(k), y_L(k), z_L(k)]^T acquired at time k in the lidar coordinate system can be converted into the coordinates [x_v(k), y_v(k), z_v(k)]^T of the vehicle coordinate system.
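- As a sketch of these conversions (a standard roll-pitch-yaw composition; the rotation order shown is one common convention and an assumption here, since equation (25) itself is not reproduced):

```python
import numpy as np

def direction_cosine_matrix(roll, pitch, yaw):
    """Direction cosine matrix C composed of the rotation matrices for roll
    (about x), pitch (about y), and yaw (about z): maps vehicle-frame vectors
    into the lidar frame."""
    ca, sa = np.cos(roll), np.sin(roll)
    cb, sb = np.cos(pitch), np.sin(pitch)
    cg, sg = np.cos(yaw), np.sin(yaw)
    Cx = np.array([[1, 0, 0], [0, ca, sa], [0, -sa, ca]])
    Cy = np.array([[cb, 0, -sb], [0, 1, 0], [sb, 0, cb]])
    Cz = np.array([[cg, sg, 0], [-sg, cg, 0], [0, 0, 1]])
    return Cx @ Cy @ Cz

def vehicle_to_lidar(p_v, lidar_pos, roll, pitch, yaw):
    """Equation (25)-style conversion of a vehicle-frame point to the lidar frame."""
    C = direction_cosine_matrix(roll, pitch, yaw)
    return C @ (np.asarray(p_v) - np.asarray(lidar_pos))

def lidar_to_vehicle(p_l, lidar_pos, roll, pitch, yaw):
    """Inverse conversion, using the transpose of the direction cosine matrix."""
    C = direction_cosine_matrix(roll, pitch, yaw)
    return C.T @ np.asarray(p_l) + np.asarray(lidar_pos)
```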
- For example, the control unit 15 calculates the amount of change of the estimated posture angle and position of the deviation-generating lidar with respect to its posture angle and position before the deviation, recorded in the lidar installation information IL, and corrects each measurement value of the point cloud data output by the deviation-generating lidar based on that amount of change.
- In this case, the control unit 15 may store a map or the like indicating the correction amount of the measurement value for each amount of change, and correct the measurement values by referring to it; alternatively, the measurement values may be corrected using a predetermined ratio of the amount of change as the correction amount.
- Further, when detecting a deviation in position or posture angle whose amount of change is equal to or greater than a predetermined threshold, the control unit 15 may stop using the deviation-generating lidar and output a predetermined warning via the information output unit 16.
- Further, the control unit 15 may use the calculated roll angle Lα, pitch angle Lβ, yaw angle Lγ, x-direction position Lx, y-direction position Ly, and z-direction position Lz to convert each measurement value of the point cloud data output by the lidar 2 from the lidar coordinate system to the vehicle coordinate system, and execute vehicle position estimation or automatic driving control based on the converted data. In this way, the measurement values of the lidar 2 can be appropriately converted into the vehicle coordinate system based on its attitude angle and position after the deviation occurred.
- When the in-vehicle device 1 includes an adjustment mechanism, such as an actuator, for correcting the posture angle and position of each lidar 2, the posture angle and position of the lidar 2 may be corrected based on the estimation result.
- In this case, the in-vehicle device 1 calculates the estimated amount of change in posture angle and position relative to those recorded in the lidar installation information IL, and performs control to drive the adjustment mechanism so as to correct the posture angle and position of the lidar 2 by that amount.
- As described above, the in-vehicle device 1 acquires the measurement value Z_i measured by each of at least three lidars 2 provided on the vehicle to a landmark included in that lidar's measurement range. The in-vehicle device 1 then acquires the measurement prediction value Z⁻_i predicted based on the landmark coordinates indicated by the position information of the landmark registered in the map DB 10. Based on the plurality of difference values ΔZ_i between the measurement value Z_i and the measurement prediction value Z⁻_i acquired for each lidar 2, the in-vehicle device 1 detects at least one lidar 2 in which a displacement of the mounting position with respect to the vehicle has occurred. As a result, the in-vehicle device 1 can accurately detect the lidar 2 in which the positional deviation has occurred.
- As a modification, the control unit 15 may individually execute the Kalman filter update formula shown in equation (1) based on the measurement value of each lidar 2. That is, in this case, the control unit 15 performs, according to equation (1), vehicle position estimation based on the measurement value of the lidar 2A, vehicle position estimation based on the measurement value of the lidar 2B, and vehicle position estimation based on the measurement value of the lidar 2C, and compares the difference values ΔZ between the measurement value Z and the measurement prediction value Z⁻ calculated when executing each instance of equation (1). When there is a difference value ΔZ equal to or greater than a predetermined value, the control unit 15 detects the lidar 2 corresponding to that difference value ΔZ as the deviation-generating lidar. In this case, as in the embodiment, the landmarks measured by the lidars 2 do not have to be the same.
- The configuration of the driving support system shown in FIG. 1 is an example, and the configuration of driving support systems to which the present invention is applicable is not limited to it.
- For example, instead of the system having the in-vehicle device 1, an electronic control device of the vehicle may execute the processing of the flowcharts described above.
- In this case, the lidar installation information IL is stored in, for example, a storage unit in the vehicle, and the electronic control device of the vehicle is configured to be able to receive the output data of the various sensors such as the lidars 2.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- Automation & Control Theory (AREA)
- Traffic Control Systems (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
An in-vehicle device (1) executes the steps of: acquiring, for each of at least three lidars (2) provided on a vehicle, a measurement value Z_i of a landmark included in the respective measurement ranges of the lidars (2); acquiring a predicted measurement value Z⁻_i predicted on the basis of the landmark coordinates indicated by position information of the landmark registered in a map database (10); and detecting at least one lidar (2) in which a misalignment of the mounting position on the vehicle has occurred, on the basis of a plurality of difference values ΔZ_i between the measurement value Z_i acquired for each lidar (2) and the predicted measurement value Z⁻_i.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018063186 | 2018-03-28 | ||
| JP2018-063186 | 2018-03-28 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019188745A1 true WO2019188745A1 (fr) | 2019-10-03 |
Family
ID=68061748
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/011974 Ceased WO2019188745A1 (fr) | 2018-03-28 | 2019-03-22 | Information processing device, control method, program and storage medium |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2019188745A1 (fr) |
- 2019-03-22: WO PCT/JP2019/011974 patent/WO2019188745A1 not_active Ceased
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2004037239A (ja) * | 2002-07-03 | 2004-02-05 | Fuji Heavy Ind Ltd | Same-object determination method and device, and positional deviation correction method and device |
| JP2014238297A (ja) * | 2013-06-06 | 2014-12-18 | Toyota Motor Corporation | Vehicle position recognition device |
| JP2016183911A (ja) * | 2015-03-26 | 2016-10-20 | Mitsubishi Electric Corporation | Radar axis deviation determination device, radar device, and radar axis deviation determination method |
| JP2016197081A (ja) * | 2015-04-06 | 2016-11-24 | Hitachi Construction Machinery Co., Ltd. | Transport vehicle |
| WO2017060947A1 (fr) * | 2015-10-05 | 2017-04-13 | Pioneer Corporation | Estimation apparatus, control method, program, and storage medium |
| JP2017227580A (ja) * | 2016-06-24 | 2017-12-28 | Mitsubishi Electric Corporation | Object recognition device, object recognition method, and automatic driving system |
Cited By (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021112074A1 (fr) | 2019-12-02 | 2021-06-10 | Pioneer Corporation | Information processing device, control method, program, and storage medium |
| WO2021112177A1 (fr) | 2019-12-04 | 2021-06-10 | Pioneer Corporation | Information processing apparatus, control method, program, and recording medium |
| CN111879287A (zh) * | 2020-07-08 | 2020-11-03 | Henan University of Science and Technology | Multi-sensor-based forward terrain three-dimensional construction method for low-speed vehicles |
| WO2022102577A1 (fr) | 2020-11-13 | 2022-05-19 | Pioneer Corporation | Information processing apparatus, control method, program, and storage medium |
| WO2022124115A1 (fr) | 2020-12-07 | 2022-06-16 | Pioneer Corporation | Information processing device, control method, program, and recording medium |
| WO2022208617A1 (fr) | 2021-03-29 | 2022-10-06 | Pioneer Corporation | Map data structure, storage device, information processing device, control method, program, and storage medium |
| CN113790720B (zh) * | 2021-08-16 | 2023-08-15 | Beijing Institute of Automation Control Equipment | Anti-disturbance coarse alignment method based on recursive least squares |
| CN113790720A (zh) * | 2021-08-16 | 2021-12-14 | Beijing Institute of Automation Control Equipment | Anti-disturbance coarse alignment method based on recursive least squares |
| WO2023037500A1 (fr) | 2021-09-10 | 2023-03-16 | Pioneer Corporation | Information processing device, determination method, program, and storage medium |
| WO2023037502A1 (fr) | 2021-09-10 | 2023-03-16 | Pioneer Corporation | Server device, control method, program, and storage medium |
| WO2023062782A1 (fr) | 2021-10-14 | 2023-04-20 | Pioneer Corporation | Information processing apparatus, control method, program, and recording medium |
| CN114167381A (zh) * | 2021-11-30 | 2022-03-11 | Beijing Jingwei Hirain Technologies Co., Ltd. | Target detection method and system |
| WO2023175714A1 (fr) | 2022-03-15 | 2023-09-21 | Pioneer Corporation | Information processing device, control method, program, and recording medium |
| WO2023176639A1 (fr) | 2022-03-15 | 2023-09-21 | Pioneer Corporation | Information processing apparatus, control method, program, and recording medium |
| WO2023175715A1 (fr) | 2022-03-15 | 2023-09-21 | Pioneer Corporation | Information processing device, control method, program, and storage medium |
| WO2023176653A1 (fr) | 2022-03-15 | 2023-09-21 | Pioneer Corporation | Information processing device, control method, program, and storage medium |
| WO2023176640A1 (fr) | 2022-03-15 | 2023-09-21 | Pioneer Corporation | Information processing device, control method, program, and storage medium |
| WO2023175712A1 (fr) | 2022-03-15 | 2023-09-21 | Pioneer Corporation | Information processing apparatus, control method, program, and recording medium |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2019188745A1 (fr) | Information processing device, control method, program and storage medium | |
| JP2022113746A (ja) | Determination device | |
| EP4170282A1 (fr) | Method for calibrating the mounting deviation angle between sensors, combined positioning system, and vehicle | |
| US12174021B2 (en) | Measurement accuracy calculation device, self-position estimation device, control method, program and storage medium | |
| EP3954968A1 (fr) | Position estimation device, estimation device, control method, program, and storage medium | |
| US12146746B2 (en) | Information processing device, control method, program and storage medium | |
| KR102086270B1 (ko) | Control method of travel control device, and travel control device | |
| JP6806891B2 (ja) | Information processing device, control method, program, and storage medium | |
| JP6980010B2 (ja) | Self-position estimation device, control method, program, and storage medium | |
| JP7418196B2 (ja) | Travel trajectory estimation method and travel trajectory estimation device | |
| JP2024161585A (ja) | Self-position estimation device | |
| WO2017199333A1 (fr) | Information output device, terminal device, control method, program, and storage medium | |
| JP2023075184A (ja) | Output device, control method, program, and storage medium | |
| JP2024105508A (ja) | Output device, control method, program, and storage medium | |
| JP2024161130A (ja) | Self-position estimation device, control method, program, and storage medium | |
| WO2019188886A1 (fr) | Terminal device, information processing method, and storage medium | |
| CN113795726B (zh) | Self-position correction method and self-position correction device | |
| JP6707627B2 (ja) | Measurement device, measurement method, and program | |
| TWI394944B (zh) | Vehicle attitude estimation system and method | |
| WO2019188802A1 (fr) | Information processing device, control method, program, and storage medium | |
| WO2017168588A1 (fr) | Measurement device, measurement method, and program | |
| JP2021044008A (ja) | Information output device, terminal device, control method, program, and storage medium | |
| JP2017181195A (ja) | Measurement device, measurement method, and program | |
| WO2019188874A1 (fr) | Data structure, information processing device, and map data generation device | |
| WO2018212290A1 (fr) | Information processing device, control method, program and storage medium | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19776476; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19776476; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: JP |