
WO2022123786A1 - Reliability estimation device, position estimation device, and reliability estimation method - Google Patents

Reliability estimation device, position estimation device, and reliability estimation method

Info

Publication number
WO2022123786A1
WO2022123786A1 (PCT application PCT/JP2020/046360)
Authority
WO
WIPO (PCT)
Prior art keywords
reliability
feature
calculation unit
feature amount
camera image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2020/046360
Other languages
French (fr)
Japanese (ja)
Inventor
昇之 芳川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Priority to CN202080107744.7A priority Critical patent/CN116547714B/en
Priority to JP2022568023A priority patent/JP7221469B2/en
Priority to DE112020007687.3T priority patent/DE112020007687T5/en
Priority to PCT/JP2020/046360 priority patent/WO2022123786A1/en
Priority to TW110114658A priority patent/TW202232433A/en
Publication of WO2022123786A1 publication Critical patent/WO2022123786A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/77Determining position or orientation of objects or cameras using statistical methods
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20076Probabilistic image processing
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]

Definitions

  • This disclosure relates to a reliability estimation device.
  • A three-dimensional position estimation algorithm that estimates the three-dimensional position of an object from an image has been developed (in this specification, the three-dimensional position of an object is synonymous with its three-dimensional pose).
  • For example, Non-Patent Document 1 describes a three-dimensional position estimation method in which a neural network model is applied to a camera image obtained by photographing an object with a monocular camera, the object in the camera image is recognized, the positions of its feature points are estimated, and the three-dimensional position of the object is estimated from the estimated feature point positions.
  • In the following, the position of an object estimated by such a three-dimensional position estimation method (for example, the position of a feature point of the object or the three-dimensional position of the object) is referred to as an estimated position.
  • In such a conventional method, the reliability of the estimated position may be calculated using the same neural network model as the machine learning model used to obtain the estimated position, and the calculated reliability may be used to evaluate the estimated position.
  • The present disclosure has been made to solve the above-mentioned problem, and an object of the present disclosure is to provide a technique for improving the trustworthiness of the reliability of the estimated position itself.
  • The reliability estimation device according to the present disclosure estimates the reliability of an estimated position of an object estimated based on a camera image, and includes a feature amount calculation unit that calculates a feature amount of the object based on feature point information regarding a plurality of feature points of the object in the camera image, and a reliability calculation unit that calculates the reliability based on the feature amount calculated by the feature amount calculation unit.
  • According to the present disclosure, the trustworthiness of the reliability of the estimated position itself can be improved.
  • FIG. 4A is a block diagram showing a hardware configuration that realizes the function of the position estimation device according to the first embodiment.
  • FIG. 4B is a block diagram showing a hardware configuration for executing software that realizes the function of the position estimation device according to the first embodiment.
  • FIG. 1 is a block diagram showing a configuration of a position estimation system 100 according to a first embodiment.
  • As shown in FIG. 1, the position estimation system 100 includes a position estimation device 1, N cameras 2 (N is a positive integer), and a storage device 3.
  • The position estimation device 1 includes a camera selection unit 40, a feature point estimation unit 10, a reliability estimation device 20, and a position calculation unit 30.
  • The reliability estimation device 20 includes a feature amount calculation unit 21 and a reliability calculation unit 22.
  • The N cameras 2 consist of the first camera 2 through the Nth camera 2. Each of the N cameras 2 acquires a camera image (camera images D11 to D1N) by photographing an object and outputs the acquired camera image to the camera selection unit 40 of the position estimation device 1.
  • The camera selection unit 40 selects at least one camera image D2 from a plurality of camera images. More specifically, in the first embodiment, the camera selection unit 40 selects at least one camera image D2 from the plurality of camera images D11 to D1N acquired by the N cameras 2. The detailed configuration of the camera selection unit 40 will be described later.
  • The camera selection unit 40 outputs the selected camera image D2 to the feature point estimation unit 10 and to the feature amount calculation unit 21.
  • The feature point estimation unit 10 estimates feature point information regarding a plurality of feature points of an object based on a camera image. More specifically, in the first embodiment, the feature point estimation unit 10 estimates the feature point information based on the camera image D2 selected by the camera selection unit 40.
  • In the first embodiment, the storage device 3 stores PVNet, a convolutional neural network model for object position detection.
  • By applying the PVNet stored in the storage device 3 to the camera image D2 selected by the camera selection unit 40, the feature point estimation unit 10 estimates, as feature point information, the estimated position D3 of each of a plurality of feature points, the variance D4 of the candidate point group for each feature point, and the object region D5, which is the region occupied by the object in the camera image.
  • The feature point estimation unit 10 outputs the estimated positions D3 of the plurality of feature points to the position calculation unit 30. The feature point estimation unit 10 also outputs the estimated positions D3, the variance D4 of the candidate point group of the feature points, and the object region D5 to the feature amount calculation unit 21.
  • The feature amount calculation unit 21 calculates the feature amount of the object based on feature point information regarding a plurality of feature points of the object in the camera image. More specifically, in the first embodiment, the feature amount calculation unit 21 calculates the feature amount of the object based on the feature point information estimated by the feature point estimation unit 10.
  • More specifically, the feature amount calculation unit 21 includes m feature amount calculation units, from the first feature amount calculation unit 201 to the mth feature amount calculation unit 20m.
  • The first feature amount calculation unit 201 calculates the feature amount D61 of the object based on the variance D4 of the candidate point group of the feature points estimated by the feature point estimation unit 10.
  • In the first embodiment, the feature amount D61 calculated by the first feature amount calculation unit 201 is proportional to the reciprocal of the variance D4 of the candidate point group of the feature points.
  • In other words, the feature amount D61 is inversely proportional to the variance D4 of the candidate point group of the feature points.
  • The first feature amount calculation unit 201 outputs the calculated feature amount D61 to the reliability calculation unit 22.
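As a concrete illustration, the inverse-variance feature can be sketched as follows (a minimal NumPy sketch; the function name, the ε guard, and the use of the total variance of the candidate cloud are our own assumptions, not specified in the patent):

```python
import numpy as np

def inverse_variance_feature(candidate_points, eps=1e-6):
    """Sketch of feature amount D61: a value proportional to the reciprocal
    of the variance D4 of a feature point's candidate (voted) point cloud.
    A tight vote cluster means the network's hypotheses agree, giving a
    small variance and therefore a large feature value."""
    pts = np.asarray(candidate_points, dtype=float)
    # total variance: mean squared distance of the candidates from their centroid
    variance = np.mean(np.sum((pts - pts.mean(axis=0)) ** 2, axis=1))
    return 1.0 / (variance + eps)  # eps guards against a degenerate zero variance
```

A tightly clustered candidate set thus yields a much larger D61 than a scattered one.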
  • The second feature amount calculation unit 202 calculates, based on the estimated positions D3 of the plurality of feature points estimated by the feature point estimation unit 10 and the object region D5, a feature amount D62 that is a function of the region formed by the plurality of feature points and the object region.
  • More specifically, in the first embodiment, the second feature amount calculation unit 202 calculates the area ratio between a convex polygon formed by the plurality of feature points and the object region, and calculates a feature amount proportional to the reciprocal of the calculated area ratio, or a feature amount D62 proportional to the reciprocal of the difference between the calculated area ratio and 1.
  • Here, the "convex polygon formed by a plurality of feature points" is, for example, the largest convex polygon spanned by the plurality of feature points, or the smallest convex polygon containing all of the plurality of feature points.
  • The second feature amount calculation unit 202 outputs the calculated feature amount D62 to the reliability calculation unit 22.
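The area-ratio feature can be illustrated with a short sketch (our own minimal implementation, assuming 2D pixel coordinates; the hull construction and the ε guard are assumptions not specified in the patent):

```python
import numpy as np

def convex_hull(points):
    """Andrew's monotone chain: convex hull of 2D points, in CCW order."""
    pts = sorted(map(tuple, points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def polygon_area(poly):
    """Shoelace formula for the area of a simple polygon."""
    x = np.array([p[0] for p in poly], dtype=float)
    y = np.array([p[1] for p in poly], dtype=float)
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def hull_area_ratio_feature(keypoints, object_area, eps=1e-6):
    """Sketch of feature amount D62: the area ratio between the convex hull
    of the estimated feature points and the object region D5, mapped so that
    a ratio near 1 (the hull covers the object well) gives a large value."""
    ratio = polygon_area(convex_hull(keypoints)) / object_area
    return 1.0 / (abs(ratio - 1.0) + eps)
```

When the feature points spread over the whole object region the ratio approaches 1 and D62 becomes large; when they collapse onto a fraction of the object the ratio drops and D62 shrinks.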
  • Each of the feature amount calculation units from the third feature amount calculation unit to the mth feature amount calculation unit 20m calculates a feature amount (feature amounts D63 to D6m) based on at least one of the camera image D2 selected by the camera selection unit 40, the estimated positions D3 of the plurality of feature points estimated by the feature point estimation unit 10, the variance D4 of the candidate point group of the feature points, and the object region D5.
  • Each of the feature amount calculation units from the third feature amount calculation unit (not shown) to the mth feature amount calculation unit 20m outputs the calculated feature amount to the reliability calculation unit 22.
  • The reliability calculation unit 22 calculates the reliability of the estimated position of the object based on the feature amounts calculated by the feature amount calculation unit 21. More specifically, in the first embodiment, the reliability calculation unit 22 calculates, based on the feature amounts D61 to D6m calculated by the feature amount calculation unit 21, a reliability D7 that is correlated with the estimation error of the estimated position of the object (for example, a reliability D7 that decreases as the estimation error increases).
  • More specifically, in the first embodiment, the storage device 3 stores a neural network model that calculates the reliability from the feature amounts.
  • The reliability calculation unit 22 applies the neural network model stored in the storage device 3 to the feature amounts D61 to D6m calculated by the feature amount calculation unit 21, thereby calculating a reliability D7 that is inversely proportional to the estimation error of the estimated position of the object.
  • The reliability calculation unit 22 outputs the calculated reliability D7 to the camera selection unit 40 and to the outside of the position estimation device 1.
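The mapping from feature amounts to a reliability can be pictured with a tiny feed-forward network (a hedged sketch: the architecture, layer sizes, and placeholder weights below are our assumptions; the patent only states that a stored neural network model maps the feature amounts to the reliability):

```python
import numpy as np

def reliability_from_features(features, w1, b1, w2, b2):
    """Sketch of the reliability calculation unit 22: a small fully connected
    network maps the feature vector (D61, ..., D6m) to a scalar reliability
    D7 in (0, 1). In practice the weights would be trained so that D7
    decreases as the observed estimation error increases."""
    h = np.maximum(0.0, np.asarray(features, dtype=float) @ w1 + b1)  # ReLU hidden layer
    logit = float(h @ w2 + b2)
    return 1.0 / (1.0 + np.exp(-logit))  # sigmoid keeps D7 in (0, 1)
```

For example, with hypothetical all-ones weights, a larger feature vector (tight vote clusters, good hull coverage) produces a reliability closer to 1 than a small one.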
  • The camera selection unit 40 described above selects at least one camera image D2 from the N camera images D11 to D1N based on the reliability D7 calculated by the reliability calculation unit 22.
  • The camera selection unit 40 outputs the camera image D2 selected based on the reliability D7 to the feature point estimation unit 10 and to the feature amount calculation unit 21.
  • The position calculation unit 30 calculates the estimated position D8 of the object based on the feature point information estimated by the feature point estimation unit 10. More specifically, in the first embodiment, the storage device 3 stores the positions of the feature points of the object, and the position calculation unit 30 calculates the estimated position D8 of the object by solving the PnP (Perspective-n-Point) problem based on the estimated positions D3 of the plurality of feature points estimated by the feature point estimation unit 10 and the feature point positions of the object stored in the storage device 3.
  • An example of the estimated position D8 calculated by the position calculation unit 30 is the three-dimensional position (three-dimensional pose) of the object.
  • The position calculation unit 30 outputs the calculated estimated position D8 to the outside of the position estimation device 1.
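To make the PnP step concrete, here is a minimal Direct Linear Transform (DLT) sketch that recovers the 3×4 projection matrix from 2D–3D correspondences (our own illustration; the patent does not specify the solver, and a production system would typically use a calibrated PnP routine such as OpenCV's solvePnP):

```python
import numpy as np

def dlt_projection_matrix(points_3d, points_2d):
    """Recover the 3x4 projection matrix P from n >= 6 corresponding 3D
    model points (the stored feature point positions) and 2D estimated
    positions D3, by solving A vec(P) = 0 with an SVD."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 4)  # right-singular vector of the smallest singular value

def project(P, points_3d):
    """Project 3D points with P and dehomogenize to pixel coordinates."""
    pts_h = np.hstack([np.asarray(points_3d, float), np.ones((len(points_3d), 1))])
    proj = pts_h @ P.T
    return proj[:, :2] / proj[:, 2:3]
```

The rotation and translation (the three-dimensional pose) can then be factored out of the recovered matrix given the camera intrinsics.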
  • The reliability D7 that the reliability calculation unit 22 outputs to the outside of the position estimation device 1 is used, for example, to evaluate the estimated position D8 that the position calculation unit 30 outputs to the outside of the position estimation device 1.
  • FIG. 2 is a flowchart showing the reliability estimation method performed by the position estimation device 1 according to the first embodiment. Before the steps described below, it is assumed that each of the N cameras 2 has acquired a camera image by photographing an object and has output the acquired camera image to the camera selection unit 40.
  • First, the camera selection unit 40 selects at least one camera image D2 from the plurality of camera images (step ST1).
  • The camera selection unit 40 outputs the selected camera image D2 to the feature point estimation unit 10 and to the feature amount calculation unit 21.
  • Next, the feature point estimation unit 10 estimates, from the camera image D2 selected by the camera selection unit 40, feature point information such as the estimated position D3 of each of a plurality of feature points, the variance D4 of the candidate point group of the feature points, and the object region D5 (step ST2).
  • The feature point estimation unit 10 outputs the estimated positions D3 of the plurality of feature points, the variance D4 of the candidate point group of the feature points, and the object region D5 to the feature amount calculation unit 21.
  • Next, the feature amount calculation unit 21 calculates the feature amounts D61 to D6m of the object based on at least one of the camera image D2 selected by the camera selection unit 40, the estimated positions D3 of the plurality of feature points estimated by the feature point estimation unit 10, the variance D4 of the candidate point group of the feature points, and the object region D5 (step ST3).
  • The feature amount calculation unit 21 outputs the calculated feature amounts D61 to D6m to the reliability calculation unit 22.
  • FIG. 3 is a flowchart showing details of the feature amount calculation method in step ST3.
  • First, the first feature amount calculation unit 201 calculates, based on the variance D4 of the candidate point group of the feature points estimated by the feature point estimation unit 10, the feature amount D61 of the object, which is proportional to the reciprocal of the variance D4 (step ST31).
  • The first feature amount calculation unit 201 outputs the calculated feature amount D61 to the reliability calculation unit 22.
  • Similarly, the second feature amount calculation unit 202 calculates the feature amount D62 based on the estimated positions D3 of the plurality of feature points and the object region D5, and outputs it to the reliability calculation unit 22 (step ST32).
  • Each of the feature amount calculation units from the third feature amount calculation unit to the mth feature amount calculation unit 20m calculates the remaining feature amounts D63 to D6m based on at least one of the camera image D2 selected by the camera selection unit 40, the estimated positions D3 of the plurality of feature points estimated by the feature point estimation unit 10, the variance D4 of the candidate point group of the feature points, and the object region D5 (step ST33).
  • Each of these feature amount calculation units outputs the calculated feature amount to the reliability calculation unit 22.
  • Next, a specific example of the camera selection method (step ST1) in the reliability estimation method performed by the position estimation device 1 according to the first embodiment will be described in detail. It is assumed here that step ST1 is performed again after the series of steps from step ST1 to step ST4 has been performed at least once.
  • When the position estimation device 1 returns to step ST1 and the reliability calculation unit 22 has calculated a reliability for each camera image, the camera selection unit 40 selects at least one camera image D2 from the N camera images D11 to D1N based on the maximum of the reliabilities calculated for the respective camera images. More specifically, for example, the camera selection unit 40 selects the camera image D2 taken by the camera 2 corresponding to the maximum of the reliabilities that the reliability calculation unit 22 calculated for the respective camera images.
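This selection rule is simple to express (a minimal sketch; the function and variable names are our own assumptions):

```python
def select_camera_image(camera_images, reliabilities):
    """Sketch of the camera selection unit 40 on later passes of step ST1:
    given the reliability D7 computed for each camera's image on the
    previous iteration, return the image whose reliability was highest."""
    best = max(range(len(camera_images)), key=lambda i: reliabilities[i])
    return camera_images[best]
```

Ties resolve to the lowest camera index here; the patent does not specify a tie-breaking rule.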
  • The functions of the camera selection unit 40, the feature point estimation unit 10, the feature amount calculation unit 21 and the reliability calculation unit 22 of the reliability estimation device 20, and the position calculation unit 30 in the position estimation device 1 may be realized by separate processing circuits, or these functions may be integrated into a single processing circuit.
  • As described above, the reliability estimation device 20 according to the first embodiment is a reliability estimation device 20 that estimates the reliability of the estimated position of an object estimated based on a camera image, and includes the feature amount calculation unit 21, which calculates the feature amount of the object based on feature point information regarding a plurality of feature points of the object in the camera image, and the reliability calculation unit 22, which calculates the reliability of the estimated position of the object based on the feature amount calculated by the feature amount calculation unit 21.
  • For example, when gripping an object with a robot arm, the position of the object must be detected. When an image taken by a monocular camera or the like attached to the arm is supplied as the above-mentioned camera image to the reliability estimation device 20 according to the first embodiment, a trustworthy reliability can still be calculated even if an image in which position estimation is difficult due to poor shooting conditions is input, and the estimated position can be evaluated based on that reliability. Stable robot control can therefore be realized.
  • Because the trustworthiness of the reliability of the estimated position itself can be improved in this way, the reliability estimation device can be used in a position estimation device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A reliability estimation device (20) estimates the reliability of an estimated position of an object estimated on the basis of a camera image, and is provided with: a feature quantity calculation unit (21) that calculates a feature quantity of the object on the basis of feature point information pertaining to a plurality of feature points of the object in the camera image; and a reliability calculation unit (22) that calculates a reliability on the basis of the feature quantity calculated by the feature quantity calculation unit (21).

Description

Reliability estimation device, position estimation device, and reliability estimation method

This disclosure relates to a reliability estimation device.

A three-dimensional position estimation algorithm that estimates the three-dimensional position of an object from an image has been developed (in this specification, the three-dimensional position of an object is synonymous with its three-dimensional pose). For example, Non-Patent Document 1 describes a three-dimensional position estimation method in which a neural network model is applied to a camera image obtained by photographing an object with a monocular camera, the object in the camera image is recognized, the positions of its feature points are estimated, and the three-dimensional position of the object is estimated from the estimated feature point positions. In the following, the position of an object estimated by such a three-dimensional position estimation method (for example, the position of a feature point of the object or the three-dimensional position of the object) is referred to as an estimated position.

S. Peng, Y. Liu, Q. Huang, H. Bao, X. Zhou, "PVNet: Pixel-wise Voting Network for 6DoF Pose Estimation," In Computer Vision and Pattern Recognition (CVPR), 2019.

In a conventional three-dimensional position estimation method of this kind, which applies a neural network model to an image, the reliability of the estimated position may be calculated using the same neural network model as the machine learning model used to obtain the estimated position, and the calculated reliability may be used to evaluate the estimated position.

In that case, the reliability of the estimated position and the trustworthiness of the reliability itself are considered to be correlated. Therefore, when the estimated position is unreliable, the reliability itself is also untrustworthy, and the validity of evaluating the estimated position with that reliability becomes uncertain.
The present disclosure has been made to solve this problem, and an object of the present disclosure is to provide a technique for improving the trustworthiness of the reliability of the estimated position itself.

The reliability estimation device according to the present disclosure is a reliability estimation device that estimates the reliability of an estimated position of an object estimated based on a camera image, and includes a feature amount calculation unit that calculates a feature amount of the object based on feature point information regarding a plurality of feature points of the object in the camera image, and a reliability calculation unit that calculates the reliability based on the feature amount calculated by the feature amount calculation unit.

According to the present disclosure, the trustworthiness of the reliability of the estimated position itself can be improved.

FIG. 1 is a block diagram showing the configuration of a position estimation system according to the first embodiment. FIG. 2 is a flowchart showing a reliability estimation method performed by the position estimation device according to the first embodiment. FIG. 3 is a flowchart showing details of the feature amount calculation method in the reliability estimation method performed by the position estimation device according to the first embodiment. FIG. 4A is a block diagram showing a hardware configuration that realizes the functions of the position estimation device according to the first embodiment, and FIG. 4B is a block diagram showing a hardware configuration that executes software realizing those functions.

Hereinafter, in order to explain the present disclosure in more detail, modes for carrying out the present disclosure will be described with reference to the accompanying drawings.
Embodiment 1.
FIG. 1 is a block diagram showing the configuration of a position estimation system 100 according to the first embodiment. As shown in FIG. 1, the position estimation system 100 includes a position estimation device 1, N cameras 2 (N is a positive integer), and a storage device 3. The position estimation device 1 includes a camera selection unit 40, a feature point estimation unit 10, a reliability estimation device 20, and a position calculation unit 30. The reliability estimation device 20 includes a feature amount calculation unit 21 and a reliability calculation unit 22.

The N cameras 2 consist of the first camera 2 through the Nth camera 2. Each of the N cameras 2 acquires a camera image (camera images D11 to D1N) by photographing an object and outputs the acquired camera image to the camera selection unit 40 of the position estimation device 1.

The camera selection unit 40 selects at least one camera image D2 from a plurality of camera images. More specifically, in the first embodiment, the camera selection unit 40 selects at least one camera image D2 from the plurality of camera images D11 to D1N acquired by the N cameras 2. The detailed configuration of the camera selection unit 40 will be described later. The camera selection unit 40 outputs the selected camera image D2 to the feature point estimation unit 10 and to the feature amount calculation unit 21.

The feature point estimation unit 10 estimates feature point information regarding a plurality of feature points of an object based on a camera image. More specifically, in the first embodiment, the feature point estimation unit 10 estimates the feature point information based on the camera image D2 selected by the camera selection unit 40.

More specifically, in the first embodiment, the storage device 3 stores PVNet, a convolutional neural network model for object position detection. By applying the PVNet stored in the storage device 3 to the camera image D2 selected by the camera selection unit 40, the feature point estimation unit 10 estimates, as feature point information, the estimated position D3 of each of a plurality of feature points, the variance D4 of the candidate point group for each feature point, and the object region D5, which is the region occupied by the object in the camera image.

The feature point estimation unit 10 outputs the estimated positions D3 of the plurality of feature points to the position calculation unit 30. The feature point estimation unit 10 also outputs the estimated positions D3, the variance D4 of the candidate point group of the feature points, and the object region D5 to the feature amount calculation unit 21.

The feature amount calculation unit 21 calculates the feature amount of the object based on feature point information regarding a plurality of feature points of the object in the camera image. More specifically, in the first embodiment, the feature amount calculation unit 21 calculates the feature amount of the object based on the feature point information estimated by the feature point estimation unit 10.

In more detail, in the first embodiment, the feature amount calculation unit 21 includes m feature amount calculation units, from a first feature amount calculation unit 201 to an m-th feature amount calculation unit 20m.
The first feature amount calculation unit 201 calculates a feature amount D61 of the object based on the variance D4 of the candidate point cloud of the feature points estimated by the feature point estimation unit 10. In the first embodiment, the feature amount D61 calculated by the first feature amount calculation unit 201 is a feature amount proportional to the reciprocal of the variance D4 of the candidate point cloud; in other words, it is inversely proportional to the variance D4. The first feature amount calculation unit 201 outputs the calculated feature amount D61 to the reliability calculation unit 22.

The second feature amount calculation unit 202 calculates, based on the estimated positions D3 of the plurality of feature points and the object region D5 estimated by the feature point estimation unit 10, a feature amount D62 given by a function that takes as inputs the region formed by the plurality of feature points and the object region. More specifically, in the first embodiment, the second feature amount calculation unit 202 calculates the area ratio between a convex plane formed by the plurality of feature points and the object region, and then calculates a feature amount D62 proportional to the reciprocal of the calculated area ratio, or proportional to the reciprocal of the difference between the calculated area ratio and 1. The "convex plane formed by the plurality of feature points" here is, for example, the largest convex plane spanned by the plurality of feature points, or the smallest convex plane containing all of the plurality of feature points. The second feature amount calculation unit 202 outputs the calculated feature amount D62 to the reliability calculation unit 22.
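An illustrative sketch of the second feature amount D62 (the choice of the |ratio - 1| variant, the function names, and the epsilon guard are assumptions; the convex hull vertices are assumed to be given in order):

```python
def polygon_area(points):
    """Shoelace area of a polygon given as vertices in order."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def feature_area_ratio(hull_points, object_area_px, eps=1e-9):
    """Feature amount D62: large when the convex region spanned by the
    feature points covers the object region well (area ratio near 1)."""
    ratio = polygon_area(hull_points) / object_area_px
    return 1.0 / (abs(ratio - 1.0) + eps)

# Feature points spanning a 10x10 square over a 100-pixel object region
# give an area ratio of exactly 1, hence a very large D62.
d62 = feature_area_ratio([(0, 0), (10, 0), (10, 10), (0, 10)], 100.0)
```

When the estimated feature points collapse onto a small part of the object (ratio far from 1), D62 drops, flagging a likely degraded pose estimate.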

Each feature amount calculation unit from the third feature amount calculation unit (not shown) to the m-th feature amount calculation unit 20m calculates a feature amount (feature amounts D63 to D6m) computable from at least one of the following items of information: the camera image D2 selected by the camera selection unit 40, the estimated positions D3 of the plurality of feature points estimated by the feature point estimation unit 10, the variance D4 of the candidate point cloud of the feature points, and the object region D5. Each of these feature amount calculation units outputs its calculated feature amount to the reliability calculation unit 22.

The reliability calculation unit 22 calculates the reliability of the estimated position of the object based on the feature amounts calculated by the feature amount calculation unit 21. More specifically, in the first embodiment, the reliability calculation unit 22 calculates, based on the plurality of feature amounts (D61 to D6m) calculated by the feature amount calculation unit 21, a reliability D7 that correlates with the estimation error of the estimated position of the object (for example, a reliability D7 having a positive correlation with the estimation error).

In more detail, in the first embodiment, the storage device 3 stores a neural network model that calculates the reliability from the feature amounts. The reliability calculation unit 22 applies the neural network model stored in the storage device 3 to the plurality of feature amounts (D61 to D6m) calculated by the feature amount calculation unit 21, thereby calculating a reliability D7 inversely proportional to the estimation error of the estimated position of the object. The reliability calculation unit 22 outputs the calculated reliability D7 to the camera selection unit 40 and to the outside of the position estimation device 1.
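The mapping from the feature vector (D61 ... D6m) to the reliability D7 can be sketched as a one-layer network; the weights and bias below are placeholders, not the trained model stored in the storage device 3:

```python
import math

def reliability_from_features(features, weights, bias):
    """One-layer model: D7 = sigmoid(w . f + b), so D7 lies in (0, 1)."""
    z = sum(w * f for w, f in zip(weights, features)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# With positive weights, larger feature values (tight candidate clouds,
# good area coverage) produce a reliability closer to 1.
d7_good = reliability_from_features([5.0, 4.0], [1.0, 1.0], -2.0)
d7_bad = reliability_from_features([0.1, 0.2], [1.0, 1.0], -2.0)
```

A real deployment would replace this placeholder with the stored network, trained so that D7 tracks the inverse of the observed position estimation error.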

The camera selection unit 40 described above selects at least one camera image D2 from among the N camera images D11 to D1N based on the reliability D7 calculated by the reliability calculation unit 22. The camera selection unit 40 outputs the camera image D2 selected based on the reliability D7 to the feature point estimation unit 10 and the feature amount calculation unit 21.

The position calculation unit 30 calculates the estimated position D8 of the object based on the feature point information estimated by the feature point estimation unit 10. More specifically, in the first embodiment, the storage device 3 stores the feature point positions of the object. The position calculation unit 30 calculates the estimated position D8 of the object by solving a PnP (Perspective-n-Point) problem based on the estimated positions D3 of the plurality of feature points estimated by the feature point estimation unit 10 and the feature point positions of the object stored in the storage device 3. An example of the estimated position D8 calculated by the position calculation unit 30 is the three-dimensional position (three-dimensional pose) of the object. The position calculation unit 30 outputs the calculated estimated position D8 to the outside of the position estimation device 1.
The reliability D7 output by the reliability calculation unit 22 to the outside of the position estimation device 1 is used, for example, to evaluate the estimated position D8 output by the position calculation unit 30 to the outside of the position estimation device 1.

The operation of the position estimation device 1 according to the first embodiment is described below with reference to the drawings. FIG. 2 is a flowchart showing a reliability estimation method performed by the position estimation device 1 according to the first embodiment. Before the steps described below, it is assumed that each of the N cameras 2 has acquired a camera image by photographing the object and has output the acquired camera image to the camera selection unit 40.

As shown in FIG. 2, the camera selection unit 40 selects at least one camera image D2 from among the plurality of camera images (step ST1). The camera selection unit 40 outputs the selected camera image D2 to the feature point estimation unit 10 and the feature amount calculation unit 21.

In step ST1, if the reliability D7 has not yet been calculated by step ST4 described later, the camera selection unit 40 selects an arbitrary camera image D2 from among the plurality of camera images. If the reliability D7 has already been calculated by step ST4, the camera selection unit 40 selects at least one camera image D2 from among the N camera images D11 to D1N based on that reliability D7.

Next, the feature point estimation unit 10 estimates, as feature point information, the estimated positions D3 of the plurality of feature points, the variance D4 of the candidate point cloud of the feature points, and the object region D5, based on the camera image D2 selected by the camera selection unit 40 (step ST2). The feature point estimation unit 10 outputs the estimated positions D3, the variance D4, and the object region D5 to the feature amount calculation unit 21.

Next, the feature amount calculation unit 21 calculates the feature amounts D61 to D6m of the object based on at least one of the following items of information: the camera image D2 selected by the camera selection unit 40, the estimated positions D3 of the plurality of feature points estimated by the feature point estimation unit 10, the variance D4 of the candidate point cloud of the feature points, and the object region D5 (step ST3). The feature amount calculation unit 21 outputs the calculated feature amounts D61 to D6m to the reliability calculation unit 22.

Next, the reliability calculation unit 22 calculates the reliability D7, which correlates with the estimation error of the estimated position of the object, based on the feature amounts D61 to D6m calculated by the feature amount calculation unit 21 (step ST4). Although not shown, the position estimation device 1 repeatedly executes the series of steps from step ST1 to step ST4.

The details of step ST3 are described below. FIG. 3 is a flowchart showing the details of the feature amount calculation method in step ST3.
As shown in FIG. 3, the first feature amount calculation unit 201 calculates the feature amount D61 of the object, proportional to the reciprocal of the variance D4 of the candidate point cloud of the feature points estimated by the feature point estimation unit 10 (step ST31). The first feature amount calculation unit 201 outputs the calculated feature amount D61 to the reliability calculation unit 22.

The second feature amount calculation unit 202 calculates the area ratio between the convex plane formed by the plurality of feature points and the object region, based on the estimated positions D3 of the plurality of feature points and the object region D5 estimated by the feature point estimation unit 10, and then calculates a feature amount D62 proportional to the reciprocal of the calculated area ratio, or proportional to the reciprocal of the difference between the calculated area ratio and 1 (step ST32). The second feature amount calculation unit 202 outputs the calculated feature amount D62 to the reliability calculation unit 22.

Each feature amount calculation unit from the third feature amount calculation unit to the m-th feature amount calculation unit 20m calculates the remaining feature amounts D63 to D6m based on at least one of the following items of information: the camera image D2 selected by the camera selection unit 40, the estimated positions D3 of the plurality of feature points estimated by the feature point estimation unit 10, the variance D4 of the candidate point cloud of the feature points, and the object region D5 (step ST33). Each of these feature amount calculation units outputs its calculated feature amount to the reliability calculation unit 22.

A specific example of the camera selection method (step ST1) in the reliability estimation method performed by the position estimation device 1 according to the first embodiment is described in detail below. The step ST1 described below is assumed to be performed again after the series of steps from step ST1 to step ST4 has been performed at least once.

In this specific example, in step ST1 the camera selection unit 40 selects at least one camera image D2 from among the N camera images D11 to D1N based on the reliability D7 calculated by the reliability calculation unit 22 in step ST4 described above. More specifically, in this example the camera selection unit 40 selects, based on the N camera images D11 to D1N and the reliability D7 calculated by the reliability calculation unit 22, one of two modes: a mode in which subsequent position estimation is performed using one camera image, or a mode in which position estimation is performed using all of the camera images in turn.

In more detail, in this specific example, if the reliability D7 calculated by the reliability calculation unit 22 is equal to or greater than a threshold, the camera selection unit 40 selects in step ST1 the camera image D2 from the camera 2 corresponding to the previously selected camera image. In other words, when the reliability D7 is equal to or greater than the threshold, the camera selection unit 40 selects the camera image D2 from the camera 2 that captured the previously selected camera image. In that case, in step ST2 described above, the feature point estimation unit 10 estimates the feature point information based on the camera image D2 selected by the camera selection unit 40.

On the other hand, if the reliability calculated by the reliability calculation unit 22 is less than the threshold, the camera selection unit 40 selects all N camera images D11 to D1N in step ST1. Then, in step ST2 described above, when the camera selection unit 40 has selected all N camera images D11 to D1N in step ST1, the feature point estimation unit 10 estimates the feature point information for each camera image.

Next, in step ST3 described above, the feature amount calculation unit 21 calculates the feature amounts of the object for each camera image, based on the feature point information estimated for each camera image by the feature point estimation unit 10 in step ST2.
Next, in step ST4 described above, the reliability calculation unit 22 calculates the reliability of the estimated position of the object for each camera image, based on the feature amounts calculated for each camera image by the feature amount calculation unit 21.

Next, the position estimation device 1 returns to step ST1 described above. When the reliability calculation unit 22 has calculated a reliability for each camera image, the camera selection unit 40 selects at least one camera image D2 from among the N camera images D11 to D1N based on the highest of the per-image reliabilities calculated by the reliability calculation unit 22. More specifically, for example, the camera selection unit 40 selects the camera image D2 captured by the camera 2 corresponding to the highest of the per-image reliabilities calculated by the reliability calculation unit 22.
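The camera selection rule of this specific example can be sketched as follows (function and variable names are assumptions introduced for illustration):

```python
def select_camera(prev_index, d7, threshold, per_camera_reliability):
    """Return the index of the camera to use for the next estimation cycle.

    prev_index: camera used in the previous cycle.
    d7: reliability computed for the previous cycle's camera image.
    per_camera_reliability: per-image reliabilities computed for each of the
    N camera images when a full re-evaluation is required.
    """
    if d7 >= threshold:
        return prev_index  # keep the previously selected camera
    # Reliability dropped below the threshold: evaluate every camera image
    # and switch to the one with the highest per-image reliability.
    return max(range(len(per_camera_reliability)),
               key=lambda i: per_camera_reliability[i])

# Reliability above the threshold: camera 2 is kept.
kept = select_camera(2, 0.9, 0.5, [0.3, 0.8, 0.4])
# Reliability below the threshold: camera 1 (reliability 0.8) is chosen.
switched = select_camera(2, 0.2, 0.5, [0.3, 0.8, 0.4])
```

This keeps the per-frame cost at a single camera while reliability stays high, and only pays for evaluating all N cameras when the current view degrades.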

The functions of the camera selection unit 40, the feature point estimation unit 10, the feature amount calculation unit 21 and the reliability calculation unit 22 of the reliability estimation device 20, and the position calculation unit 30 in the position estimation device 1 are implemented by a processing circuit. That is, the position estimation device 1 includes a processing circuit for executing the processing of each step shown in FIGS. 2 and 3. This processing circuit may be dedicated hardware, or it may be a CPU (Central Processing Unit) that executes a program stored in a memory.

FIG. 4A is a block diagram showing a hardware configuration that implements the functions of the position estimation device 1. FIG. 4B is a block diagram showing a hardware configuration that executes software implementing the functions of the position estimation device 1.

When the processing circuit is the dedicated hardware processing circuit 50 shown in FIG. 4A, the processing circuit 50 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.

The functions of the camera selection unit 40, the feature point estimation unit 10, the feature amount calculation unit 21 and the reliability calculation unit 22 of the reliability estimation device 20, and the position calculation unit 30 in the position estimation device 1 may be implemented by separate processing circuits, or these functions may be implemented together by a single processing circuit.

When the processing circuit is the processor 51 shown in FIG. 4B, the functions of the camera selection unit 40, the feature point estimation unit 10, the feature amount calculation unit 21 and the reliability calculation unit 22 of the reliability estimation device 20, and the position calculation unit 30 in the position estimation device 1 are implemented by software, firmware, or a combination of software and firmware.
The software or firmware is written as a program and stored in the memory 52.

The processor 51 implements the functions of the camera selection unit 40, the feature point estimation unit 10, the feature amount calculation unit 21 and the reliability calculation unit 22 of the reliability estimation device 20, and the position calculation unit 30 in the position estimation device 1 by reading and executing the program stored in the memory 52. That is, the position estimation device 1 includes the memory 52 for storing a program that, when these functions are executed by the processor 51, results in the execution of the processing of each step shown in FIGS. 2 and 3.

These programs cause a computer to execute the procedures or methods of the camera selection unit 40, the feature point estimation unit 10, the feature amount calculation unit 21 and the reliability calculation unit 22 of the reliability estimation device 20, and the position calculation unit 30 in the position estimation device 1. The memory 52 may be a computer-readable storage medium storing a program for causing a computer to function as the camera selection unit 40, the feature point estimation unit 10, the feature amount calculation unit 21 and the reliability calculation unit 22 of the reliability estimation device 20, and the position calculation unit 30 in the position estimation device 1.

The processor 51 corresponds to, for example, a CPU (Central Processing Unit), a processing device, an arithmetic device, a processor, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).

The memory 52 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory); a magnetic disk such as a hard disk or a flexible disk; or an optical disc such as a compact disc, a MiniDisc, a CD (Compact Disc), or a DVD (Digital Versatile Disc).

Some of the functions of the camera selection unit 40, the feature point estimation unit 10, the feature amount calculation unit 21 and the reliability calculation unit 22 of the reliability estimation device 20, and the position calculation unit 30 in the position estimation device 1 may be implemented by dedicated hardware, and some by software or firmware.

For example, the functions of the camera selection unit 40, the feature point estimation unit 10, and the position calculation unit 30 may be implemented by a processing circuit as dedicated hardware, while the functions of the feature amount calculation unit 21 and the reliability calculation unit 22 of the reliability estimation device 20 may be implemented by the processor 51 reading and executing the program stored in the memory 52.
In this way, the processing circuit can implement each of the above functions by hardware, software, firmware, or a combination of these.

As described above, the reliability estimation device 20 according to the first embodiment is a reliability estimation device 20 that estimates the reliability of the estimated position of an object estimated based on a camera image, and includes the feature amount calculation unit 21, which calculates feature amounts of the object based on feature point information concerning a plurality of feature points of the object in the camera image, and the reliability calculation unit 22, which calculates the reliability of the estimated position of the object based on the feature amounts calculated by the feature amount calculation unit 21.

With the above configuration, the reliability of the estimated position of the object is calculated based on feature amounts of the object calculated from the feature point information. Because the trustworthiness of the reliability itself then does not depend on the trustworthiness of the estimated position, the trustworthiness of the reliability of the estimated position can be improved compared with the case where the reliability is calculated directly from the estimated position. Therefore, even when an image for which the position is difficult to estimate is input, the trustworthiness of the reliability of the estimated position itself can be improved.

In the reliability estimation device 20 according to the first embodiment, the feature point information includes the variance of the candidate point cloud of the feature points, and the feature amount calculation unit 21 calculates a feature amount of the object based on the variance of the candidate point cloud of the feature points.
With the above configuration, the reliability of the estimated position of the object is calculated based on a feature amount calculated from the variance of the candidate point cloud of the feature points. This suitably improves the trustworthiness of the reliability of the estimated position itself.

In the reliability estimation device 20 according to the first embodiment, the feature amount calculation unit 21 calculates a feature amount proportional to the reciprocal of the variance.
With the above configuration, the reliability of the estimated position of the object is calculated based on a feature amount proportional to the reciprocal of the variance. This suitably improves the trustworthiness of the reliability of the estimated position itself.

In the reliability estimation device 20 according to the first embodiment, the feature point information includes the estimated positions of the plurality of feature points and the object region, which is the region occupied by the object in the camera image, and the feature amount calculation unit 21 calculates, based on the estimated positions of the plurality of feature points and the object region, a feature amount given by a function that takes as inputs the region formed by the plurality of feature points and the object region.

With the above configuration, the reliability of the estimated position of the object is calculated based on a feature amount given by a function that takes as inputs the region formed by the plurality of feature points and the object region. This suitably improves the trustworthiness of the reliability of the estimated position itself.

In the reliability estimation device 20 according to the first embodiment, the feature point information includes the estimated positions of the plurality of feature points and the object region, which is the region occupied by the object in the camera image, and the feature amount calculation unit 21 calculates, based on the estimated positions of the plurality of feature points and the object region, the area ratio between a convex plane formed by the plurality of feature points and the object region, and then calculates a feature amount proportional to the reciprocal of the calculated area ratio, or proportional to the reciprocal of the difference between the calculated area ratio and 1.

With the above configuration, the reliability of the estimated position of the object is calculated based on a feature amount proportional to the reciprocal of the area ratio calculated from the estimated positions of the plurality of feature points and the object region, or proportional to the reciprocal of the difference between that area ratio and 1. This suitably improves the trustworthiness of the reliability of the estimated position itself.

In the reliability estimation device 20 according to the first embodiment, the feature amount calculation unit 21 calculates a plurality of feature amounts of the object, and the reliability calculation unit 22 calculates, based on the plurality of feature amounts calculated by the feature amount calculation unit 21, a reliability that correlates with the estimation error of the estimated position of the object.
With the above configuration, the estimated position of the object can be evaluated using a reliability that correlates with the estimation error of the estimated position of the object.

 実施の形態1に係る位置推定装置1は、実施の形態1に係る信頼度推定装置20と、カメラ画像に基づいて、特徴点情報を推定する特徴点推定部10と、特徴点推定部10が推定した特徴点情報に基づいて、物体の推定位置を算出する位置算出部30と、を備え、特徴量算出部21は、特徴点推定部10が推定した特徴点情報に基づいて、物体の特徴量を算出する。 The position estimation device 1 according to the first embodiment includes the reliability estimation device 20 according to the first embodiment, a feature point estimation unit 10 that estimates the feature point information based on the camera image, and a position calculation unit 30 that calculates the estimated position of the object based on the feature point information estimated by the feature point estimation unit 10. The feature amount calculation unit 21 calculates the feature amount of the object based on the feature point information estimated by the feature point estimation unit 10.

 上記の構成によれば、実施の形態1に係る信頼度推定装置20が奏する上述の各効果を位置推定装置1において実現することができる。例えば、実施の形態1に係る位置推定装置1は、カメラ画像による物体の位置推定に適用可能なものである。例えば、自動炎天車両又はロボットアーム等のFA機器などに利用することが可能である。 According to the above configuration, each of the above-described effects of the reliability estimation device 20 according to the first embodiment can be realized in the position estimation device 1. For example, the position estimation device 1 according to the first embodiment is applicable to position estimation of an object from a camera image, and can be used in, for example, autonomous vehicles or FA (factory automation) equipment such as robot arms.

 例えば、ロボットアームにより物体の把持を行う場合、当該物体の位置検出が必要となる。そこで、アームに取り付けた単眼カメラなどで撮影した画像を上述のカメラ画像として実施の形態1に係る信頼度推定装置20に適用することにより、撮影条件の悪い位置推定困難な画像が入力された場合でも、信頼性が高い上述の信頼度を算出することができ、当該信頼度に基づいて、推定位置の評価を行うことができる。よって、安定したロボット制御を実現することが可能となる。 For example, when an object is to be grasped by a robot arm, the position of the object must be detected. By applying images captured by, for example, a monocular camera mounted on the arm to the reliability estimation device 20 according to the first embodiment as the above-described camera images, the above-described highly reliable reliability can be calculated even when an image captured under poor conditions, for which position estimation is difficult, is input, and the estimated position can be evaluated based on that reliability. Stable robot control can therefore be realized.

 実施の形態1に係る位置推定装置1は、信頼度算出部22が算出した信頼度に基づいて、複数のカメラ画像のうちから少なくとも1つのカメラ画像を選択するカメラ選択部40をさらに備え、特徴点推定部10は、カメラ選択部40が選択したカメラ画像に基づいて、特徴点情報を推定する。 The position estimation device 1 according to the first embodiment further includes a camera selection unit 40 that selects at least one camera image from among a plurality of camera images based on the reliability calculated by the reliability calculation unit 22, and the feature point estimation unit 10 estimates the feature point information based on the camera image selected by the camera selection unit 40.

 上記の構成によれば、推定位置の信頼度に基づいて選択したカメラ画像から特徴点情報を推定する。これにより、推定した特徴点情報の信頼性を向上させることができる。よって、特徴点情報に基づく推定位置及び信頼度の各信頼性を向上させることができる。 According to the above configuration, feature point information is estimated from the camera image selected based on the reliability of the estimated position. This makes it possible to improve the reliability of the estimated feature point information. Therefore, it is possible to improve the reliability of the estimated position and the reliability based on the feature point information.

 実施の形態1に係る位置推定装置1におけるカメラ選択部40は、信頼度算出部22が算出した信頼度が閾値以上の場合、前回選択したカメラ画像に対応するカメラ2からのカメラ画像を選択し、信頼度算出部22が算出した信頼度が閾値未満の場合、複数のカメラ画像を全て選択する。 When the reliability calculated by the reliability calculation unit 22 is equal to or greater than the threshold value, the camera selection unit 40 in the position estimation device 1 according to the first embodiment selects a camera image from the camera 2 corresponding to the previously selected camera image. If the reliability calculated by the reliability calculation unit 22 is less than the threshold value, all the plurality of camera images are selected.

 上記の構成によれば、推定位置の推定に適したカメラ画像を好適に選択することができる。よって、当該カメラ画像に基づく特徴点情報、推定位置及び信頼度の各信頼性を向上させることができる。 According to the above configuration, a camera image suitable for estimating the estimated position can be suitably selected. Therefore, it is possible to improve the reliability of the feature point information, the estimated position, and the reliability based on the camera image.
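The threshold-based selection rule above can be sketched as follows; the function and parameter names are illustrative assumptions, not terms from the disclosure:

```python
def select_cameras(reliability, threshold, prev_camera_id, all_camera_ids):
    """Camera selection rule: keep using the previously selected camera
    while its reliability stays at or above the threshold; otherwise
    fall back to all cameras so that each one can be re-evaluated."""
    if reliability >= threshold:
        return [prev_camera_id]
    return list(all_camera_ids)
```

For example, with a threshold of 0.5, a reliability of 0.9 keeps the previous camera, while 0.3 triggers re-evaluation of every camera.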

 実施の形態1に係る位置推定装置1における特徴点推定部10は、カメラ選択部40が複数のカメラ画像を全て選択した場合、カメラ画像毎に、特徴点情報を推定し、特徴量算出部21は、特徴点推定部10がカメラ画像毎に推定した特徴点情報に基づいて、カメラ画像毎に、物体の特徴量を算出し、信頼度算出部22は、特徴量算出部21がカメラ画像毎に算出した特徴量に基づいて、カメラ画像毎に、物体の推定位置の信頼度を算出し、カメラ選択部40は、信頼度算出部22がカメラ画像毎に信頼度を算出した場合、信頼度算出部22が算出したカメラ画像毎の信頼度のうちの最大の信頼度に基づいて、複数のカメラ画像のうちから少なくとも1つのカメラ画像を選択する。 When the camera selection unit 40 has selected all of the plurality of camera images, the feature point estimation unit 10 in the position estimation device 1 according to the first embodiment estimates the feature point information for each camera image; the feature amount calculation unit 21 calculates the feature amount of the object for each camera image based on the feature point information estimated by the feature point estimation unit 10 for each camera image; and the reliability calculation unit 22 calculates the reliability of the estimated position of the object for each camera image based on the feature amount calculated by the feature amount calculation unit 21 for each camera image. When the reliability calculation unit 22 has calculated the reliability for each camera image, the camera selection unit 40 selects at least one camera image from among the plurality of camera images based on the maximum of the per-image reliabilities calculated by the reliability calculation unit 22.

 上記の構成によれば、前回選択したカメラ画像が推定位置の推定に適していない場合でも、次回では、推定位置の推定に適したカメラ画像を選ぶことができる。よって、当該カメラ画像に基づく特徴点情報、推定位置及び信頼度の各信頼性を向上させることができる。 According to the above configuration, even if the previously selected camera image is not suitable for estimating the estimated position, a camera image suitable for the estimation can be selected the next time. Therefore, the reliability of each of the feature point information, the estimated position, and the reliability based on that camera image can be improved.
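The fallback step, picking the camera whose image yielded the maximum per-image reliability, could be sketched as below; the mapping-based interface is an illustrative assumption:

```python
def select_best_camera(reliability_per_camera):
    """Pick the camera whose image produced the maximum reliability.

    reliability_per_camera: mapping of camera_id -> reliability computed
    for that camera's image (names are illustrative assumptions).
    """
    return max(reliability_per_camera, key=reliability_per_camera.get)
```

For instance, with reliabilities {0: 0.2, 1: 0.8, 2: 0.5}, camera 1 would be selected for the next estimation cycle.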

 実施の形態1に係る信頼度推定方法は、カメラ画像に基づいて推定された物体の推定位置の信頼度を推定する信頼度推定方法であって、カメラ画像における物体の複数の特徴点に関する特徴点情報に基づいて、物体の特徴量を算出する特徴量算出ステップと、特徴量算出ステップで算出した特徴量に基づいて、信頼度を算出する信頼度算出ステップと、を含む。
 上記の構成によれば、実施の形態1に係る信頼度推定装置20が奏する上述の効果と同様の効果を奏する。
 なお、実施の形態の任意の構成要素の変形、もしくは実施の形態の任意の構成要素の省略が可能である。
The reliability estimation method according to the first embodiment is a method for estimating the reliability of an estimated position of an object estimated based on a camera image, and includes a feature amount calculation step of calculating a feature amount of the object based on feature point information regarding a plurality of feature points of the object in the camera image, and a reliability calculation step of calculating the reliability based on the feature amount calculated in the feature amount calculation step.
According to the above configuration, the same effects as those of the reliability estimation device 20 according to the first embodiment are obtained.
Any component of the embodiments may be modified or omitted.

 本開示に係る信頼度推定装置は、推定位置の信頼度自体の信頼性を向上させることができるため、位置推定装置に利用可能である。 Since the reliability estimation device according to the present disclosure can improve the reliability of the estimated-position reliability itself, it can be used in a position estimation device.

 1 位置推定装置、2 カメラ、3 記憶装置、10 特徴点推定部、20 信頼度推定装置、21 特徴量算出部、22 信頼度算出部、30 位置算出部、40 カメラ選択部、50 処理回路、51 プロセッサ、52 メモリ、100 位置推定システム、201 第1の特徴量算出部、202 第2の特徴量算出部、20m 第mの特徴量算出部。 1 position estimation device, 2 camera, 3 storage device, 10 feature point estimation unit, 20 reliability estimation device, 21 feature amount calculation unit, 22 reliability calculation unit, 30 position calculation unit, 40 camera selection unit, 50 processing circuit, 51 processor, 52 memory, 100 position estimation system, 201 first feature amount calculation unit, 202 second feature amount calculation unit, 20m m-th feature amount calculation unit.

Claims (10)

 カメラ画像に基づいて推定された物体の推定位置の信頼度を推定する信頼度推定装置であって、
 前記カメラ画像における前記物体の複数の特徴点に関する特徴点情報に基づいて、前記物体の特徴量を算出する特徴量算出部と、
 前記特徴量算出部が算出した特徴量に基づいて、前記信頼度を算出する信頼度算出部と、を備えていることを特徴とする、信頼度推定装置。
A reliability estimation device for estimating reliability of an estimated position of an object estimated based on a camera image, the device comprising:
a feature amount calculation unit that calculates a feature amount of the object based on feature point information regarding a plurality of feature points of the object in the camera image; and
a reliability calculation unit that calculates the reliability based on the feature amount calculated by the feature amount calculation unit.
 前記特徴点情報は、前記特徴点の候補点群の分散を含み、
 前記特徴量算出部は、前記特徴点の候補点群の分散に基づいて、前記特徴量を算出することを特徴とする、請求項1に記載の信頼度推定装置。
The reliability estimation device according to claim 1, wherein the feature point information includes a variance of a candidate point group of the feature points, and
the feature amount calculation unit calculates the feature amount based on the variance of the candidate point group of the feature points.
 前記特徴量算出部は、前記分散の逆数に比例する特徴量を算出することを特徴とする、請求項2に記載の信頼度推定装置。 The reliability estimation device according to claim 2, wherein the feature amount calculation unit calculates a feature amount proportional to the reciprocal of the variance.  前記特徴点情報は、複数の特徴点の各推定位置、及び前記カメラ画像において前記物体が占める領域である物体領域を含み、
 前記特徴量算出部は、前記複数の特徴点の各推定位置、及び前記物体領域に基づいて、前記複数の特徴点から構成される領域と当該物体領域とを入力にとる関数となる特徴量を算出することを特徴とする、請求項1に記載の信頼度推定装置。
The reliability estimation device according to claim 1, wherein the feature point information includes estimated positions of a plurality of feature points and an object region that is a region occupied by the object in the camera image, and
the feature amount calculation unit calculates, based on the estimated positions of the plurality of feature points and the object region, a feature amount given by a function that takes as inputs a region formed by the plurality of feature points and the object region.
 前記特徴量算出部は、前記特徴量を複数算出し、
 前記信頼度算出部は、前記特徴量算出部が複数算出した特徴量に基づいて、前記物体の推定位置の推定誤差に相関する信頼度を算出することを特徴とする、請求項1に記載の信頼度推定装置。
The reliability estimation device according to claim 1, wherein the feature amount calculation unit calculates a plurality of the feature amounts, and
the reliability calculation unit calculates, based on the plurality of feature amounts calculated by the feature amount calculation unit, a reliability that correlates with an estimation error of the estimated position of the object.
 請求項1に記載の信頼度推定装置と、
 前記カメラ画像に基づいて、前記特徴点情報を推定する特徴点推定部と、
 前記特徴点推定部が推定した特徴点情報に基づいて、前記物体の推定位置を算出する位置算出部と、を備え、
 前記特徴量算出部は、前記特徴点推定部が推定した特徴点情報に基づいて、前記特徴量を算出することを特徴とする、位置推定装置。
A position estimation device comprising:
the reliability estimation device according to claim 1;
a feature point estimation unit that estimates the feature point information based on the camera image; and
a position calculation unit that calculates the estimated position of the object based on the feature point information estimated by the feature point estimation unit,
wherein the feature amount calculation unit calculates the feature amount based on the feature point information estimated by the feature point estimation unit.
 前記信頼度算出部が算出した信頼度に基づいて、複数のカメラ画像のうちから少なくとも1つのカメラ画像を選択するカメラ選択部をさらに備え、
 前記特徴点推定部は、前記カメラ選択部が選択したカメラ画像に基づいて、前記特徴点情報を推定することを特徴とする、請求項6に記載の位置推定装置。
The position estimation device according to claim 6, further comprising a camera selection unit that selects at least one camera image from among a plurality of camera images based on the reliability calculated by the reliability calculation unit,
wherein the feature point estimation unit estimates the feature point information based on the camera image selected by the camera selection unit.
 前記カメラ選択部は、
  前記信頼度算出部が算出した信頼度が閾値以上の場合、前回選択したカメラ画像に対応するカメラからのカメラ画像を選択し、
  前記信頼度算出部が算出した信頼度が閾値未満の場合、前記複数のカメラ画像を全て選択することを特徴とする、請求項7に記載の位置推定装置。
The position estimation device according to claim 7, wherein the camera selection unit
selects the camera image from the camera corresponding to the previously selected camera image when the reliability calculated by the reliability calculation unit is equal to or greater than a threshold value, and
selects all of the plurality of camera images when the reliability calculated by the reliability calculation unit is less than the threshold value.
 前記特徴点推定部は、前記カメラ選択部が前記複数のカメラ画像を全て選択した場合、カメラ画像毎に、前記特徴点情報を推定し、
 前記特徴量算出部は、前記特徴点推定部がカメラ画像毎に推定した特徴点情報に基づいて、カメラ画像毎に、前記特徴量を算出し、
 前記信頼度算出部は、前記特徴量算出部がカメラ画像毎に算出した特徴量に基づいて、カメラ画像毎に、前記信頼度を算出し、
 前記カメラ選択部は、前記信頼度算出部がカメラ画像毎に前記信頼度を算出した場合、前記信頼度算出部が算出したカメラ画像毎の信頼度のうちの最大の信頼度に基づいて、前記複数のカメラ画像のうちから少なくとも1つのカメラ画像を選択することを特徴とする、請求項8に記載の位置推定装置。
The position estimation device according to claim 8, wherein, when the camera selection unit has selected all of the plurality of camera images, the feature point estimation unit estimates the feature point information for each camera image,
the feature amount calculation unit calculates the feature amount for each camera image based on the feature point information estimated by the feature point estimation unit for each camera image,
the reliability calculation unit calculates the reliability for each camera image based on the feature amount calculated by the feature amount calculation unit for each camera image, and
when the reliability calculation unit has calculated the reliability for each camera image, the camera selection unit selects at least one camera image from among the plurality of camera images based on the maximum of the per-image reliabilities calculated by the reliability calculation unit.
 カメラ画像に基づいて推定された物体の推定位置の信頼度を推定する信頼度推定方法であって、
 前記カメラ画像における前記物体の複数の特徴点に関する特徴点情報に基づいて、前記物体の特徴量を算出する特徴量算出ステップと、
 前記特徴量算出ステップで算出した特徴量に基づいて、前記信頼度を算出する信頼度算出ステップと、を含むことを特徴とする、信頼度推定方法。
A reliability estimation method for estimating reliability of an estimated position of an object estimated based on a camera image, the method comprising:
a feature amount calculation step of calculating a feature amount of the object based on feature point information regarding a plurality of feature points of the object in the camera image; and
a reliability calculation step of calculating the reliability based on the feature amount calculated in the feature amount calculation step.
PCT/JP2020/046360 2020-12-11 2020-12-11 Reliability estimation device, position estimation device, and reliability estimation method Ceased WO2022123786A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN202080107744.7A CN116547714B (en) 2020-12-11 2020-12-11 Reliability estimation device, position estimation device, and reliability estimation method
JP2022568023A JP7221469B2 (en) 2020-12-11 2020-12-11 Reliability estimation device, position estimation device, and reliability estimation method
DE112020007687.3T DE112020007687T5 (en) 2020-12-11 2020-12-11 RELIABILITY DETERMINATION DEVICE, POSITION DETERMINATION DEVICE AND RELIABILITY DEGREE DETERMINATION METHOD
PCT/JP2020/046360 WO2022123786A1 (en) 2020-12-11 2020-12-11 Reliability estimation device, position estimation device, and reliability estimation method
TW110114658A TW202232433A (en) 2020-12-11 2021-04-23 Reliability estimating device, position estimating device, and reliability estimating method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/046360 WO2022123786A1 (en) 2020-12-11 2020-12-11 Reliability estimation device, position estimation device, and reliability estimation method

Publications (1)

Publication Number Publication Date
WO2022123786A1 true WO2022123786A1 (en) 2022-06-16

Family

ID=81974299

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/046360 Ceased WO2022123786A1 (en) 2020-12-11 2020-12-11 Reliability estimation device, position estimation device, and reliability estimation method

Country Status (5)

Country Link
JP (1) JP7221469B2 (en)
CN (1) CN116547714B (en)
DE (1) DE112020007687T5 (en)
TW (1) TW202232433A (en)
WO (1) WO2022123786A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024127824A1 (en) * 2022-12-12 2024-06-20 ソニーグループ株式会社 Information processing device, information processing method, and program

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008186246A (en) * 2007-01-30 2008-08-14 Aisin Seiki Co Ltd Moving object recognition device
JP2008310775A (en) * 2007-06-18 2008-12-25 Canon Inc Facial expression recognition apparatus and method, and imaging apparatus

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5057183B2 (en) * 2010-03-31 2012-10-24 アイシン・エィ・ダブリュ株式会社 Reference data generation system and position positioning system for landscape matching
WO2013054499A1 (en) * 2011-10-11 2013-04-18 パナソニック株式会社 Image processing device, imaging device, and image processing method
JP6694234B2 (en) * 2015-01-23 2020-05-13 シャープ株式会社 Distance measuring device
JP6710190B2 (en) * 2017-09-29 2020-06-17 クラリオン株式会社 Marking line recognition device


Also Published As

Publication number Publication date
TW202232433A (en) 2022-08-16
DE112020007687T5 (en) 2023-09-07
JP7221469B2 (en) 2023-02-13
JPWO2022123786A1 (en) 2022-06-16
CN116547714B (en) 2024-12-31
CN116547714A (en) 2023-08-04

Similar Documents

Publication Publication Date Title
US8755630B2 (en) Object pose recognition apparatus and object pose recognition method using the same
US12165358B2 (en) Main subject determining apparatus, image capturing apparatus, main subject determining method, and storage medium
CN118196446A (en) Matching local image feature descriptors
JP2014098898A (en) Multi resolution depth-from-defocus based autofocus
CN114694005A (en) Target detection model training method and device, and target detection method and device
CN111507132A (en) A positioning method, device and equipment
CN114387197B (en) Binocular image processing method, binocular image processing device, binocular image processing equipment and storage medium
JP6395429B2 (en) Image processing apparatus, control method thereof, and storage medium
WO2022123786A1 (en) Reliability estimation device, position estimation device, and reliability estimation method
JP2020004179A (en) Image processing program, image processing apparatus, and image processing method
CN113674319B (en) Target tracking method, system, equipment and computer storage medium
US9621780B2 (en) Method and system of curve fitting for common focus measures
CN115546681A (en) A method and system for asynchronous feature tracking based on events and frames
CN109079786A (en) Mechanical arm grabs self-learning method and equipment
US20190205666A1 (en) Method for evaluating credibility of obstacle detection
CN113345026A (en) Camera parameter calibration method and device, storage medium and electronic equipment
CN114821407B (en) Equipment inspection method, processor and device based on feature point matching
KR101483549B1 (en) Method for Camera Location Estimation with Particle Generation and Filtering and Moving System using the same
CN113689422B (en) Image processing method and device and electronic equipment
WO2024009377A1 (en) Information processing device, self-position estimation method, and non-transitory computer-readable medium
CN112927268B (en) Corner point tracking method and device, computer equipment and readable storage medium
CN117474961A (en) Method, device, equipment and storage medium for reducing depth estimation model error
JP7107440B2 (en) LEARNING DATA GENERATION DEVICE, LEARNING DATA GENERATION METHOD, AND PROGRAM
TWI814500B (en) Method for reducing error of a depthe stimation model, device, equipment and storage media
JP2017034616A (en) Image processing apparatus and control method therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20965167

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022568023

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 202080107744.7

Country of ref document: CN

122 Ep: pct application non-entry in european phase

Ref document number: 20965167

Country of ref document: EP

Kind code of ref document: A1

WWG Wipo information: grant in national office

Ref document number: 202080107744.7

Country of ref document: CN