
JPH05303629A - Shape synthesis method - Google Patents

Shape synthesis method

Info

Publication number
JPH05303629A
JPH05303629A (application number JP3315874A)
Authority
JP
Japan
Prior art keywords
shape
partial
texture
value
shapes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP3315874A
Other languages
Japanese (ja)
Other versions
JP2953154B2 (en)
Inventor
Makoto Maruie
丸家誠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Priority to JP3315874A priority Critical patent/JP2953154B2/en
Publication of JPH05303629A publication Critical patent/JPH05303629A/en
Application granted granted Critical
Publication of JP2953154B2 publication Critical patent/JP2953154B2/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Links

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
  • Image Generation (AREA)
  • Image Analysis (AREA)

Abstract

(57) [Abstract]

[Purpose] To restore the shape of an entire object by capturing parallax images of the object from a plurality of viewpoints and synthesizing the partial shapes formed by stereo matching between the parallax images, using both shape and texture information.

[Configuration] A partial shape of the object and the texture on that partial shape are acquired by stereo matching between parallax images. Matching is performed against the texture on the neighboring partial shape to obtain the overlap width. The two partial shapes are synthesized by aligning the shapes of the overlapping portions.

Description

Detailed Description of the Invention

[0001]

[Field of the Invention] The present invention relates to a method for synthesizing three-dimensional partial shape data obtained by stereo matching between parallax images.

[0002]

[Prior Art] Optical three-dimensional measurement methods fall into two classes: active methods using a range finder, and passive methods typified by the stereo method (Onoe et al. (eds.), "Image Processing Handbook", Shokodo (1987)). In either class, a single measurement captures only part of the target object, so restoring the shape of the whole object requires synthesizing partial shapes. Shape synthesis requires the positional relationship between the measurement systems, and methods that compute it more or less automatically have been developed (Kawai et al., "3D Shape Reconstruction from Multi-Viewpoint Range Data", IEICE Technical Report on Pattern Recognition and Understanding, 91-19 (1991); Uemura and Masuda, "Estimation of Coordinate-System Displacement for Integrating Local 3D Spaces", IEICE Technical Report on Pattern Recognition and Understanding, 91-40 (1991)).

[0003]

[Problems to be Solved by the Invention] However, active methods generally require expensive equipment and have a limited measurement range. Passive methods do not share these drawbacks, but the methods described in the documents cited above compute the coordinate-system displacement from correspondences between line segments, so they cannot be applied to objects composed of curved surfaces, objects from which line segments cannot be clearly extracted, or objects with complicated textures.

[0004] An object of the present invention is to provide a shape synthesis method based on the stereo method that is effective even for objects composed of curved surfaces, objects from which line segments cannot be clearly extracted, and objects with complicated textures.

[0005]

[Means for Solving the Problems] The shape synthesis method of the first invention captures and inputs parallax images of an object from a plurality of viewpoints on the same plane using a stereo camera, and restores the three-dimensional shape of the object by synthesizing a plurality of partial shape data obtained by stereo matching between the parallax images. The method comprises a first step of detecting the overlap between the partial shapes using the texture on each partial shape, and a second step of synthesizing the partial shapes by joining the overlapping portions, thereby restoring the three-dimensional shape of the object.

[0006] In the shape synthesis method of the second invention, the first step of the first invention comprises: a step of extracting texture brightness values and coordinate values expressing the shape at regular intervals d along the partial shape; and a step of computing, while varying the overlap width of the textures on the two partial shapes in increments of d, the cross-correlation of the brightness values over the overlapping portion, judging the overlap width that maximizes the cross-correlation to be the actual overlap width, and thereby obtaining the correspondence between the extraction points of the two partial shapes.

[0007] In the shape synthesis method of the third invention, the first step of the first invention comprises: a step of extracting texture brightness values and coordinate values expressing the shape along the partial shape such that, based on the distance between the object and the camera and on the focal length of the camera lens, the extraction interval in real space becomes a constant value d; and a step of computing, while varying the overlap width of the textures on the two partial shapes in increments of d, the cross-correlation of the brightness values over the overlapping portion, judging the overlap width that maximizes the cross-correlation to be the actual overlap width, and thereby obtaining the correspondence between the extraction points of the two partial shapes.

[0008] In the shape synthesis method of the fourth invention, the second step of the first invention determines the coordinate-system movement parameters so as to minimize, over all corresponding extraction points of the two partial shapes, the sum of the squared distances between the coordinate values at the corresponding extraction points.

[0009]

[Embodiments] FIG. 1 shows the operation flow of the shape synthesis method of the first, second, and fourth inventions.

[0010] First, in step 11, a plurality of parallax images are captured. The parallax images are taken while keeping the distance between the stereo camera and the object approximately constant and keeping the cameras on the same horizontal plane (called the measurement plane). Pitch and roll of the cameras are prevented using a spirit level or the like. Furthermore, the optical axes of the two cameras forming the stereo camera are kept parallel, and the plane defined by the two optical axes is kept parallel to the measurement plane. FIG. 3 shows an example arrangement of the stereo cameras (horizontal sectional view).

[0011] In step 12, stereo matching between the parallax images is performed by the correlation method, restoring a partial shape and also acquiring the texture on the object surface. FIG. 4 shows the partial shapes restored in step 12 and the coordinate system (t, z) expressing them. The black dots in FIG. 4 indicate corresponding points between the parallax images; at each such point the coordinate values (t, z) and a brightness value are obtained. C1 in FIG. 4 is the partial shape obtained at shooting position 1 in FIG. 3, and C2 in FIG. 4 is the partial shape obtained at shooting position 2 in FIG. 3.

[0012] Next, in step 13, each partial shape is unrolled into a straight line, and the brightness values and the coordinate values (t, z) expressing the shape are extracted at predetermined intervals d by linear interpolation or the like. FIG. 5 shows C'1 and C'2, the results of processing C1 and C2.

[0013] In step 14, C'1 and C'2 are overlapped by a certain width, and the cross-correlation of the brightness values in the overlapping portion is computed. The cross-correlation is evaluated while varying the overlap width in increments of d, and the overlap width that maximizes the cross-correlation is judged to be the actual overlap width. This establishes the correspondence between the extraction points on C'1 and C'2.
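Step 14's overlap search can be sketched as follows, assuming the resampled brightness sequences are one-dimensional arrays sampled at interval d (the minimum-overlap guard and names are assumptions):

```python
import numpy as np

def find_overlap_width(b1, b2, min_overlap=3):
    """Slide the end of texture b1 over the start of texture b2 one sample
    (one interval d) at a time, and return the overlap length in samples
    with maximum normalized cross-correlation, plus that correlation."""
    b1 = np.asarray(b1, dtype=float)
    b2 = np.asarray(b2, dtype=float)
    best_k, best_r = None, -np.inf
    for k in range(min_overlap, min(len(b1), len(b2)) + 1):
        x, y = b1[-k:], b2[:k]            # overlapping brightness samples
        xc, yc = x - x.mean(), y - y.mean()
        denom = np.sqrt((xc ** 2).sum() * (yc ** 2).sum())
        if denom == 0:
            continue
        r = (xc * yc).sum() / denom       # normalized cross-correlation
        if r > best_r:
            best_r, best_k = r, k
    return best_k, best_r
```

The winning k pairs the last k extraction points of C'1 with the first k of C'2, which is exactly the correspondence passed to step 15.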

[0014] In step 15, the coordinate-system movement parameters (p, q, θ) are determined so as to minimize the sum Q of the squared distances between corresponding extraction points. Letting ai and bi be the corresponding extraction points of C'1 and C'2 respectively, Q is

[0015]

[Equation 1]

    Q = Σᵢ ‖ aᵢ − ( R(θ) bᵢ + (p, q)ᵀ ) ‖²

where R(θ) denotes rotation through the angle θ.

[0016] To obtain the p, q, and θ that minimize Q, the simultaneous equations obtained by differentiating Q with respect to p, q, and θ and setting the derivatives to zero are solved:

[0017]

[Equation 2]

    ∂Q/∂p = 0,  ∂Q/∂q = 0,  ∂Q/∂θ = 0
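This system admits a closed-form solution. The patent does not specify how it is solved, so the sketch below uses the standard two-dimensional least-squares rigid alignment (Procrustes) as an assumed solution technique, with illustrative names:

```python
import numpy as np

def fit_rigid_2d(a, b):
    """Find the rotation angle theta and translation (p, q) minimizing
    Q = sum_i || a_i - (R(theta) b_i + (p, q)) ||^2  in closed form."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    A, B = a - ca, b - cb                 # centered point sets
    # Stationarity in theta reduces to atan2 of the centered cross terms.
    num = (B[:, 0] * A[:, 1] - B[:, 1] * A[:, 0]).sum()
    den = (B[:, 0] * A[:, 0] + B[:, 1] * A[:, 1]).sum()
    theta = np.arctan2(num, den)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    # Stationarity in (p, q) gives the translation between centroids.
    p, q = ca - R @ cb
    return p, q, theta
```

Given perfectly corresponding points, this recovers exactly the displacement between the two partial-shape coordinate systems.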

[0018] In step 16, the partial shapes C"1 and C"2 are first generated from the coordinate values (t, z) at the extraction points of C'1 and C'2. Then, using the p, q, and θ obtained in step 15, the coordinate values (t, z) of C"2 are rotated and translated into the C"1 coordinate system, and C"1 and C"2 are combined (see FIG. 6). For the portion where C"1 and C"2 overlap, the average of the coordinate values of the extraction points of both is used.

[0019] The above describes the method of synthesizing two partial shapes. To synthesize three or more partial shapes, two partial shapes are first combined into one, and the synthesis is then repeated cumulatively.

[0020] FIG. 2 shows the operation flow of the shape synthesis method of the first, third, and fourth inventions.

[0021] Steps 11 to 12 and steps 14 to 16 are the same as described above. In step 23, the partial shape is unrolled into a straight line; in addition, the absolute distance between the object and the camera is obtained using the focal length of the camera lens, the apparent texture is scaled up to full size, and the extraction and interpolation processing is then performed.
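Step 23's rescaling to full size can be sketched under a pinhole-camera assumption; the patent does not give the formula, so the f/Z magnification model and the function names below are assumptions:

```python
def apparent_to_full_size(image_length, z, f):
    """Under a pinhole-camera model, an object at distance z projects with
    magnification f/z, so a length measured on the image plane is scaled
    back to its real size by z/f (far-field thin-lens approximation)."""
    return image_length * z / f

def image_sampling_interval(d_real, z, f):
    """Image-plane sampling step whose footprint in real space is d_real,
    so textures captured at different distances are compared at full size,
    with a constant real-space extraction interval d as in the third
    invention."""
    return d_real * f / z
```

Because both partial shapes are resampled at the same real-space interval d, textures from parallax images taken at quite different object distances remain directly comparable in step 14.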

[Effects of the Invention] In the shape synthesis method of the present invention, the overlap between partial shapes is detected using the texture on the partial shapes, identifying the portion common to the two partial shapes. The two partial shapes are then synthesized by aligning the shapes of the common portion.

[0022] Therefore, the present invention requires no line-segment extraction, and shape synthesis based on stereo matching can be performed effectively even for objects composed of curved surfaces, objects from which line segments cannot be clearly extracted, and objects with complicated textures.

[0023] Furthermore, the only equipment required for shape synthesis according to the present invention is a single pair of stereo cameras, and since the in-plane positional relationship between the camera positions and the yaw angle between the cameras are determined automatically, they need not be measured at shooting time, greatly reducing the effort required when shooting.

[0024] In particular, in the third invention, the apparent texture is converted to full size before the overlap between textures is detected, so shape synthesis is possible even between partial shapes obtained from parallax images in which the distances between the object and the stereo camera differ considerably.

Brief Description of the Drawings

FIG. 1 is an operation flow showing an embodiment of the first, second, and fourth inventions.

FIG. 2 is an operation flow showing an embodiment of the first, third, and fourth inventions.

FIG. 3 is a horizontal sectional view showing an example arrangement of the stereo cameras.

FIG. 4 is a diagram showing an example of partial shapes obtained by stereo matching between parallax images.

FIG. 5 is an explanatory diagram of texture unrolling and division.

FIG. 6 is a diagram showing the relationship between the two coordinate systems and the synthesized shape.

Explanation of Symbols

11 Parallax image input processing
12 Stereo matching processing
13 Texture unrolling, extraction, and interpolation processing
14 Texture overlap-width calculation processing
15 Coordinate-system movement parameter calculation processing
16 Shape synthesis processing
23 Texture enlargement, unrolling, extraction, and interpolation processing

Claims (4)

[Claims]

[Claim 1] A shape synthesis method in which parallax images of an object are captured and input from a plurality of viewpoints on the same plane using a stereo camera, and a plurality of partial shape data obtained by stereo matching between the parallax images are synthesized to restore the three-dimensional shape of the object, the method comprising: a first step of detecting the overlap between the partial shapes using the texture on each partial shape; and a second step of synthesizing the partial shapes by joining the overlapping portions, thereby restoring the three-dimensional shape of the object.
[Claim 2] The shape synthesis method according to claim 1, wherein the first step comprises: a step of extracting texture brightness values and coordinate values expressing the shape at regular intervals d along the partial shape; and a step of computing, while varying the overlap width of the textures on the two partial shapes in increments of d, the cross-correlation of the brightness values over the overlapping portion, judging the overlap width that maximizes the cross-correlation to be the actual overlap width, and obtaining the correspondence between the extraction points of the two partial shapes.
[Claim 3] The shape synthesis method according to claim 1, wherein the first step comprises: a step of extracting texture brightness values and coordinate values expressing the shape along the partial shape such that, based on the distance between the object and the camera and on the focal length of the camera lens, the extraction interval in real space becomes a constant value d; and a step of computing, while varying the overlap width of the textures on the two partial shapes in increments of d, the cross-correlation of the brightness values over the overlapping portion, judging the overlap width that maximizes the cross-correlation to be the actual overlap width, and obtaining the correspondence between the extraction points of the two partial shapes.
[Claim 4] The shape synthesis method according to claim 1, wherein the second step obtains the coordinate-system movement parameters so as to minimize, over all corresponding extraction points of the two partial shapes, the sum of the squared distances between the coordinate values at the corresponding extraction points.
JP3315874A 1991-11-29 1991-11-29 Shape synthesis method Expired - Fee Related JP2953154B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP3315874A JP2953154B2 (en) 1991-11-29 1991-11-29 Shape synthesis method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP3315874A JP2953154B2 (en) 1991-11-29 1991-11-29 Shape synthesis method

Publications (2)

Publication Number Publication Date
JPH05303629A true JPH05303629A (en) 1993-11-16
JP2953154B2 JP2953154B2 (en) 1999-09-27

Family

ID=18070638

Family Applications (1)

Application Number Title Priority Date Filing Date
JP3315874A Expired - Fee Related JP2953154B2 (en) 1991-11-29 1991-11-29 Shape synthesis method

Country Status (1)

Country Link
JP (1) JP2953154B2 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5035372B2 (en) * 2010-03-17 2012-09-26 カシオ計算機株式会社 3D modeling apparatus, 3D modeling method, and program
JP5158223B2 (en) 2011-04-06 2013-03-06 カシオ計算機株式会社 3D modeling apparatus, 3D modeling method, and program

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6621921B1 (en) 1995-12-19 2003-09-16 Canon Kabushiki Kaisha Image processing apparatus
US6438507B1 (en) 1999-05-31 2002-08-20 Minolta Co., Ltd. Data processing method and processing device
JP2003042730A (en) * 2001-07-30 2003-02-13 Topcon Corp Surface shape measuring device, method therefor, and surface state mapping device
JP2003042732A (en) * 2001-08-02 2003-02-13 Topcon Corp Surface shape measuring device and method, surface shape measuring program, and surface state mapping device
JP2003065737A (en) * 2001-08-28 2003-03-05 Topcon Corp Surface shape measuring device and method, and surface state mapping device
US7315643B2 (en) 2002-03-12 2008-01-01 Nec Corporation Three-dimensional shape measurement technique
JP2007285704A (en) * 2006-04-12 2007-11-01 Penta Ocean Constr Co Ltd Loaded soil volume measurement method for earth and sand carrier
JP2008096119A (en) * 2006-10-05 2008-04-24 Keyence Corp Optical displacement meter, optical displacement measuring method, optical displacement measuring program, computer-readable recording medium, and recorded device
WO2009025323A1 (en) * 2007-08-22 2009-02-26 Katsunori Shimomura Three-dimensional image data creating system and creating method
JP2009047642A (en) * 2007-08-22 2009-03-05 Katsunori Shimomura Three-dimensional image data generation system and generation method
JP2011007794A (en) * 2009-06-25 2011-01-13 Siliconfile Technologies Inc Distance measuring device equipped with dual stereo camera
US20110234759A1 (en) * 2010-03-29 2011-09-29 Casio Computer Co., Ltd. 3d modeling apparatus, 3d modeling method, and computer readable medium
US8482599B2 (en) * 2010-03-29 2013-07-09 Casio Computer Co., Ltd. 3D modeling apparatus, 3D modeling method, and computer readable medium
JP2012142791A (en) * 2010-12-28 2012-07-26 Casio Comput Co Ltd Three-dimensional model generation system, server, and program
US8531505B2 (en) 2010-12-28 2013-09-10 Casio Computer Co., Ltd. Imaging parameter acquisition apparatus, imaging parameter acquisition method and storage medium
JP2012257282A (en) * 2012-07-26 2012-12-27 Casio Comput Co Ltd Three-dimensional image generation method
JP2015045654A (en) * 2014-09-30 2015-03-12 洋彰 宮崎 Shape recognition machine
JP2018534514A (en) * 2015-09-10 2018-11-22 ブラバ・ホーム・インコーポレイテッド Camera in oven
US11523707B2 (en) 2015-09-10 2022-12-13 Brava Home, Inc. Sequential broiling
US12035428B2 (en) 2015-09-10 2024-07-09 Brava Home, Inc. Variable peak wavelength cooking instrument with support tray
US12320531B2 (en) 2015-09-10 2025-06-03 Brava Home, Inc. Dynamic heat adjustment of a spectral power distribution configurable cooking instrument
WO2020183711A1 (en) 2019-03-14 2020-09-17 オムロン株式会社 Image processing device and three-dimensional measuring system
US11803982B2 (en) 2019-03-14 2023-10-31 Omron Corporation Image processing device and three-dimensional measuring system

Also Published As

Publication number Publication date
JP2953154B2 (en) 1999-09-27

Similar Documents

Publication Publication Date Title
JP2953154B2 (en) Shape synthesis method
US11010925B2 (en) Methods and computer program products for calibrating stereo imaging systems by using a planar mirror
JP2874710B2 (en) 3D position measuring device
JP3054681B2 (en) Image processing method
US20090167843A1 (en) Two pass approach to three dimensional Reconstruction
JPH07294215A (en) Image processing method and apparatus
US6809771B1 (en) Data input apparatus having multiple lens unit
JPWO2021200432A5 (en) Shooting method, shooting instruction method, shooting device and shooting instruction device
JP4193342B2 (en) 3D data generator
KR100574227B1 (en) Apparatus and method for extracting object motion that compensates for camera movement
CN111489384A (en) Occlusion assessment method, device, equipment, system and medium based on mutual view
JP2019032660A (en) Imaging system and imaging method
KR101673144B1 (en) Stereoscopic image registration method based on a partial linear method
CN116866522B (en) Remote monitoring method
JP2007025863A (en) Imaging system, imaging method, and image processing program
JPH0875454A (en) Ranging device
JP3504128B2 (en) Three-dimensional information restoration apparatus and method
JPH09231369A (en) Image information input device
JPH09229648A (en) Image information input / output device and image information input / output method
JPH09231371A (en) Image information input device and image information input method
KR102107465B1 (en) System and method for generating epipolar images by using direction cosine
CN111080689B (en) Method and apparatus for determining facial depth map
JPH11150741A (en) Method and apparatus for displaying three-dimensional image by stereo photography
Brunken et al. Incorporating Plane-Sweep in Convolutional Neural Network Stereo Imaging for Road Surface Reconstruction.
JP6292785B2 (en) Image processing apparatus, image processing method, and program

Legal Events

Date Code Title Description
LAPS Cancellation because of no payment of annual fees