JPH0481124B2 - - Google Patents
Info
- Publication number
- JPH0481124B2 · JP60004408A · JP440885A
- Authority
- JP
- Japan
- Prior art keywords
- point
- imaging
- pair
- dimensional
- axis direction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/245—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2518—Projection by scanning of the object
- G01B11/2522—Projection by scanning of the object the position of the object changing and being recorded
Description
DETAILED DESCRIPTION OF THE INVENTION

[Field of Industrial Application]

The present invention relates to a three-dimensional measurement method for measuring, without contact, the surface positions of solid objects such as the human body.
Conventionally, methods for measuring the shape of a solid object such as a human body fall into two classes: contact methods, in which a sensing probe is brought into contact with the object to be measured, and non-contact methods such as stereo photography, moiré topography, and light sectioning. These methods are widely applied as object-recognition technology in industrial robots and in various inspection devices.
The contact method, however, can measure only objects that may be touched, which restricts the objects that can be measured; moreover, because the position of every point on the object's surface must be probed by contact, measurement takes an extremely long time.

It is therefore desirable to measure the shape of the object without touching it, as in the non-contact methods mentioned above.
[Problems to be Solved by the Invention]

The conventional non-contact methods calculate the position of each point on the measured surface from a recognition of the object's shape. To measure the positions of the surface points, the measurement points of interest must first be extracted from the acquired shape information, and the two- or three-dimensional positions of those points must then be calculated. This requires complex processing such as functional computation, and the calculation is slow.
In addition, measuring devices implementing the above methods have very low resolution; when the object itself is small, or when its surface is uneven, measurement errors grow or measurement becomes impossible altogether, so reliability is poor.
In the present invention, the solid object to be measured is placed on a horizontal support stand on a base, and a plurality of pillars stand around the base. On each pillar, a non-contact measuring instrument is mounted so as to be vertically movable; each instrument has a light-projecting means that irradiates the solid with a horizontal linear slit light, and a pair of imaging means that image the slit-light irradiated portion of the solid from two directions, above and below the light-projecting means. While each instrument is moved vertically, the position of a point G(x, y, z) of the irradiated portion — in a three-dimensional coordinate system whose Z-axis and X-axis directions are the horizontal and vertical directions — is obtained, from the slit-light position information in each instrument's pair of imaging outputs, by the four arithmetic operations

x=(d−Kx)・{L/(L+(d−e))−1}+d
y=Ky・L/(L+(d−e))
z=Kz・r

where
d, e are the X-axis distance data of the point G in each of the pair of imaging outputs,
L is the X-axis spacing of the pair of imaging means,
Kx, Ky are X-axis and Y-axis coefficients set according to the lens magnification, position and so on of the pair of imaging means,
r is the scanning-line number of the point G in the pair of imaging outputs, and
Kz is a coefficient determined by the number and width of the scanning lines and the lens magnification.

Each position on the surface of the solid is thus calculated and measured; this is the three-dimensional measurement method of the invention.
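The three formulas above use only the four arithmetic operations, so each surface point is cheap to compute. A minimal Python sketch of the per-point calculation (the function and parameter names are illustrative, not from the patent):

```python
def point_from_slit(d, e, r, L, Kx, Ky, Kz):
    """Compute a surface point G(x, y, z) from slit-light position data.

    d, e   : X-axis distance data of point G in the upper and lower
             imaging outputs (counted from each scan line's reference point)
    r      : scanning-line number of point G in the imaging outputs
    L      : X-axis spacing of the pair of imaging means
    Kx, Ky : coefficients set from lens magnification and mounting position
    Kz     : coefficient set from the number and width of the scanning
             lines and the lens magnification
    """
    ratio = L / (L + (d - e))          # common factor of the x and y formulas
    x = (d - Kx) * (ratio - 1.0) + d   # X component
    y = Ky * ratio                     # Y component
    z = Kz * r                         # Z component
    return x, y, z
```

A quick sanity check on the formulas: when d = e (the point images at the same X position in both outputs) the ratio is 1, so x reduces to d and y to Ky.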
[Operation]

The light-projecting means of the measuring instruments irradiate the circumferential surface of the solid with horizontal slit light so as to surround it, and the vertical motion of each instrument moves each slit-light irradiation position up and down.

Further, the pair of imaging means of each instrument images the irradiated portion of each slit light from the two directions above and below the light-projecting means. By simple four-rule arithmetic based on the slit-light position information in each resulting pair of imaging outputs, the position of every point of each irradiated portion is calculated, and each position on the surface of the solid is thereby measured.
Next, the invention is described in detail with reference to the drawings, which show one embodiment.
First, in FIG. 1, which shows the measuring apparatus, 1 is a base, 2 is a horizontal support stand placed on the base 1, 3 is the solid object to be measured placed on the support stand 2, and 4a, 4b, 4c and 4d are four pillars erected at the four corners of the base 1; the inner faces of pillars 4a and 4c face each other across the solid 3, as do the inner faces of pillars 4b and 4d.
5a, 5b, 5c and 5d are four connecting rods provided between the adjacent pillars 4a and 4b, 4b and 4c, 4c and 4d, and 4d and 4a, fixing the pillars 4a to 4d. 6a, 6b, 6c and 6d are racks provided vertically on the inner faces of the pillars 4a to 4d.
7a, 7b, 7c and 7d are four non-contact measuring instruments, each provided with a pinion meshing with one of the racks 6a to 6d. When a built-in motor rotates the pinion, each instrument 7a to 7d moves up or down along its pillar 4a to 4d; the instruments are thus mounted on the pillars so as to be vertically movable.
8 is a light-projecting means provided on each of the instruments 7a to 7d; it irradiates the solid 3 with linear slit light parallel to the plane of the support stand 2, that is, horizontal. When the instruments 7a to 7d are all at the same height, the slit light falls uniformly on the portion of the entire circumferential surface of the solid 3 at that height, as shown by the broken line in FIG. 1.
9α and 9β are a pair of imaging means provided on each of the instruments 7a to 7d, each consisting of a two-dimensional sensor device such as a CCD area image sensor; one imaging means 9α is located above the light-projecting means 8, and the other imaging means 9β below it. 10 is a leg placed on the support stand 2 near the solid 3, and 11 is a cylindrical measurement reference gauge erected on the leg 10.
The light-projecting means 8 and the two imaging means 9α and 9β of each instrument 7a to 7d are constructed as shown in FIG. 2. In the figure, 12a is a light source, such as a xenon lamp with a linear slit, which outputs linear slit light; 12b is an expansion lens, such as a convex cylindrical lens, which lengthens the slit light from the source 12a; and 12c is a reflecting mirror, which directs the horizontal linear slit light from the source 12a, passed through the lens 12b, onto the surface of the solid 3.
13αa and 13βa are imaging sensors, each formed of CCD light-receiving elements arranged in a two-dimensional matrix of M rows and N columns; 13αb and 13βb are condensing lenses that focus the light reflected from the surface of the solid 3 onto the sensors 13αa and 13βa respectively. One imaging means 9α is formed by the imaging sensor 13αa and the lens 13αb, and the other imaging means 9β by the imaging sensor 13βa and the lens 13βb.
To describe the measured positions in an XYZ three-dimensional coordinate system, as shown in FIG. 2 the vertical direction of FIG. 1 is taken as the X-axis direction, the irradiation direction of the slit light as the Y-axis direction, and the direction parallel to the irradiated slit light as the Z-axis direction.
The two imaging means 9α and 9β are fixed above and below the light-projecting means 8 so that the slit-light irradiated portion of the solid 3 lies within their imaging fields of view. S in FIG. 2 denotes the slit light irradiated onto the solid 3.
The imaging sensors 13αa and 13βa of the imaging means 9α and 9β image the slit-light irradiated portion of the solid 3. Vertical slit-light images Sα and Sβ are then formed on the imaging surfaces Fα and Fβ of the two sensors, as shown for example in FIGS. 3a and 3b, and only the portions of the surfaces Fα and Fβ carrying the images Sα and Sβ become bright.
Furthermore, the light-receiving outputs of each single column of light-receiving elements of the sensors 13αa and 13βa form the imaging output of one scanning line of each sensor, and the imaging outputs of the scanning lines are output in sequence from the imaging means 9α and 9β.
The horizontal direction of FIGS. 3a and 3b corresponds to the X-axis direction and the vertical direction to the Z-axis direction. The horizontal lines A1, …, Am, Am+1, Am+2, Am+3, …, An of FIG. 3a denote the first through N-th scanning lines of the sensor 13αa, and the horizontal lines B1, …, Bm, Bm+1, Bm+2, Bm+3, …, Bn of FIG. 3b denote the first through N-th scanning lines of the sensor 13βa; the imaging outputs of the scanning lines of the two sensors 13αa and 13βa are read out sequentially at the same timing.
The pair of analog imaging outputs read from the sensors 13αa and 13βa of each instrument 7a to 7d is input to the image-processing means provided for that instrument in the electronic computer 14 shown in FIG. 4.
Each image-processing means is constructed as shown in FIG. 5. In the figure, 15 is a clock circuit generating a clock signal; 16α and 16β are a pair of signal-processing circuits that capture, at the timing of the clock signal, the analog imaging output of each scanning line output in sequence from the sensors 13αa and 13βa of the imaging means 9α and 9β, slice it at a predetermined slice level, and form digital image signals that are at high level only in the slit-light image portions Sα and Sβ.
17α and 17β are a pair of address counters. At the timing of the clock signal, each counts the distance, within the pair of imaging outputs of the sensors 13αa and 13βa, from the reference point at the left end of each scanning line to the point at which the digital image signal of the processing circuit 16α or 16β goes to a high-level pulse because of the slit-light image Sα or Sβ, and outputs the X-axis distance data of each point of the irradiated portion in each of the pair of imaging outputs.
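In software terms, the slicing done by the circuits 16α and 16β and the counting done by 17α and 17β amount to thresholding one scan line and locating the first above-threshold sample. A hedged sketch, assuming a scan line is represented as a list of sampled intensities (the names and data representation are illustrative, not the patent's hardware):

```python
def slit_distance(scan_line, slice_level):
    """Mimic one signal-processing circuit plus address counter:
    binarize the sampled intensities at `slice_level`, then count the
    distance (in samples) from the line's left reference point to the
    first sample where the slit-light pulse goes high.
    Returns None when no slit light falls on this scan line."""
    digital = [1 if v >= slice_level else 0 for v in scan_line]  # slicing
    for distance, bit in enumerate(digital):                     # counting
        if bit:
            return distance
    return None
```

For example, `slit_distance([2, 3, 40, 41, 3], 20)` returns 2: the slit pulse begins two samples to the right of the reference point.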
18 is an arithmetic circuit. From the slit-light position information — the pair of X-axis distance data input simultaneously from the counters 17α and 17β, and the Z-axis data of each point of the irradiated portion given by the scanning-line number obtained by counting the clock signal together with the preset scanning-line width — it calculates, by the four-rule arithmetic described later, the position of each point of the slit-light irradiated portion in the three-dimensional coordinate system.
19 is a storage unit that stores the coordinate position of each point of the irradiated portion calculated by the arithmetic circuit 18; 20 is the image-processing means consisting of the processing circuits 16α and 16β, the counters 17α and 17β, the arithmetic circuit 18 and the storage unit 19; 21 is a display-condition setting unit; and 22 is a recognition circuit which, on the basis of the conditions set in the setting unit 21, identifies the dimensions, surface condition, shape and the like of the solid 3 from the coordinate positions of the points stored in the storage unit 19, and outputs to the display means 23 of FIG. 4 display signals of the stored coordinate positions and of the identified dimensions, surface condition, shape and so on.
As shown in FIG. 2, the light-projecting means 8 emits the linear slit light, and the irradiated portion of the slit light S is imaged from two directions by the imaging means 9α and 9β above and below the light-projecting means 8; the slit-light images Sα and Sβ of FIGS. 3a and 3b, for example, are formed on the imaging means 9α and 9β respectively.
Further, the imaging outputs of the first scanning line A1 through the N-th scanning line An are output in sequence from the sensor 13αa of the imaging means 9α to the processing circuit 16α. When, for example, the imaging outputs of the M-th through (M+3)-th scanning lines Am, Am+1, Am+2 and Am+3 are output in sequence from the sensor 13αa to the circuit 16α as shown in FIG. 6a, the imaging outputs of the M-th through (M+3)-th scanning lines Bm, Bm+1, Bm+2 and Bm+3 are output in sequence, at the same timing, from the sensor 13βa of the imaging means 9β to the processing circuit 16β, as shown in FIG. 7a.
The processing circuits 16α and 16β then slice the analog imaging output of each scanning line from the sensors 13αa and 13βa at a slice level l. Since the level l is set so as to extract only the slit-light image portions Sα and Sβ, only those portions of each scanning-line output are extracted, as shown in FIGS. 6b and 7b, and the imaging outputs of the two imaging means 9α and 9β are thereby converted to digital form.
The digital signals of the circuits 16α and 16β are input to the counters 17α and 17β respectively. As shown in FIGS. 6c and 7c, the counters form a reference-point pulse at the timing of the reference point d0 at the left end of each scanning line and, starting from each reference-point pulse, count the distances Dam, Dam+1, Dam+2, Dam+3 and Dbm, Dbm+1, Dbm+2, Dbm+3 from the reference point d0 to the X-axis positions am, am+1, am+2, am+3 and bm, bm+1, bm+2, bm+3 of the slit light in each scanning-line output; they output the resulting X-axis distance data of the slit-light images Sα and Sβ to the arithmetic circuit 18.
Next, the calculation performed by the arithmetic circuit 18 is explained.
For simplicity of explanation, assume that the imaging fields of view of the two imaging means 9α and 9β are exactly equal and are set so that the left edge of each field coincides with the Z-axis. As shown in FIG. 8, suppose the light from a point G(x, y, z) of the slit-light irradiated portion S forms images through the centre points P(a, 0, z) and Q(b, 0, z) of the lenses 13αb and 13βb of the imaging means 9α and 9β. Take the points U(d, c, z) and V(e, c, z) at which the line segments joining the image points to the centre points P(a, 0, z) and Q(b, 0, z) cross an XZ plane through an arbitrary point c on the object side of the Y-axis. The point G(x, y, z) is then found as the intersection of the line through P(a, 0, z) and U(d, c, z) with the line through Q(b, 0, z) and V(e, c, z).
From the values of P(a, 0, z), Q(b, 0, z), U(d, c, z) and V(e, c, z), the X- and Y-axis components x and y of G(x, y, z) are obtained from the following equations (1) and (2):

x=(a・e−b・d)/(a−b−d+e)=(b・d−a・e)/(b−a+d−e)
 =(d−a)・{(b−a)/(b−a+d−e)−1}+d ……(1)

y=c・(a−b)/(a−b−d+e)=c・(b−a)/(b−a+d−e) ……(2)

In equations (1) and (2), b−a is the spacing L of the two imaging sensors 13αa and 13βa, and d and e are the X-axis positions of the point G(x, y, z) on the imaging surfaces Fα and Fβ, determined by the magnification of the lenses 13αb and 13βb and the mounting positions of the imaging means 9α and 9β.
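As a numerical cross-check, the closed forms (1) and (2) agree with a direct intersection of the lines P–U and Q–V. A small sketch under the figure's assumptions — both lines run from the plane y = 0 to the plane y = c, so they share one parameter value at the crossing; the function names are illustrative:

```python
def intersect_pu_qv(a, b, c, d, e):
    """Intersect line P(a, 0) -> U(d, c) with line Q(b, 0) -> V(e, c).
    Both lines go from y = 0 to y = c, so the same parameter t applies
    to both at the crossing: a + t*(d - a) = b + t*(e - b)."""
    t = (b - a) / ((b - a) + (d - e))
    return a + t * (d - a), t * c

def closed_form(a, b, c, d, e):
    """Equations (1) and (2): x and y components of the point G."""
    x = (a * e - b * d) / (a - b - d + e)
    y = c * (a - b) / (a - b - d + e)
    return x, y
```

For any choice of a, b, c, d, e the two routines return the same point, which is how equations (1) and (2) were derived.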
d and e are obtained as distance data from the reference point d0 at the left end of the imaging surfaces Fα and Fβ respectively, while a, b and c are constants set by the mounting positions of the imaging means 9α and 9β, the magnification of the lenses 13αb and 13βb, and so on.
Writing Kx and Ky for the X- and Y-direction constants a and c, which are set according to the magnification of the lenses 13αb and 13βb, the positions of the imaging means 9α and 9β and so on, the X- and Y-axis components x and y of the point G(x, y, z) are found by computing the following equations (3) and (4):

x=(d−Kx)・{L/(L+(d−e))−1}+d ……(3)

y=Ky・L/(L+(d−e)) ……(4)

The Z-axis component z of the point G(x, y, z), on the other hand, is found from the following equation (5), using the scanning-line number r of the point G(x, y, z) and a coefficient Kz determined by the number and width of the scanning lines and the magnification of the lenses 13αb and 13βb:
z=Kz・r ……(5)

Kx, Ky and L in equations (3) and (4) and Kz in equation (5) are constants, d and e are obtained as the pair of X-axis distance data input from the counters 17α and 17β, and r is obtained by counting the clock signal. The arithmetic circuit 18 therefore performs the four-rule calculations of equations (3) to (5) on the slit-light position information — the preset position information consisting of the data Kx, Ky, Kz and L, and the detected position information consisting of the pair of distance data input from the counters 17α and 17β and the count data of the clock signal — to calculate the position of the point G(x, y, z). By applying this calculation to every point of the slit-light irradiated portion, the three-dimensional position of each point of the irradiated portion in the XYZ coordinate system of FIG. 2 is calculated.
When the fields of view of the imaging means 9α and 9β do not overlap completely, or when the vertical and horizontal directions of the imaging surfaces Fα and Fβ are offset from the Z- and X-axes, the value of each equation is multiplied by a correction coefficient corresponding to the offset to calculate the three-dimensional position of each point of the irradiated portion.
As each instrument 7a to 7d is moved up and down along its pillar 4a to 4d by its built-in motor, the horizontal slit light cast on the surface of the solid 3 from each instrument sweeps in sequence from bottom to top or from top to bottom, and the arithmetic circuit 18 calculates the three-dimensional positions of the points of the irradiated portion at each irradiation position; the position of every point on the surface of the solid 3 is thereby calculated and measured.
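One vertical sweep can be sketched as a loop that converts every (d, e, r) triple extracted at each stop of the instrument into a point with equations (3) to (5). The dictionary input and the names are assumptions for illustration; how the X component is referred to the instrument height is addressed below:

```python
def scan_surface(measurements, L, Kx, Ky, Kz):
    """Sweep result -> list of (stop, (x, y, z)) surface points.

    `measurements` maps each vertical stop of a measuring instrument to
    the (d, e, r) triples read from that stop's pair of imaging outputs;
    each triple becomes a point via the four-rule formulas (3)-(5)."""
    points = []
    for stop, triples in sorted(measurements.items()):
        for d, e, r in triples:
            ratio = L / (L + (d - e))
            x = (d - Kx) * (ratio - 1.0) + d   # equation (3)
            y = Ky * ratio                     # equation (4)
            z = Kz * r                         # equation (5)
            points.append((stop, (x, y, z)))
    return points
```

Keeping the stop index with each point records which instrument height produced it, which the reference-point corrections described next rely on.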
However, the origin of the XYZ coordinate system of FIG. 2 differs from instrument to instrument 7a to 7d and, since each instrument moves up and down its pillar 4a to 4d, its origin moves and changes along the X-axis. The X-axis component x of the three-dimensional position obtained by the calculations of equations (3) to (5) therefore takes the same value for every irradiated portion, and only a two-dimensional position — the Y- and Z-axis components y and z — can be measured.
If, however, an X-axis reference point is established that does not change as the instruments 7a to 7d move vertically, the X-axis component x can be calculated from the vertical displacement of the instruments from that reference point, and the position in the XYZ three-dimensional coordinate system can be measured.

Accordingly, when a three-dimensional rather than a two-dimensional position is to be calculated and measured, one of the following two methods using the gauge 11 is employed.
In the first method, before measurement, the initial vertical (X-axis) position of each instrument 7a to 7d is measured from the positions of the graduations of the gauge 11 imaged by that instrument, and these initial positions are taken as the X-axis reference points of the instruments 7a to 7d.

After the X-axis reference-point positions have been set, the instruments 7a to 7d are moved up and down while the X-, Y- and Z-axis components x, y and z of each point are calculated by the four-rule arithmetic of equations (3) to (5); the displacement of each instrument from its reference point is added to or subtracted from the calculated X-axis component x, correcting x into a component referred to the reference point, so that measurement is performed in three-dimensional coordinates sharing the same reference point in the X-axis direction.
In the second method, before measurement, the vertical positions of all the measuring instruments 7a to 7d are corrected, on the basis of the scale positions imaged by each instrument, to a common position, for example the position of the lowest scale point of the gauge 11. The X-axis reference points of all the instruments 7a to 7d are thus aligned at the same position, and the initial positions of all the instruments are set within the same YZ plane.
After the X-axis reference points have been aligned, all the measuring instruments 7a to 7d are moved up and down sequentially by the same amount at the same timing, while the four arithmetic operations based on equations (3) to (5) are performed to calculate the X-, Y-, and Z-axis components x, y, and z of each point. The amount of movement from the reference point is added to or subtracted from the calculated X-axis component x to correct it, so that the positions are calculated and measured on a three-dimensional coordinate system sharing the same reference point in the X-axis direction.
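The step common to both methods, correcting the calculated X component by the instrument's vertical travel from its reference point, amounts to a single addition. A minimal sketch (the function and variable names are illustrative assumptions, not from the patent):

```python
def correct_x(x_calculated: float, travel_from_reference: float) -> float:
    """Re-express an X-axis component relative to the instrument's
    X-axis reference point.

    x_calculated          -- X component from equations (3) to (5), in the
                             instrument's current (moved) position
    travel_from_reference -- signed vertical travel of the instrument since
                             the reference point was set (upward positive)
    """
    # Adding the signed travel shifts the point into a coordinate system
    # whose X origin stays fixed at the reference point.
    return x_calculated + travel_from_reference

# A point measured after the instrument has moved up by 25.0 units:
print(correct_x(3.5, 25.0))  # prints 28.5
```

Subtraction is simply the same operation with the travel taken as negative, which is why the patent describes the correction as "adding and subtracting" the movement amount.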
Furthermore, a reference three-dimensional coordinate system having the true reference points of the X, Y, and Z axes is set, for example inside the solid 3, and the origin of the three-dimensional coordinate system of each measuring instrument 7a to 7d is converted to the origin of this reference coordinate system by a coordinate-transformation technique. The position calculated by each instrument 7a to 7d is thereby converted to a position on the reference coordinate system, so that the three-dimensional position of each point on the surface of the solid 3 is measured in an absolute three-dimensional coordinate system.
Note that this coordinate transformation can be performed using only the four arithmetic operations by calculating in advance, for each measuring instrument 7a to 7d, a correction coefficient for converting the origin of that instrument's three-dimensional coordinate system to the origin of the reference coordinate system.
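Under the reading that each per-instrument frame differs from the reference frame only by a fixed, precomputed origin offset (the "correction coefficient"), the conversion needs nothing beyond addition. A sketch under that assumption, with all offsets and names purely illustrative:

```python
# Hypothetical per-instrument origin offsets, expressed in the reference
# frame: reference_point = instrument_point + offset.
OFFSETS = {
    "7a": (100.0, -50.0, 0.0),
    "7b": (100.0, 50.0, 0.0),
}

def to_reference_frame(instrument: str, point: tuple) -> tuple:
    """Convert a point from an instrument's coordinate system to the
    common reference coordinate system using only addition."""
    ox, oy, oz = OFFSETS[instrument]
    x, y, z = point
    return (x + ox, y + oy, z + oz)

print(to_reference_frame("7a", (1.0, 2.0, 3.0)))  # prints (101.0, -48.0, 3.0)
```

If the instrument frames were also rotated relative to the reference frame, the correction coefficients would additionally encode a rotation, but the patent's claim that the conversion uses "only the four arithmetic operations" still holds, since a fixed rotation is a precomputed multiply-and-add per coordinate.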
The computer 14 therefore takes the position in the per-instrument three-dimensional coordinate system, calculated by each image processing means 20 using the first or second method, converts it to a position on the reference three-dimensional coordinate system by the four arithmetic operations of the coordinate transformation, and thus calculates and measures the position of each point on the surface of the solid 3 in the reference coordinate system.
Furthermore, based on the calculated coordinate positions of the points on the surface of the solid 3, the recognition circuit 22 identifies the dimensions, surface condition, shape, and so on of the solid 3, and, in accordance with the conditions set by the setting unit 21, the measured coordinate position of each point and the identified dimensions, surface condition, shape, and so on of the solid 3 are shown on the display means 23.
According to the embodiment described above, as the slit-light irradiation areas on the solid 3 move up and down in sequence, each irradiation area is imaged from two directions by the pair of imaging means 9α and 9β of each measuring instrument 7a to 7d, so that a pair of imaging outputs is obtained for each slit-light irradiation area.
The computer 14, to which each pair of imaging outputs is supplied, then calculates and measures the position of each point in each slit-light irradiation area, that is, each point on the solid 3, by simple four arithmetic operations based on the position information of the slit light within the pair of imaging outputs.
In addition, by measuring or correcting the X-axis reference point of each measuring instrument 7a to 7d with the gauge 11 before measurement begins, and by performing the coordinate transformation on each calculated point, the position of each point on the surface of the solid 3 is calculated and measured either in a three-dimensional coordinate system that differs for each instrument 7a to 7d or in a reference three-dimensional coordinate system common to all the instruments.
Since each point on the surface of the solid 3 is calculated and measured by simple four arithmetic operations, the calculation and measurement are completed in a shorter time than with conventional non-contact methods.
Moreover, since each slit-light irradiation area is imaged from two directions, above and below, by its pair of imaging means 9α and 9β to obtain a pair of imaging outputs, the angle between the irradiation optical axis of the light projecting means 8 and each of the imaging means 9α and 9β can be made small, compared with a method in which each station is formed from, for example, one light projecting means and one television camera and a single imaging output is obtained for each irradiation area. Accurate measurement is therefore possible even when the solid 3 is small or its surface is uneven.
Furthermore, since the measurement is performed without contact, the solid 3 can easily be measured even if it is made of a flexible, easily deformed material such as rubber.
Since linear slit light is used, the energy density is low and weak light suffices, so the solid 3 is not distorted by illumination heat even when illumination is used.
The pair of imaging means 9α and 9β of each measuring instrument 7a to 7d may also be constituted by MOS image sensors, image pickup tubes, or the like.
The number of columns and measuring instruments provided around the base 1 need only be at least two; for example, measurement is possible even if only the diagonally opposed columns 4a and 4b and the measuring instruments 7a and 7c of Fig. 1 are provided.
As described above, according to the three-dimensional measuring method of this invention, a plurality of non-contact measuring instruments 7a to 7d are provided, movable up and down along the columns 6a to 6d, so as to surround the solid 3 to be measured, which is placed on the support stand 2 on the base 1. Each measuring instrument 7a to 7d is moved in the vertical direction while its light projecting means 8 irradiates the solid 3 with horizontal linear slit light, and each slit-light irradiation area is imaged by the pair of imaging means 9α and 9β above and below the light projecting means 8 of the respective instrument 7a to 7d. Based on the position information of the slit light in each pair of imaging outputs obtained by this imaging, the position of a point G(x, y, z) of the irradiation area, in three-dimensional coordinates whose Z-axis direction is the horizontal direction and whose X-axis direction is the vertical direction, is obtained from the four arithmetic operations

x = (d − Kx)·{L/(L + (d − e)) − 1} + d
y = Ky·L/(L + (d − e))
z = Kz·r

where
d, e are the X-axis direction distance data of the point G in each of the pair of imaging outputs;
L is the X-axis direction interval between the pair of imaging means;
Kx, Ky are coefficients in the X-axis and Y-axis directions set on the basis of the lens magnification, position, and so on of the pair of imaging means;
r is the scanning-line number of the point G in the pair of imaging outputs; and
Kz is a coefficient determined by the number and width of the scanning lines and the lens magnification.

Since each position on the surface of the solid 3 is calculated and measured in this way, each position on the surface of the solid 3 can be measured quickly and accurately, without contact and in a short time, without moving the solid 3 during measurement and regardless of its size.
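Reading the flattened fractions L/L+(d−e) as L/(L+(d−e)), an assumption about the original typesetting, the three expressions above can be evaluated directly. The sketch below simply transcribes them; all input values in the example are illustrative, not from the patent:

```python
def point_g(d, e, L, Kx, Ky, Kz, r):
    """Evaluate the patent's expressions for a point G(x, y, z).

    d, e   -- X-axis distance data of G in the pair of imaging outputs
    L      -- X-axis interval between the pair of imaging means
    Kx, Ky -- coefficients from lens magnification and position
    Kz     -- coefficient from scanning-line count, width and magnification
    r      -- scanning-line number of G in the imaging outputs
    """
    ratio = L / (L + (d - e))        # common factor in x and y
    x = (d - Kx) * (ratio - 1) + d
    y = Ky * ratio
    z = Kz * r
    return (x, y, z)

# With equal distance data d = e the ratio is exactly 1, so x reduces to d:
print(point_g(5.0, 5.0, 100.0, 2.0, 3.0, 0.5, 8))  # prints (5.0, 3.0, 4.0)
```

The disparity d − e plays the role of the stereo offset between the two imaging outputs, so the ratio L/(L + (d − e)) shrinks or grows as the point moves toward or away from the imaging pair, which is consistent with the triangulation reading of the method.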
The drawings show one embodiment of the three-dimensional measuring method of this invention. Fig. 1 is a perspective view of the measuring apparatus; Fig. 2 is an exploded perspective view of the non-contact measuring instrument of Fig. 1; Figs. 3a and 3b are front views of the imaging screens of the two imaging sensors of Fig. 2; Fig. 4 is a circuit block diagram; Fig. 5 is a block diagram of the image processing means in the computer of Fig. 4; Figs. 6a to 6c and Figs. 7a to 7c are timing charts for explaining the operation of Fig. 5; and Fig. 8 is a schematic diagram for explaining the operation of the arithmetic circuit of Fig. 5.

3 ... solid; 7a to 7d ... non-contact measuring instruments; 8 ... light projecting means; 9α, 9β ... imaging means.
Claims (1)

1. A three-dimensional measuring method comprising: placing a solid to be measured on a horizontal support stand on a base; erecting a plurality of columns around the base; mounting on each column, movably in the vertical direction, a non-contact measuring instrument having a light projecting means for irradiating the solid with horizontal linear slit light and a pair of imaging means for imaging the slit-light irradiation area of the solid from two directions, above and below the light projecting means; moving each measuring instrument in the vertical direction; and, based on the position information of the slit light in the pair of imaging outputs of each measuring instrument, obtaining the position of a point G(x, y, z) of the irradiation area, in three-dimensional coordinates whose Z-axis direction is the horizontal direction and whose X-axis direction is the vertical direction, from the four arithmetic operations

x = (d − Kx)·{L/(L + (d − e)) − 1} + d
y = Ky·L/(L + (d − e))
z = Kz·r

where d and e are the X-axis direction distance data of the point G in each of the pair of imaging outputs, L is the X-axis direction interval between the pair of imaging means, Kx and Ky are coefficients in the X-axis and Y-axis directions set on the basis of the lens magnification, position, and so on of the pair of imaging means, r is the scanning-line number of the point G in the pair of imaging outputs, and Kz is a coefficient determined by the number and width of the scanning lines and the lens magnification, thereby calculating and measuring each position on the surface of the solid.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP440885A JPS61162705A (en) | 1985-01-14 | 1985-01-14 | Method for measuring solid body |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP440885A JPS61162705A (en) | 1985-01-14 | 1985-01-14 | Method for measuring solid body |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| JPS61162705A JPS61162705A (en) | 1986-07-23 |
| JPH0481124B2 true JPH0481124B2 (en) | 1992-12-22 |
Family
ID=11583490
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| JP440885A Granted JPS61162705A (en) | 1985-01-14 | 1985-01-14 | Method for measuring solid body |
Country Status (1)
| Country | Link |
|---|---|
| JP (1) | JPS61162705A (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| FR2659439B1 (en) * | 1990-03-12 | 1996-03-08 | Centre Nat Rech Scient | METHOD AND SYSTEM FOR THREE-DIMENSIONAL CONTOUR MEASUREMENT AND MEASUREMENT. |
| FR3035207B1 (en) * | 2015-04-14 | 2021-01-29 | Mesure Systems3D | MODULAR CONTACTLESS MEASURING DEVICE AND CORRESPONDING MEASURING AND CONTROL SYSTEM |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPS5537982A (en) * | 1978-09-11 | 1980-03-17 | Ishikawajima Harima Heavy Ind Co Ltd | Solid-shape detector for characteristic test of deformation of curved-surface body |
| AT367552B (en) * | 1979-05-11 | 1982-07-12 | Chlestil Gustav Dkfm Ing | METHOD FOR THE PHOTOGRAPHIC PRODUCTION OF DATA CARRIERS FOR REPRODUCTION OF THREE-DIMENSIONAL OBJECTS, DEVICE FOR IMPLEMENTING THE METHOD AND REPRODUCTION DEVICE |
| JPS5733304A (en) * | 1980-08-06 | 1982-02-23 | Hitachi Ltd | Method and device for shape inspection |
| JPS58206909A (en) * | 1982-05-07 | 1983-12-02 | Yokogawa Hokushin Electric Corp | Arbitrary shape measuring device for objects |
1985-01-14: JP application JP440885A granted as JPS61162705A (active).
Also Published As
| Publication number | Publication date |
|---|---|
| JPS61162705A (en) | 1986-07-23 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP3511450B2 (en) | Position calibration method for optical measuring device | |
| US5612905A (en) | Three-dimensional measurement of large objects | |
| EP2132523B1 (en) | Method and device for exact measurement of objects | |
| CN108198224A (en) | A kind of line-scan digital camera caliberating device and scaling method for stereo-visiuon measurement | |
| JP4939304B2 (en) | Method and apparatus for measuring film thickness of transparent film | |
| CN115683059B (en) | A structured light three-dimensional vertical line measurement device and method | |
| US20030053045A1 (en) | System for inspecting a flat sheet workpiece | |
| JPH09113223A (en) | Non-contacting method and instrument for measuring distance and attitude | |
| JPH0481125B2 (en) | ||
| JPH04172213A (en) | Calibrating method for three-dimensional shape measuring apparatus | |
| GB2064102A (en) | Improvements in electro- optical dimension measurement | |
| JP2000081329A (en) | Shape measuring method and device | |
| JPH0481124B2 (en) | ||
| CN101113891B (en) | Optical Measuring Device | |
| JP2945448B2 (en) | Surface shape measuring instrument | |
| JPH0545135A (en) | Precision contour visual measurement method and apparatus | |
| JP4133657B2 (en) | X-ray fluoroscope for precision measurement | |
| JP2795790B2 (en) | Sensor coordinate correction method for three-dimensional measuring device | |
| JP2000249664A (en) | Method and apparatus for x-ray inspection | |
| JPS61159102A (en) | Two-dimensional measuring method | |
| JP2678127B2 (en) | Optical measuring device | |
| JP3095411B2 (en) | Calibration method of CCD camera | |
| JPH0942946A (en) | Measuring device and measuring method for electronic part and calibration mask | |
| JPS6131906A (en) | three-dimensional measuring device | |
| Kurada et al. | A trinocular vision system for close-range position sensing |