
JPH0481125B2 - Google Patents

Info

Publication number
JPH0481125B2
JPH0481125B2 JP60004409A JP440985A JPH0481125B2 JP H0481125 B2 JPH0481125 B2 JP H0481125B2 JP 60004409 A JP60004409 A JP 60004409A JP 440985 A JP440985 A JP 440985A JP H0481125 B2 JPH0481125 B2 JP H0481125B2
Authority
JP
Japan
Prior art keywords
imaging
point
pair
slit light
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
JP60004409A
Other languages
Japanese (ja)
Other versions
JPS61162706A (en)
Inventor
Mitsuo Iso
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kanadevia Corp
Original Assignee
Hitachi Shipbuilding and Engineering Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Shipbuilding and Engineering Co Ltd filed Critical Hitachi Shipbuilding and Engineering Co Ltd
Priority to JP440985A priority Critical patent/JPS61162706A/en
Publication of JPS61162706A publication Critical patent/JPS61162706A/en
Publication of JPH0481125B2 publication Critical patent/JPH0481125B2/ja
Granted legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518Projection by scanning of the object
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/245Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using a plurality of fixed, simultaneously operating transducers
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518Projection by scanning of the object
    • G01B11/2522Projection by scanning of the object the position of the object changing and being recorded

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Description

DETAILED DESCRIPTION OF THE INVENTION

[Industrial Field of Application]

This invention relates to a three-dimensional measurement method for the non-contact measurement of the position of each point on the surface of a solid object such as a human body or other article.

[Prior Art]

Conventionally, methods of measuring the shape of a solid object such as a human body or other article include contact methods, in which a sensing probe is brought into contact with the object to be measured, and non-contact methods such as stereo photography, moiré topography, and the light-section method. These techniques are widely applied as object-recognition technology in industrial robots, various inspection devices, and the like.

With the contact method, only objects that can be touched are measurable, which restricts the objects that can be measured; moreover, because the position of every point on the object's surface is measured by contact, measurement takes an extremely long time.

It is therefore desirable to measure the shape of the object without touching it, as in the non-contact methods described above.

[Problems to Be Solved by the Invention]

In the conventional non-contact methods, however, the position of each point on the solid surface is calculated on the basis of shape recognition of the object to be measured. To measure the position of each surface point, the measurement points must first be determined from the acquired shape information, and the two-dimensional or three-dimensional position of each determined point must then be calculated. This requires complex processing such as function evaluation, and the calculation takes time.

In addition, measuring devices implementing the above methods have very low resolution; when the object itself is small, or when its surface is uneven, measurement errors grow or measurement becomes impossible, so that reliability is lacking.

[Means for Solving the Problems]

In this invention, a solid object to be measured is placed on a horizontal support stand on a base, and a plurality of pillars are erected around the base. Fixed to each pillar is a non-contact measuring instrument comprising light-projecting means that irradiates the object with a vertical linear slit light longer than the object, steerable in the left-right direction, and a pair of imaging means that image the slit-light-irradiated portion of the object from two directions, to the left and right of the light-projecting means. The irradiation position of each instrument's light-projecting means is varied so that the irradiated portion moves left and right within the overlapping field of view of the two imaging means. On the basis of the slit-light position information in each instrument's pair of imaging outputs, the position of a point G(x, y, z) of the irradiated portion, in three-dimensional coordinates whose Z-axis and X-axis directions are the vertical and left-right directions, is obtained by the four arithmetic operations

x = (d − Kx)·{L/(L + (d − e)) − 1} + d
y = Ky·L/(L + (d − e))
z = Kz·r

where d and e are the X-axis distance data of the point G in the respective pair of imaging outputs, L is the X-axis spacing of the pair of imaging means, Kx and Ky are X-axis and Y-axis coefficients set on the basis of the lens magnification and mounting positions of the pair of imaging means, r is the scanning-line number of the point G in the pair of imaging outputs, and Kz is a coefficient determined by the number of scanning lines, their width, and the lens magnification. Each position on the surface of the solid object is thereby calculated and measured; this is the three-dimensional measurement method of the invention.

[Operation]

The light-projecting means of the measuring instruments surround the solid object and irradiate its circumferential surface with vertical slit light of ample length, and the irradiation position of each slit light moves in the left-right direction without any instrument being moved.

Furthermore, the irradiated portion of each slit light is imaged by each instrument's pair of imaging means from two directions, to the left and right of the light-projecting means. By simple four arithmetic operations based on the slit-light position information in each resulting pair of imaging outputs, the position of each point of each irradiated portion is calculated, and each position on the surface of the solid object is thereby calculated and measured.

[Embodiment]

Next, this invention will be described in detail with reference to the drawings showing one embodiment thereof.

First, in FIG. 1, which shows the measuring apparatus, 1 is a base, 2 is a horizontal support stand placed on the base 1, and 3 is the solid object to be measured, placed on the support stand 2. 4a, 4b, 4c, and 4d are four pillars erected at the four corners of the base 1; the inner faces of pillars 4a and 4c face each other across the object 3, as do the inner faces of pillars 4b and 4d.

5a, 5b, 5c, and 5d are four non-contact measuring instruments attached and fixed to the upper ends of the pillars 4a to 4d via fixing members 6a, 6b, 6c, and 6d. 7 is the light-projecting means provided in each of the instruments 5a to 5d, and 8α and 8β are a pair of imaging means provided to the left and right of the light-projecting means 7, each consisting of a two-dimensional sensor device such as a CCD area image sensor.

The light-projecting means 7 and the two imaging means 8α, 8β of each of the instruments 5a to 5d are constructed as shown in FIG. 2. In that figure, 10a is a light source, such as a xenon lamp fitted with a linear slit, that outputs linear slit light; 10b is an expanding lens, such as a convex cylindrical lens, that lengthens the slit light from the light source 10a; and 10c is a rotatable mirror, which irradiates the surface of the object 3 with linear slit light perpendicular to the support stand 2, movable in the left-right direction.

11αa and 11βa are imaging sensors each formed by arranging CCD light-receiving elements in a two-dimensional matrix of M rows and N columns; 11αb and 11βb are condensing lenses that focus the light reflected from the surface of the object 3 onto the sensors 11αa and 11βa, respectively. The sensor 11αa and lens 11αb form one imaging means 8α, and the sensor 11βa and lens 11βb form the other imaging means 8β.

To describe the measurement position in an XYZ three-dimensional coordinate system, as shown in FIG. 2, the left-right direction of FIG. 1 joining the centre points of the lenses 11αb and 11βb is taken as the X-axis direction, the direction in which the slit light is projected from the light-projecting means 7 onto the object 3 is taken as the Y-axis direction, and the up-down direction of FIG. 1, parallel to the projected slit light, is taken as the Z-axis direction.

The length of the slit light is set sufficiently greater than the height of the object 3, and the fields of view of the two imaging means 8α and 8β are fixed so that they image the slit-light-irradiated portion of the object 3 overlappingly.

The mirror 10c is rotated under the control of rotation-control means (not shown). As the mirror 10c rotates, the irradiation position of the slit light from the light-projecting means 7 varies, within the overlapping field of view of the imaging means 8α and 8β, in the direction perpendicular to the measuring direction, i.e. the X-axis direction; the slit-light-irradiated portion of the object 3 likewise moves in the X-axis direction.

That is, the slit light S of FIG. 2 shifts sequentially in the X-axis direction within the overlapping fields of view of the two imaging means 8α and 8β.

The imaging sensors 11αa and 11βa of the two imaging means 8α and 8β each image the slit-light-irradiated portion of the object 3 for each of the instruments 5a to 5d. Vertical slit-light images Sα and Sβ are then formed on the imaging planes Fα and Fβ of the sensors 11αa and 11βa, as shown for example in FIGS. 3a and 3b, and on both imaging planes only the portions of the slit-light images Sα and Sβ become bright.

Further, the light-receiving outputs of each column of light-receiving elements of the sensors 11αa and 11βa form the imaging output of one scanning line of each sensor, and the imaging outputs of the scanning lines are output sequentially from the two imaging means 8α and 8β.

The horizontal direction of FIGS. 3a and 3b corresponds to the X-axis direction and the vertical direction to the Z-axis direction. The horizontal lines A1, …, Am, Am+1, Am+2, Am+3, …, An in FIG. 3a denote the first to Nth scanning lines of the sensor 11αa, and the horizontal lines B1, …, Bm, Bm+1, Bm+2, Bm+3, …, Bn in FIG. 3b denote the first to Nth scanning lines of the sensor 11βa; the imaging outputs of corresponding scanning lines of the two sensors 11αa and 11βa are read out sequentially at the same timing.

The pair of analog imaging outputs read from the two sensors 11αa and 11βa of each of the instruments 5a to 5d is input to the image-processing means provided for each of the instruments 5a to 5d in the electronic computer 12 shown in FIG. 4.

Each of these image-processing means is constructed as shown in FIG. 5. In that figure, 13 is a clock circuit generating a clock signal; 14α and 14β are a pair of signal-processing circuits which, at the timing of the clock signal, capture the analog imaging output of each scanning line output sequentially from the sensors 11αa and 11βa of the imaging means 8α and 8β, slice it at a predetermined slice level, and form a digital image signal that goes high only at the portions of the slit-light images Sα and Sβ.

15α and 15β are a pair of address counters which, at the timing of the clock signal, count the distance within the pair of imaging outputs of the sensors 11αa and 11βa from the reference point at the left end of each scanning line to the point at which the digital image signals of the processing circuits 14α and 14β rise to a high-level pulse owing to the slit-light images Sα and Sβ, and output the X-axis distance data of each point of the irradiated portion in each of the pair of imaging outputs.

16 is an arithmetic circuit which calculates the position of each point of the slit-light-irradiated portion in the three-dimensional coordinate system by the four arithmetic operations described later, based on the slit-light position information: the pair of X-axis distance data input simultaneously from the counters 15α and 15β, and the Z-axis data of each point of the irradiated portion, consisting of the scanning-line number obtained by counting the clock signal and the preset scanning-line width.

17 is a storage unit that stores the coordinate position of each point of the irradiated portion calculated by the arithmetic circuit 16; 19 is the image-processing means consisting of the processing circuits 14α and 14β, the counters 15α and 15β, the arithmetic circuit 16, and the storage unit 17; 20 is a display-condition setting unit; and 21 is a recognition circuit which, on the basis of the conditions set in the setting unit 20, identifies the dimensions, surface state, shape, and so on of the object 3 from the coordinate positions of the points stored in the storage unit 17, and outputs to the display means 22 of FIG. 4 display signals of the stored coordinate positions and of the identified dimensions, surface state, shape, and so on.

As shown in FIG. 2, linear slit light is projected from the light-projecting means 7, and the irradiated portion of the slit light is imaged from two directions by the imaging means 8α and 8β to the left and right of the light-projecting means 7; slit-light images Sα and Sβ such as those of FIGS. 3a and 3b are formed on the imaging means 8α and 8β, respectively.

Further, the imaging outputs of the first scanning line A1 to the Nth scanning line An are output sequentially from the sensor 11αa of the imaging means 8α to the processing circuit 14α. When, for example, as shown in FIG. 6a, the imaging outputs of the Mth to (M+3)th scanning lines Am, Am+1, Am+2, Am+3 are output sequentially from the sensor 11αa to the processing circuit 14α, the imaging outputs of the Mth to (M+3)th scanning lines Bm, Bm+1, Bm+2, Bm+3 are output sequentially at the same timing from the sensor 11βa of the imaging means 8β to the processing circuit 14β, as shown in FIG. 7a.

The processing circuits 14α and 14β then slice the analog imaging output of each scanning line from the sensors 11αa and 11βa sequentially at the slice level l. Since the level l is set so as to extract only the portions of the slit-light images Sα and Sβ, only those portions of each scanning-line output are extracted, as shown in FIGS. 6b and 7b, and the imaging outputs of the two imaging means 8α and 8β are converted to digital form.

The digital signals of the processing circuits 14α and 14β are input to the counters 15α and 15β, respectively. As shown in FIGS. 6c and 7c, the counters 15α and 15β each form a reference-point pulse at the timing of the reference point d0 at the left end of each scanning line and, on the basis of those pulses, count the distances Dam, Dam+1, Dam+2, Dam+3 and Dbm, Dbm+1, Dbm+2, Dbm+3 from the reference point d0 to the X-axis positions am, am+1, am+2, am+3 and bm, bm+1, bm+2, bm+3 of the slit light in the respective scanning-line outputs, and output the X-axis distance data of the slit-light images Sα and Sβ to the arithmetic circuit 16.
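The slicing-and-counting operation of the circuits 14α, 14β and the counters 15α, 15β can be sketched digitally as follows. This is a minimal Python analogy of the analog hardware (the function name and sample values are hypothetical, not from the patent): each scanning line is compared against the slice level l, and the count of samples from the left-edge reference point to the first sample exceeding that level is the X-axis distance datum for the line.

```python
def slit_distance(scan_line, slice_level):
    """Count samples from the left-edge reference point to the first
    sample exceeding the slice level (the slit-light image); return
    None if the slit does not appear on this scanning line."""
    for count, sample in enumerate(scan_line):
        if sample > slice_level:
            return count
    return None

# One scanning line: dark background, bright slit image at index 5.
line = [10, 12, 11, 13, 12, 200, 190, 12, 11]
assert slit_distance(line, 100) == 5
assert slit_distance([10, 11, 12], 100) is None
```

Running one such count per scanning line, per sensor, yields the distance data d and e used in the equations below.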

Next, the calculation performed by the arithmetic circuit 16 will be described.

For simplicity of explanation, suppose the imaging fields of the two imaging means 8α and 8β are exactly equal and are set so that the left edge of the field of view coincides with the Z-axis. As shown in FIG. 8, suppose the light from a point G(x, y, z) of the irradiated portion of the slit light S at a certain instant is imaged through the centre points P(a, o, z) and Q(b, o, z) of the lenses 11αb and 11βb, respectively. Let U(d, c, z) and V(e, c, z) be the points at which the line segments joining each image point to the centre points P(a, o, z) and Q(b, o, z) cross the XZ plane passing through an arbitrary point c on the object side of the Y-axis. The point G(x, y, z) is then found as the intersection of the line through P(a, o, z) and U(d, c, z) with the line through Q(b, o, z) and V(e, c, z).

From the values of P(a, o, z), Q(b, o, z), U(d, c, z), and V(e, c, z), the X- and Y-axis components x and y of the point G(x, y, z) are found from the following equations (1) and (2).

x = (a·e − b·d)/(a − b − d + e) = (b·d − a·e)/((b − a) + (d − e))
  = (d − a)·{(b − a)/((b − a) + (d − e)) − 1} + d ……(1)

y = c·(a − b)/(a − b − d + e) = c·(b − a)/((b − a) + (d − e)) ……(2)

Here b − a in equations (1) and (2) is the spacing L between the two imaging sensors 11αa and 11βa, and d and e are the X-axis positions of the point G(x, y, z) on the imaging planes Fα and Fβ, determined by the magnification of the lenses 11αb and 11βb and the mounting positions of the imaging means 8α and 8β.
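Equations (1) and (2) can be checked numerically against a direct intersection of the two lines P–U and Q–V in the XY plane. The values below are arbitrary illustrative numbers, not taken from the patent:

```python
# Lens centres P(a, 0) and Q(b, 0); the auxiliary plane y = c carries
# U(d, c) and V(e, c).  Arbitrary test values:
a, b, c = 0.0, 100.0, 50.0
d, e = 30.0, 20.0

L = b - a
# Closed forms, equations (1) and (2):
x = (d - a) * (L / (L + (d - e)) - 1) + d
y = c * L / (L + (d - e))

# Direct intersection: points on P-U are (a + t*(d - a), t*c) and points
# on Q-V are (b + s*(e - b), s*c); equal y forces t = s, and solving the
# x-equation gives t = L / (L + (d - e)).
t = L / (L + (d - e))
assert abs((a + t * (d - a)) - x) < 1e-9
assert abs(t * c - y) < 1e-9
```

The parametric solution t = L/(L + (d − e)) reproduces both closed forms exactly, confirming the algebra of equations (1) and (2).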

The values d and e are obtained as distance data from the reference point d0 at the left end of the imaging planes Fα and Fβ, respectively.

The values a, b, and c are constants set by the mounting positions of the imaging means 8α and 8β, the magnification of the lenses 11αb and 11βb, and so on.

Accordingly, writing Kx and Ky for the X- and Y-axis constants a and c, which are set on the basis of the magnification of the lenses 11αb and 11βb and the positions of the imaging means 8α and 8β, the X- and Y-axis components x and y of the point G(x, y, z) are found by calculating the following equations (3) and (4).

x = (d − Kx)·{L/(L + (d − e)) − 1} + d ……(3)

y = Ky·L/(L + (d − e)) ……(4)

Meanwhile, the Z-axis component z of the point G(x, y, z) is found by calculating the following equation (5) from the scanning-line number r of the point G(x, y, z) and a coefficient Kz determined by the number of scanning lines, their width, and the magnification of the lenses 11αb and 11βb:

z = Kz·r ……(5)

Since Kx, Ky, and L in equations (3) and (4) and Kz in equation (5) are constants, d and e are obtained as the pair of X-axis distance data input from the counters 15α and 15β, and r is obtained by counting the clock signal, the arithmetic circuit 16 calculates the position of the point G(x, y, z) by the four arithmetic operations of equations (3) to (5), based on the slit-light position information consisting of the preset position information (the data Kx, Ky, Kz, and L) and the detected position information formed from the pair of distance data input from the counters 15α and 15β and the clock-signal count data. By applying this calculation to every point of the slit-light-irradiated portion, the three-dimensional position of each such point in the XYZ coordinate system of FIG. 2 is calculated.
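Putting equations (3) to (5) together, the computation performed by the arithmetic circuit 16 amounts to the following sketch (the function name and the calibration values are hypothetical illustrations, not taken from the patent):

```python
def measure_point(d, e, r, L, Kx, Ky, Kz):
    """Compute the 3-D position of an irradiated point G by equations
    (3)-(5): d, e are the X-axis distance data of G in the two imaging
    outputs, r is the scanning-line number, L is the sensor spacing,
    and Kx, Ky, Kz are the calibration coefficients."""
    x = (d - Kx) * (L / (L + (d - e)) - 1) + d   # equation (3)
    y = Ky * L / (L + (d - e))                   # equation (4)
    z = Kz * r                                   # equation (5)
    return (x, y, z)

# Illustrative values only:
x, y, z = measure_point(d=30.0, e=20.0, r=120, L=100.0,
                        Kx=0.0, Ky=50.0, Kz=0.5)
```

Only additions, subtractions, multiplications, and divisions appear, which is the point of the claim: no trigonometric or other function evaluation is needed per point.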

When the fields of view of the two imaging means 8α and 8β do not overlap completely, or when the vertical and horizontal directions of the imaging planes Fα and Fβ are offset from the Z- and X-axes, the value of each equation is multiplied by a correction coefficient corresponding to the amount of offset to calculate the three-dimensional position of each point of the irradiated portion.

When the mirrors 10c provided in the light-projecting means 7 of the instruments 5a to 5d rotate by the same amount at the same timing and the slit-light irradiation positions of the instruments 5a to 5d vary accordingly, the slit-light-irradiated portion of each instrument moves in the X-axis direction within the overlapping field of view of that instrument's two imaging means 8α and 8β. The three-dimensional position of each point of each instrument's irradiated portion is then calculated by the four arithmetic operations described above, whereby the position of each point on the surface of the object 3 is calculated and measured.

Since the origin of the XYZ coordinate system of FIG. 2 differs for each of the instruments 5a to 5d, the position values calculated by the above operations are referred to a different reference point for each of the instruments 5a to 5d.

The position values calculated for each of the instruments 5a to 5d must therefore be converted into values in a single fixed reference three-dimensional coordinate system common to all the instruments.

To convert the position value of each point into the reference three-dimensional coordinate system, a measurement reference gauge is in practice erected on the support stand 2 at a position imaged by each of the instruments 5a to 5d, and measurement is performed by the following procedure using the gauge.

That is, before measurement, the position of the origin of the three-dimensional coordinate system of each of the instruments 5a to 5d is measured from the positions of the graduations of the imaged reference gauge, and the X-, Y-, and Z-axis components of the coordinate transformation that converts the origin of each instrument's coordinate system into the position of a reference three-dimensional coordinate system set in advance, for example inside the object 3, are calculated.

Then the X-, Y-, and Z-axis components of each point position calculated in the coordinate system of each of the instruments 5a to 5d are added to, or multiplied by, the corresponding transformation components, converting each point position into a value in the reference three-dimensional coordinate system.
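The per-instrument conversion described above amounts to applying a per-axis offset (and, where a correction is needed, a per-axis scale factor) obtained from the calibration gauge. A minimal sketch, with hypothetical gauge-derived values:

```python
def to_reference_frame(point, offset, scale=(1.0, 1.0, 1.0)):
    """Convert a point measured in one instrument's coordinate system
    into the common reference system by per-axis multiplication and
    addition, as described for the instruments 5a-5d."""
    return tuple(p * s + o for p, s, o in zip(point, scale, offset))

# Hypothetical offset of one instrument's origin from the reference origin:
converted = to_reference_frame((27.0, 45.0, 60.0), offset=(-10.0, 5.0, 0.0))
assert converted == (17.0, 50.0, 60.0)
```

Each instrument 5a to 5d would use its own offset (and scale), so that the four partial measurements merge into one consistent surface model.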

By the above, the three-dimensional position of each point on the surface of the object 3 is calculated and measured.

On the basis of the measured coordinate positions of the points on the surface of the object 3, the recognition circuit 21 identifies the dimensions, surface state, shape, and so on of the object 3, and, according to the conditions set in the setting unit 20, the measured coordinate positions and the identified dimensions, surface state, shape, and so on are shown on the display means 22.

Thus, in the embodiment described above, the slit-light irradiation portion of each measuring device 5a to 5d moves in the left-right direction within the overlapping fields of view of that device's pair of imaging means 8α, 8β, and each irradiated portion is imaged from two directions by the pair of imaging means 8α, 8β.

Furthermore, the computer 12, which receives the pair of imaging outputs from each measuring device 5a to 5d, calculates and measures the position of each point on each slit-light-irradiated portion — that is, each point on the solid 3 — by simple four-function arithmetic based on the positional information of the slit light in each pair of imaging outputs.

The measurement position is changed by shifting the irradiation position of each slit light in the left-right direction without moving the measuring devices 5a to 5d, and each point on the surface of the solid 3 is obtained by simple four-function arithmetic; the measurement position can therefore be changed quickly without mechanical movement, and each position on the surface of the solid 3 is calculated and measured in a shorter time than with conventional non-contact methods.

In addition, because the irradiated portion of each slit light is imaged from the two directions, left and right, by a pair of imaging means 8α, 8β to obtain a pair of imaging outputs, the angles between the irradiation optical axis of the light-projecting means 7 and each of the imaging means 8α, 8β can be made small compared with, for example, a method in which each measuring device 5a to 5d is formed from a single light-projecting means and a single television camera and only one imaging output is obtained for each irradiated portion; accurate measurement is therefore possible even when the solid 3 is small or when its surface is uneven.

Furthermore, because the measurement is performed without contact, even a solid 3 made of a flexible, easily deformed material such as rubber can be measured easily.

Also, because linear slit light is used, the energy density is low and weak light suffices, so the solid 3 is not distorted by illumination heat when illumination is used.

The pair of imaging means 8α, 8β of each measuring device 5a to 5d may also be constituted by MOS-type image sensors, image pickup tubes, or the like.

The irradiation position of the slit light projected from the light-projecting means 7 can also be varied, for example, by magnetic action, without rotating the reflecting mirror 10c.

Furthermore, it goes without saying that the number of support columns and measuring devices provided around the base 1 need only be two or more.

[Effects of the Invention] As described above, according to the three-dimensional measurement method of this invention, a plurality of non-contact measuring devices 5a to 5d are mounted on support columns 4a to 4d so as to surround the solid 3 to be measured, which is placed on the support stand 2 on the base 1, and the light-projecting means 7 of each measuring device 5a to 5d irradiates the solid 3 with vertical linear slit light longer than the solid's height, movable in the left-right direction. The irradiated portion of each slit light is imaged by the pair of imaging means 8α, 8β of each measuring device 5a to 5d, and based on the positional information of the slit light in each pair of imaging outputs thus obtained, the position of a point G(x, y, z) of the irradiated portion, in three-dimensional coordinates whose Z-axis and X-axis directions are the vertical and left-right directions, is obtained from the simple arithmetic

x = (d − Kx)·{L/(L + (d − e)) − 1} + d
y = Ky·L/(L + (d − e))
z = Kz·r

where
d, e are the X-axis distance data of the point G in the respective imaging outputs of the pair,
L is the X-axis spacing of the pair of imaging means,
Kx, Ky are X-axis and Y-axis coefficients set according to the lens magnification, position, and so on of the pair of imaging means,
r is the scanning-line number of the point G in the pair of imaging outputs, and
Kz is a coefficient determined by the number and width of the scanning lines and the lens magnification,

so that accurate non-contact measurement can be performed.
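The point-position arithmetic above can be sketched in a short program. This is an illustrative reading, not the patent's implementation: the fraction is grouped as L/(L + (d − e)), which is one reading of the flattened typography of the formulas, and all numeric values (L, Kx, Ky, Kz and the inputs d, e, r) are invented examples rather than real calibration data.

```python
# Sketch of the slit-light point-position arithmetic from a pair of imaging
# outputs, following the formulas in the text above. The coefficient values
# used in the example call are made up, not taken from the patent.

def point_position(d, e, r, L, Kx, Ky, Kz):
    """Position G(x, y, z) of a slit-light point from stereo slit data.

    d, e   : X-axis distance data of the point in the two imaging outputs
    r      : scanning-line number of the point
    L      : X-axis spacing of the pair of imaging means
    Kx, Ky : coefficients set from lens magnification and placement
    Kz     : coefficient set from scanning-line count, width, magnification
    """
    denom = L + (d - e)               # disparity-corrected baseline term
    x = (d - Kx) * (L / denom - 1.0) + d
    y = Ky * L / denom
    z = Kz * r
    return x, y, z

# Example with assumed calibration: imaging means 200 units apart,
# arbitrary Kx, Ky, Kz, and slit positions d, e on scan line r = 120.
x, y, z = point_position(d=12.0, e=8.0, r=120, L=200.0, Kx=2.0, Ky=1.5, Kz=0.5)
print(x, y, z)
```

Only the four basic arithmetic operations appear, which is what lets the computer 12 evaluate the position of every irradiated point quickly compared with more elaborate reconstruction methods.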

[Brief Description of the Drawings]

The drawings show one embodiment of the three-dimensional measurement method of this invention. Fig. 1 is a perspective view of the measuring apparatus; Fig. 2 is an exploded perspective view of the non-contact measuring device of Fig. 1; Figs. 3a and 3b are front views of the imaging screens of the two image sensors of Fig. 2; Fig. 4 is a circuit block diagram; Fig. 5 is a block diagram of the image processing means in the computer of Fig. 4; Figs. 6a to 6c and Figs. 7a to 7c are timing charts for explaining the operation of Fig. 5; and Fig. 8 is a schematic diagram for explaining the operation of the arithmetic circuit of Fig. 5. 3 ... solid; 5a to 5d ... non-contact measuring devices; 7 ... light-projecting means; 8α, 8β ... imaging means.

Claims (1)

[Scope of Claims] 1. A three-dimensional measurement method comprising: placing a solid to be measured on a horizontal support stand on a base; erecting a plurality of support columns around the base; fixing to each support column a non-contact measuring device having a light-projecting means for irradiating the solid with vertical linear slit light longer than the solid, variable in the left-right direction, and a pair of imaging means for imaging the slit-light-irradiated portion of the solid from the two directions, left and right, of the light-projecting means; varying the irradiation position of the light-projecting means of each measuring device so as to move the irradiated portion in the left-right direction within the overlapping fields of view of the pair of imaging means; and, based on the positional information of the slit light in the pair of imaging outputs of each measuring device, obtaining the position of a point G(x, y, z) of the irradiated portion, in three-dimensional coordinates whose Z-axis and X-axis directions are the vertical and left-right directions, from the arithmetic

x = (d − Kx)·{L/(L + (d − e)) − 1} + d
y = Ky·L/(L + (d − e))
z = Kz·r

where
d, e are the X-axis distance data of the point G in the respective imaging outputs of the pair,
L is the X-axis spacing of the pair of imaging means,
Kx, Ky are X-axis and Y-axis coefficients set according to the lens magnification, position, and so on of the pair of imaging means,
r is the scanning-line number of the point G in the pair of imaging outputs, and
Kz is a coefficient determined by the number and width of the scanning lines and the lens magnification,

thereby calculating and measuring the position of each point on the surface of the solid.
JP440985A 1985-01-14 1985-01-14 Method for measuring solid body Granted JPS61162706A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP440985A JPS61162706A (en) 1985-01-14 1985-01-14 Method for measuring solid body

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP440985A JPS61162706A (en) 1985-01-14 1985-01-14 Method for measuring solid body

Publications (2)

Publication Number Publication Date
JPS61162706A JPS61162706A (en) 1986-07-23
JPH0481125B2 true JPH0481125B2 (en) 1992-12-22

Family

ID=11583516

Family Applications (1)

Application Number Title Priority Date Filing Date
JP440985A Granted JPS61162706A (en) 1985-01-14 1985-01-14 Method for measuring solid body

Country Status (1)

Country Link
JP (1) JPS61162706A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010069301A (en) * 2008-09-18 2010-04-02 Steinbichler Optotechnik Gmbh Device for determining three-dimensional coordinate of object, tooth in particular

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB8719951D0 (en) * 1987-08-24 1987-09-30 Lbp Partnership Three-dimensional scanner
WO1994015173A1 (en) * 1992-12-18 1994-07-07 3D Scanners Ltd. Scanning sensor
JPH11108633A (en) * 1997-09-30 1999-04-23 Peteio:Kk Three-dimensional shape measuring device and three-dimensional engraving device using the same
KR100394208B1 (en) * 2000-07-10 2003-08-09 남윤자 Apparatus and method for measuring human body
JP4851931B2 (en) * 2006-12-28 2012-01-11 株式会社島精機製作所 Human body shape measuring device and measuring method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5537982A (en) * 1978-09-11 1980-03-17 Ishikawajima Harima Heavy Ind Co Ltd Solid-shape detector for characteristic test of deformation of curved-surface body
AT367552B (en) * 1979-05-11 1982-07-12 Chlestil Gustav Dkfm Ing METHOD FOR THE PHOTOGRAPHIC PRODUCTION OF DATA CARRIERS FOR REPRODUCTION OF THREE-DIMENSIONAL OBJECTS, DEVICE FOR IMPLEMENTING THE METHOD AND REPRODUCTION DEVICE
JPS5733304A (en) * 1980-08-06 1982-02-23 Hitachi Ltd Method and device for shape inspection
JPS58206909A (en) * 1982-05-07 1983-12-02 Yokogawa Hokushin Electric Corp Arbitrary shape measuring device for objects

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010069301A (en) * 2008-09-18 2010-04-02 Steinbichler Optotechnik Gmbh Device for determining three-dimensional coordinate of object, tooth in particular

Also Published As

Publication number Publication date
JPS61162706A (en) 1986-07-23

Similar Documents

Publication Publication Date Title
EP2132523B1 (en) Method and device for exact measurement of objects
JP3511450B2 (en) Position calibration method for optical measuring device
CA2957077C (en) Method and device for three-dimensional surface detection with a dynamic reference frame
US5612905A (en) Three-dimensional measurement of large objects
US20010030754A1 (en) Body spatial dimension mapper
JPH06123610A (en) Method and apparatus for optical measurement of objective
US6927864B2 (en) Method and system for determining dimensions of optically recognizable features
JPH0481125B2 (en)
Araki et al. High speed and continuous rangefinding system
JPH0481124B2 (en)
JP2002005622A (en) Method for detecting arrangement parameter in optical shaping measuring apparatus provided with plural light- section sensors
JP2795790B2 (en) Sensor coordinate correction method for three-dimensional measuring device
JP2678127B2 (en) Optical measuring device
JPS61159102A (en) Two-dimensional measuring method
JPH02271208A (en) Measuring apparatus of three-dimensional shape
Kurada et al. A trinocular vision system for close-range position sensing
JPS6131906A (en) three-dimensional measuring device
JP2588597Y2 (en) Optical length measuring device
Hedstrand et al. Improving Photogrammetry Instrument Performance through Camera Calibration for Precision Digital Manufacturing
JP3415921B2 (en) Length or distance measurement method and calibration jig for measurement
JP3477921B2 (en) Projector
JPS6180008A (en) Shape measuring apparatus
JPS6228402B2 (en)
JPH04200196A (en) Calibrating method for ccd camera
JPS63250501A (en) Composite image sensor