CA1255788A - Method and apparatus for 3-dimensional stereo vision - Google Patents
Method and apparatus for 3-dimensional stereo vision
- Publication number
- CA1255788A
- Authority
- CA
- Canada
- Prior art keywords
- image
- focusing systems
- points
- image sensing
- sensing areas
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/243—Image signal generators using stereoscopic image cameras using three or more 2D image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/257—Colour aspects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
- Measurement Of Optical Distance (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Image Generation (AREA)
- Image Analysis (AREA)
- Stereoscopic And Panoramic Photography (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Image Input (AREA)
Abstract
ABSTRACT OF THE DISCLOSURE
A method of forming a three-dimensional stereo vision is disclosed in which, in order to prevent the erroneous detection of an object point that occurs in the conventional method using two image focusing lens systems, three image points of an object point, formed by three logically and/or physically separate image focusing lens systems and three image sensing surfaces, are selected from a multiplicity of image points on an image sensing surface by using the fact that the positional relation among three image points of the same object point is similar to the positional relation among the three image focusing lens systems, and the positional information of the three selected image points on the image sensing surface is used for obtaining three-dimensional distance information of the object point.
Description
BACKGROUND OF THE INVENTION
The present invention relates to a visual system, and more particularly to a method of and an apparatus for forming a passive three-dimensional stereo vision which is capable of extracting three-dimensional distance information.
The above method and apparatus are applicable to various fields requiring an accurate, high-speed passive visual system, such as an automobile (for the purpose of guiding the automobile when it is entered into a garage, is parked, or runs on a congested road or a highway, and for the purpose of facilitating the navigation of the automobile on an ordinary road), vehicles other than the automobile, a robot, factory automation, laboratory automation, office automation, building automation, home automation, and precise measurement.
In other words, the method and apparatus are used in operatorless repairs, an inspecting operation, an operatorless wagon, an operatorless crane, operatorless construction, civil engineering machinery, the assembly of parts, measurement of the number of queuing persons, a burglar-proof system, a disaster prevention system, a blindman guiding system, a speed detector, a distance detector, an object detector, an automatic focusing mechanism for each of a microscope, an enlarger, a projector, a copying machine, an optical disc apparatus, an image pickup device, a camera and others, character/figure recognition, the recognition of a number plate, a stereoscopic camera (for a still or moving object) and a game/leisure machine.
In a conventional method of forming a three-dimensional stereo vision, two brightness data obtained by two eyes are caused to correspond to each other (that is, corresponding points of the two brightness data are determined) by pattern matching techniques. What is meant by pattern matching techniques is that the scale (namely, measure) of a coordinate space is enlarged or contracted at each of the points in that portion of the coordinate space which has a brightness distribution, so that the difference between the brightness distribution in the coordinate space whose scale has been changed and a reference brightness distribution becomes minimum under some criterion.
Such pattern matching techniques encounter two problems, the first one of which is as follows. The number of coordinate points where the scale of the coordinate space is to be enlarged or contracted is equal to the number of bright points (herein referred to as "bright lines") formed on a sensing surface, and the position of each of the coordinate points can be freely changed, provided that the configurational order of the bright lines is not changed. Accordingly, a vast number of combinations of scale transformation are basically allowed, and a combination capable of minimizing the difference between the brightness distribution obtained after scale transformation and the reference brightness distribution is selected from a multiplicity of combinations. Thus, it takes a lot of time to carry out pattern matching by digital processing.
The second problem of the pattern matching techniques is as follows. As mentioned above, the scale transformation is made so that the configurational order of the bright lines is not changed. In a case where the brightness distribution at an object system is such that a first group of bright lines, which is formed on a sensing surface by the first image focusing lens system, is different in the configurational order of bright lines from a second group of bright lines, which is formed on the sensing surface by the second image focusing lens system, pattern matching is attended with a fundamental error. This error arises for an object system in which the spacing between two object points in the direction of depth is greater than the spacing between the object points in a direction corresponding to the change of horizontal azimuth angle. It is to be noted that, when the object points and the image focusing lens systems are in the same plane surface, the direction from the image focusing lens systems towards the object points in the above plane surface is herein referred to as the "direction of depth", and an angle between the straight line connecting the image focusing lens systems and the straight line connecting one of the image focusing lens systems with one of the object points is herein referred to as a "horizontal azimuth angle." Incidentally, an angle between the above plane surface and the image sensing surface is herein referred to as the "vertical azimuth angle", which will be mentioned later.
In a case where an object system is not formed of independent object points but has continuous spatial distribution, the continuous distribution is converted into discrete distribution to make it possible to use digital processing. Accordingly, this case also encounters the above-mentioned two problems.
Further, even when a large number of brightness distribution data are used for increasing the accuracy of pattern matching, these two problems are unavoidable, provided that the above-mentioned pattern matching techniques are used.
SUMMARY OF THE INVENTION
An object of the present invention is to provide a method of and an apparatus for forming three-dimensional distance information which are able to solve the problems of the prior art and to form a stereo vision in a short time without producing the fundamental error based upon the spatial distribution of object points within an object system.
In order to attain the above object, a method of and an apparatus for forming three-dimensional distance information in accordance with the present invention use three or more image focusing lens systems. In more detail, in the above-mentioned method and apparatus, special attention is paid to the fact that a special relation exists between the geometrical positional relation among image points (namely, bright lines) of the same object point which are formed on a sensing surface by three or more image focusing lens systems, and the geometrical positional relations among the image focusing lens systems themselves and between the sensing surface and the image focusing lens systems, and bright lines of the same object point are selected from a multiplicity of bright lines on the sensing surface by using the special relation, to eliminate the two problems of the conventional pattern matching techniques.
An example of the above special relation is that, when three or more image focusing lens systems (for example, eyes) are placed in a plane, and the plane is parallel to an image sensing surface, image points (namely, bright lines) of the same object point which are formed on the sensing surface by the image focusing lens systems are spaced apart from one another so as to have a geometrical positional relation similar to the geometrical positional relation among the image focusing lens systems. In other words, a figure formed by straight lines which connect the image focusing lens systems is similar to a figure formed by straight lines which connect image points of the same object point. Accordingly, when those ones of the bright lines on the sensing surface which are formed by each of the image focusing lens systems can be known, for example, when a plurality of sensing areas each corresponding to one of the image focusing lens systems are provided on imaginary planes, bright lines which are formed by different image focusing lens systems and have a geometrical positional relation similar to the geometrical positional relation among the image focusing lens systems can be extracted, as candidates for corresponding points, from a multiplicity of bright lines. (Incidentally, even in a case where a single physical sensing surface is used, if the image focusing lens systems form the image of an object system at different time moments, a plurality of sensing areas will be formed in effect.)
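Under a simple pinhole model (an illustrative geometry of my own choosing, not taken from the patent), this similarity relation can be verified numerically: the spacings between the image points of one object point reproduce the spacings between the lenses up to a common scale factor.

```python
# Pinhole sketch: three collinear lenses in a plane parallel to the
# sensing surface. The lens at x = c images an object point (X, Z) at
#   u = c + f * (c - X) / Z
# so u_j - u_i = (c_j - c_i) * (1 + f / Z): the image points form a
# figure similar to the figure formed by the lenses.

def image_point(c, X, Z, f=1.0):
    """Sensor x-coordinate of object (X, Z) imaged by the lens at x = c."""
    return c + f * (c - X) / Z

lenses = [0.0, 2.0, 5.0]          # three collinear lens positions
X, Z = 1.3, 4.0                    # one object point (x, depth)
u = [image_point(c, X, Z) for c in lenses]

lens_ratio = (lenses[1] - lenses[0]) / (lenses[2] - lenses[1])
image_ratio = (u[1] - u[0]) / (u[2] - u[1])
print(lens_ratio, image_ratio)     # the two ratios agree
```

The scale factor (1 + f/Z) depends on the depth Z, which is exactly what makes the spacing usable for distance recovery once corresponding bright lines are identified.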
Those candidates for corresponding points which are extracted in the above-mentioned manner can indicate all of the object points, excepting an object point whose image point is not formed on a sensing surface. Moreover, even in a case where the brightness distribution at the object system is such that the configurational order of bright points which are formed on a sensing surface by one of the image focusing lens systems is different from the configurational order of bright points which are formed on the sensing surface by another image focusing lens system, the above-mentioned fundamental error is not generated in selecting the candidates.
In some cases, however, a virtual object point which does not actually exist may be extracted as an object, since bright lines satisfying the above special relation can be found accidentally. The probability of extracting the virtual object point can be greatly reduced by increasing the number of image focusing lens systems, disposing the image focusing lens systems in a plane at random, or disposing the image focusing lens systems in different planes. This is because these countermeasures make the conditions for determining corresponding points severe.
All the candidates for corresponding points thus obtained may be used as the corresponding points. In fact, the candidates can be reduced on the basis of brightness data. In general, when it is possible to detect brightness data, color data (namely, data on the frequency characteristics of incident light), data on the phase of incident light, data on the plane of polarization of incident light, and the differential value of each of these data with respect to time or space, these data and their differentiated values can be used for selecting appropriate ones from the candidates. Now, explanation will be made on a case where brightness data is used for reducing the candidates, by way of example. When the brightness ratio among bright lines of the same object point which are formed on the sensing surface by the image focusing lens systems is previously known, only corresponding points having the above brightness ratio are extracted. The brightness of each bright line is dependent upon the direction dependence of the reflectivity of the object point and the distance between the object point and the sensing surface. The distance between the object point and the sensing surface has been known when the candidates for the corresponding points are extracted. Accordingly, only corresponding points based upon a realistic direction dependence of reflectivity are extracted. In a case where data other than the brightness data is used, corresponding points can be extracted by a method corresponding to the data.
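A minimal sketch of such brightness-based pruning might look as follows; the expected ratio and tolerance are assumptions of mine, since the patent does not specify concrete values:

```python
# Sketch: prune corresponding-point candidates with brightness data.
# Each candidate is a triple of (position, brightness) bright lines,
# one per focusing system; a candidate survives only if the brightness
# ratios among its three lines are close to an expected ratio (here
# assumed known in advance, e.g. from lens geometry and distances).

def plausible(candidate, expected_ratio=(1.0, 1.0), tol=0.2):
    """candidate: three (position, brightness) pairs; keep it only if
    b1/b2 and b3/b2 are within tol of the expected ratios."""
    (_, b1), (_, b2), (_, b3) = candidate
    r12, r32 = b1 / b2, b3 / b2
    return (abs(r12 - expected_ratio[0]) <= tol
            and abs(r32 - expected_ratio[1]) <= tol)

candidates = [
    ((-0.3, 1.00), (2.2, 1.05), (5.9, 0.95)),   # consistent brightness
    ((-0.3, 1.00), (2.2, 0.30), (5.9, 0.95)),   # implausible ratio
]
kept = [c for c in candidates if plausible(c)]
print(len(kept))   # 1
```

The same filtering skeleton would apply to color, phase, or polarization data, with `plausible` replaced by a test appropriate to that data.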
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a schematic diagram for explaining a method of forming a three-dimensional stereo vision through a conventional pattern matching technique using two brightness data which are obtained by two image focusing lens systems.
Fig. 2 is a schematic diagram for explaining the generation of a fundamental error in the method of forming a three-dimensional stereo vision through the conventional pattern matching technique.
Fig. 3 is a schematic diagram for explaining the principle of a method of forming a three-dimensional stereo vision in accordance with the present invention.
Fig. 4 is a block diagram showing an apparatus for carrying out the first embodiment of a method of forming a three-dimensional stereo vision in accordance with the present invention.
Fig. 5 is a block diagram (partly pictorial and partly schematic) showing an apparatus for carrying out the second embodiment of a method of forming a three-dimensional stereo vision in accordance with the present invention.
Figs. 6 and 7 are schematic diagrams for explaining a method of reducing candidates for corresponding points in accordance with the present invention.
DESCRIPTION OF THE PREFERRED EMBODIMENTS

In order to facilitate an understanding of the present invention, the prior art will first be explained by reference to Figs. 1 and 2. Fig. 1 is a schematic diagram for explaining the pattern matching for two brightness data which are obtained by two image focusing lens systems, and this diagram shows a case where an object system is formed of three object points a, b and c. In the prior art, a first image focusing lens system and a second image focusing lens system are used, and a brightness distribution is formed on a sensing surface by each of these image focusing lens systems. Referring to Fig. 1, three points a, b and c which are spaced apart from one another make up the object system, and the brightness distribution which is formed on the sensing surface by the first image focusing lens system is the discrete brightness distribution made up of three bright lines a1, b1 and c1. Further, the brightness distribution which is formed on the sensing surface by the second image focusing lens system is the discrete brightness distribution made up of three bright lines a2, b2 and c2. When the bright lines a1, b1, c1, a2, b2 and c2 are obtained, it is determined by some method which of the bright lines obtained by one of the image focusing lens systems corresponds to each of the bright lines obtained by the other image focusing lens system. Then, the distance between the sensing surface and the object system in the direction of depth is determined on the basis of the distance ℓ between corresponding bright lines, and the horizontal azimuth angle is determined on the basis of the distance ℓg indicating the geometrical positional relation between one image focusing lens system and a bright line formed by this focusing lens system. Three-dimensional distance information with respect to the object system is then obtained by using the above distance in the direction of depth, the above horizontal azimuth angle, and other factors.
In the prior art, the bright lines obtained by the first image focusing lens system are forced to correspond to the bright lines obtained by the second image focusing lens system (that is, corresponding bright lines are determined) by some pattern matching technique. Fundamentally, what is meant by pattern matching techniques is that the scale (namely, measure) of a coordinate space is enlarged or contracted at each of the points in that portion of the coordinate space which has a brightness distribution, so that the difference between the brightness distribution in the coordinate space whose scale has been changed and the reference brightness distribution becomes minimum under some criterion. In the case shown in Fig. 1, the change of the scale corresponds to the change of each of the distances between the bright lines a2 and b2 and between the bright lines b2 and c2, that is, the change of the positions of the bright lines a2, b2 and c2 without a change in the configurational order of these bright lines. It is meant by pattern matching that by changing the positions of the bright lines a2, b2 and c2 in the above manner, the positional relation among the bright lines a2, b2 and c2 becomes equal to the positional relation among the bright lines a1, b1 and c1 (that is, the brightness distribution including the bright lines a2, b2 and c2 becomes entirely equal to the brightness distribution including the bright lines a1, b1 and c1).
Fig. 2 is a schematic diagram for explaining the generation of the previously-mentioned fundamental error in a case where an object system includes two object points such that the spacing between the object points in the direction of depth is greater than the spacing between the object points in the direction corresponding to the change of horizontal azimuth angle. Referring to Fig. 2, the object system includes an object point a', instead of the object point a of Fig. 1, and the spacing between the points a' and b in the direction of depth is greater than the spacing between the points a' and b in the direction corresponding to the change of horizontal azimuth angle, to such an extent that the point a' lies between the two straight lines which connect the point b with the first and second image focusing lens systems. As shown in Fig. 2, the brightness distribution which is formed on the sensing surface by the first image focusing lens system is different in the configurational order of bright lines from the brightness distribution which is formed on the sensing surface by the second image focusing lens system.
That is, the bright lines formed by the first image focusing lens system are arranged in the order of c1, a'1 and b1, while the bright lines formed by the second image focusing lens system are arranged in the order of c2, b2 and a'2. When pattern matching is performed for these brightness distribution data, there is a strong probability that the bright line a'1 is forced to correspond to the bright line b2 and the bright line b1 is forced to correspond to the bright line a'2. As a result, the true object points a' and b are not detected from the brightness distribution data on the sensing surface, but the object system is judged to include the false object points m and n. That is, a fundamental error is generated.
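The ordering reversal that produces this fundamental error can be reproduced with a toy two-lens projection; the coordinates and focal length below are illustrative choices of mine, not values from the patent:

```python
# Two object points: b is near, a' is far and almost directly behind b.
# With two lenses, the left and right images see the bright lines in
# opposite order, so order-preserving pattern matching must mismatch.
pts = {'b': (1.00, 1.0), "a'": (1.05, 5.0)}   # name -> (x, depth z)

def project(lens_x, name, f=1.0):
    """Sensor coordinate of a point seen through a pinhole lens at lens_x."""
    x, z = pts[name]
    return f * (x - lens_x) / z

left_order = sorted(pts, key=lambda n: project(0.0, n))   # lens 1 at x=0
right_order = sorted(pts, key=lambda n: project(2.0, n))  # lens 2 at x=2
print(left_order, right_order)   # the two orders are reversed
```

Because a' lies between the two rays from b to the lenses, a' projects to the left of b in one image and to the right of b in the other, which is exactly the configuration an order-preserving matcher cannot handle.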
Now, a case where pattern matching is performed for brightness distribution data obtained by three image focusing lens systems which are disposed on a straight line parallel to a sensing surface, by utilizing the principle mentioned in "SUMMARY OF THE INVENTION", will be briefly explained by reference to Fig. 3.
Fig. 3 shows a case where a third image focusing lens system is added to the first and second image focusing lens systems of Fig. 2. In Fig. 3, the first brightness distribution, which is formed on the sensing surface by the first focusing system, is disposed in a first imaginary plane which is spaced apart from a second imaginary plane, in which there is disposed the second brightness distribution formed on the sensing surface by the second focusing system, by the distance ℓ12 between the first and second focusing systems, and the third brightness distribution, which is formed on the sensing surface by the third focusing system, is disposed in a third imaginary plane which is spaced apart from the second imaginary plane by the distance ℓ23 between the second and third focusing systems.
The above arrangement of the first, second and third brightness distribution in the respective first, second and third imaginary planes can clearly show a special relation which exists between the geometrical positional relation among bright lines of the same object point which are formed by the focusing systems, and the geometrical positional relations among the focusing systems themselves and between the sensing surface and the focusing systems.
In the case of Fig. 3, the special relation is that the ratio of the distance between the first and second ones of three bright lines of the same object point to the distance between the second and third bright lines is equal to the ratio of the distance between the first and second focusing systems to the distance between the second and third focusing systems. Accordingly, when the first brightness distribution formed by the first focusing system, the second brightness distribution formed by the second focusing system and the third brightness distribution formed by the third focusing system are arranged so that the ratio of the distance between the first brightness distribution and the second brightness distribution to the distance between the second brightness distribution and the third brightness distribution is equal to the ratio of the distance ℓ12 to the distance ℓ23, three bright lines caused by the same object point lie on a straight line.
Conversely, when a group of bright lines lying on the same straight line is found, candidates for corresponding points can be obtained. In Fig. 3, straight lines each indicating corresponding bright lines are expressed by broken lines A, A', B, C and P. One of the above straight lines can be determined in the following manner. That is, when a bright line in the brightness distribution formed by a focusing system (for example, the second focusing system), that is, a bright line in the second brightness distribution is specified, and a bright line is present at each of two positions, one of which is spaced apart from the point on the first brightness distribution corresponding to the specified bright line in the leftward direction by a given distance, and the other of which is spaced apart from the point on the third brightness distribution corresponding to the specified bright line in the rightward direction by a distance equal to the product of the given distance and the ratio ℓ23/ℓ12, the above-mentioned straight line can be obtained by connecting these positions. The extraction of the above positions in the first and third brightness distribution capable of satisfying the condition with respect to the ratio ℓ23/ℓ12, and the AND operation for the two pieces of brightness information at the above positions, can be readily and rapidly performed by hardware or software.
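The leftward/rightward position search and the AND operation described here might be sketched as follows; the bright-line positions, step size and tolerance are illustrative values of my own, not from the patent:

```python
# For each specified bright line u2 in the second distribution, scan
# candidate offsets d: a candidate triple exists when a bright line is
# found both at u2 - d in the first distribution AND at
# u2 + d * (l23 / l12) in the third (the AND of two presence tests).

def find_triples(first, second, third, l12, l23,
                 d_max=10.0, step=0.05, tol=0.03):
    ratio = l23 / l12
    triples = []
    for u2 in second:
        d = step
        while d <= d_max:
            u1, u3 = u2 - d, u2 + d * ratio
            hit1 = any(abs(u1 - p) <= tol for p in first)
            hit3 = any(abs(u3 - p) <= tol for p in third)
            if hit1 and hit3:                  # the AND operation
                triples.append((u1, u2, u3))
            d += step
    return triples

# One object point imaged by lenses at 0, 2 and 5 (so l12=2, l23=3)
first, second, third = [-0.325], [2.175], [5.925]
triples = find_triples(first, second, third, l12=2.0, l23=3.0)
print(triples)   # the single consistent triple
```

In hardware the inner presence tests would be lookups into binarized line buffers combined by an AND gate, which is why the patent can claim this step is fast; the serial loop here is only for illustration.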
The remaining straight lines are determined in such a manner that each of the bright lines in the second brightness distribution other than the specified bright line is selected, and the presence of a bright line is confirmed at those positions on the first and third brightness distribution which can satisfy the above-mentioned condition with respect to the ratio ℓ23/ℓ12, and another condition that the distance between the selected bright line and the above position on the first brightness distribution is different from the distance between the specified bright line and that position on the first brightness distribution which has a bright line in the preceding processing.
In this case, a vast number of positioning operations are performed on the first, second and third brightness distribution, and the number of positioning operations to be performed is substantially equal to the product of the number of bright lines in the second brightness distribution (that is, the number of object points in a plane which is defined by the focusing systems and bright lines on the actual sensing surface), the number of bright lines in the first or third brightness distribution which are to be combined with the bright lines in the second brightness distribution to determine the distance between the actual sensing surface and each object point in the direction of depth, and the number of bright lines which can be found on the actual sensing surface along a direction corresponding to the change of vertical azimuth angle (that is, a direction parallel to a straight line at which a plane perpendicular to a straight line connecting the first, second and third focusing systems intersects with the actual sensing surface).
Now, let us consider an extreme case where the number of bright lines in each of the first, second and third brightness distribution is equal to the spatial resolving power (namely, resolution). When it is assumed that the resolution is equal in all directions, the number of positioning operations is equal to the cube of the resolution at most.
In a case where the resolution is equal to 500, the number of positioning operations is 125 × 10⁶. If the positioning operations are performed by serial, digital processing which uses a clock signal of 1 MHz, a processing time proportional to 125 sec will be necessary. If a clock signal of 10 MHz is used, a processing time proportional to 12.5 sec will be required.
In fact, the above positioning operations can be carried out by parallel processing, and thus the processing time can be greatly reduced. For example, brightness distribution data corresponding to those straight lines in an actual sensing surface which are parallel to the straight line connecting the first, second and third focusing systems and are equal in number to the resolution can be processed at once by parallel processing. Accordingly, the number of sequential processing steps becomes apparently equal to 25 × 10⁴. Thus, if the serial, digital processing using a clock signal of 1 MHz is carried out, a processing time proportional to 0.25 sec will be necessary. If the clock signal of 10 MHz is used, a processing time proportional to 0.025 sec will be required. Further, the positioning operations for one of the second brightness distribution and the first or third brightness distribution can be performed in parallel when a bright line in the other brightness distribution is specified. Thus, the number of sequential processing steps becomes apparently equal to 500. If the serial, digital processing using the clock signal of 1 MHz is used for the above processing, a processing time proportional to 0.5 msec will be necessary. If the clock signal of 10 MHz is used, a processing time proportional to 0.05 msec will be required. The pipeline method may be applied to the above 500 processing operations.
Further, all the possible combinations of positions in the second brightness distribution and positions in the first or third brightness distribution are fundamentally known. Accordingly, all processing can be carried out in parallel. In this case, the processing is apparently carried out only once. Accordingly, if the serial, digital processing using the clock signal of 1 MHz is utilized, the processing time will be in the order of 1 μs. If the clock signal of 10 MHz is used, the processing time will be in the order of 0.1 μs.
In a case where an object system is investigated in a limited range of each of the distance in the direction of depth, the horizontal azimuth angle and the vertical azimuth angle, for example, only 10% of each of the second brightness distribution, the first or third brightness distribution and the whole range of vertical azimuth angle, the number of positioning operations is as small as 125 × 10³ even when the parallel processing is not used.
Accordingly, if the serial, digital processing using the clock signal of 1 MHz is utilized, the processing time will be about 0.125 sec. If the clock signal of 10 MHz is used, the processing time will be about 0.0125 sec.
Further, let us consider a case where the brightness distribution indicating edges in an object system can be used instead of the brightness distribution indicating object points. When bright lines in the second brightness distribution which indicate edges are 20% of bright lines in the same brightness distribution which indicate object points, bright lines in the first or third brightness distribution which indicate edges are 20% of bright lines in the same brightness distribution which indicate object points, and those bright lines arranged on the actual sensing surface along a direction corresponding to the change of vertical azimuth angle which indicate edges are 20% of bright lines which are arranged along the above direction and indicate object points, the number of positioning operations is equal to 10⁶. Accordingly, if the serial, digital processing using the clock signal of 1 MHz is utilized, the processing time will be about 1 sec. If the clock signal of 10 MHz is used, the processing time will be about 0.1 sec.
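As a sanity check, the operation counts quoted in the preceding paragraphs can be recomputed; the assumption, implicit in all of the quoted figures, is one positioning operation per clock cycle:

```python
# Recomputing the positioning-operation counts behind the quoted times.

res = 500                      # assumed resolution in every direction
full = res ** 3                # exhaustive search
assert full == 125 * 10**6     # -> 125 sec at 1 MHz, 12.5 sec at 10 MHz

per_line = full // res         # one sensor line handled in parallel
assert per_line == 25 * 10**4  # -> 0.25 sec at 1 MHz

per_point = per_line // res    # disparity search also in parallel
assert per_point == 500        # -> 0.5 msec at 1 MHz

limited = (res // 10) ** 3     # 10% of the range in all three factors
assert limited == 125 * 10**3  # -> about 0.125 sec at 1 MHz

edges = (res // 5) ** 3        # edge bright lines: 20% in each factor
assert edges == 10**6          # -> about 1 sec at 1 MHz

print(full, per_line, per_point, limited, edges)
```

Each stage of parallelism divides the sequential step count by the resolution, which is why the estimates fall by a factor of 500 at each step.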
The straight lines for extracting candidates for corresponding points, that is, the broken lines A, A', B, C and P, can be determined in the above-mentioned processing time. As can be seen from Fig. 3, the broken lines A, A', B and C correspond to actual object points a, a', b and c, respectively, but an object point corresponding to the broken line P does not actually exist.
The appearance of such a virtual object point is caused by the fact that the bright lines a1, a'2 and b3 are formed on the sensing surface accidentally so that the ratio of the distance between the bright lines a1 and a'2 to the distance between the bright lines a'2 and b3 is equal to the ratio of the distance between the first and second focusing systems to the distance between the second and third focusing systems.
It can be seen from Fig. 3 that the possibility of appearance of the virtual object point is greatly reduced by additionally disposing a fourth focusing system at a given position. If the fourth focusing system is disposed on the straight line connecting the first, second and third focusing systems, the reduction of the above possibility will be readily seen from Fig. 3.
Further, it can be supposed that the possibility of appearance of the virtual object point is reduced by changing the geometrical arrangement of the first, second and third focusing systems, instead of adding the fourth focusing system to these focusing systems. It can be readily seen from Fig. 3 that the above possibility will be reduced, if the first, second and third focusing systems are moved on the straight line connecting these focusing systems so that the ratio of the distance between the first and second focusing systems to the distance between the second and third focusing systems differs from the original ratio ℓ12/ℓ23.
An extreme case where a continuous straight line is used as an object system will be explained below, by reference to Fig. 6. Fig. 6 shows a case where not only a line segment a c which is an object system, but also the first, second and third focusing systems are disposed in the same plane. In this case, the image of every point in a common area of three triangular regions, each bounded by two straight lines connecting one of the first, second and third focusing systems with both ends of the line segment a c and their extensions, can be formed on a sensing surface by each of these focusing systems. Accordingly, it seems as if an object system were present throughout the above common area (namely, the hatched area). In a case where an object system has a planar shape, a common portion of three conical regions, each bounded by a curved surface connecting one of the first, second and third focusing systems with the periphery of the object system and its extension, corresponds to the common area.
The hatched area of Fig. 6 can be eliminated by placing the third focusing system at a position deviating from the plane in which the first and second focusing systems and the line segment a c lie, as shown in Fig. 7.
In Fig. 7, reference symbol 3' designates the new position of the third focusing system. In this case, a triangular region bounded by two straight lines connecting the third focusing system with both ends of the line segment a c and their extensions intersects with the above plane only
at the line segment a c. Accordingly, the hatched area is eliminated.
Further, it can be readily seen that the hatched portion can be made small by additionally disposing the fourth focusing system in the plane which contains the line segment a c and the first, second and third focusing systems, instead of moving the third focusing system out of that plane.
Also, in a case where an object system has a planar shape, the hatched portion can be reduced by changing the geometrical arrangement of the focusing systems or by increasing the number of focusing systems. Further, when the planar shape is converted into a linear shape by preprocessing, such as edge extraction, or by decreasing the size of the plate, the portion can be made extremely small.
The broken line P which appears in Fig. 3 and corresponds to the virtual object point p can be removed by utilizing the brightness information of bright lines.
Now, let us consider a case where image points of the same object point formed by three focusing systems are equal in brightness to one another, as shown in Fig. 3. When at least one of three bright lines extracted by a straight line is different in brightness from the remaining bright lines, this straight line is eliminated. Bright lines extracted by each of the broken lines A, A', B and C have the same brightness, but two bright lines a1 and a'2 of three bright lines a1, a'2 and b3 extracted by the broken line P are different in brightness from the remaining bright line b3. Accordingly, the broken line P is eliminated.
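Under the equal-brightness assumption above, the pruning step can be sketched as follows (hypothetical names; brightness maps each bright line to its measured value):

```python
def keep_equal_brightness(triples, brightness, tol=1e-3):
    """Discard candidate triples whose three bright lines differ in
    brightness; keep only triples whose brightness values agree
    within a small tolerance."""
    kept = []
    for triple in triples:
        values = [brightness[line] for line in triple]
        if max(values) - min(values) <= tol:
            kept.append(triple)
    return kept
```

In the example of Fig. 3, the triple (a1, a'2, b3) extracted by the broken line P would be removed because b3 differs in brightness from a1 and a'2.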
The processing for eliminating an unwanted straight line can be carried out in a short time by software or hardware. Further, this processing and the above-mentioned positioning processing can be combined so as to be simultaneously performed.
Next, let us consider a case where image points (namely, bright lines) of the same object point formed by a plurality of focusing systems do not have the same brightness, but the reflectivity of an object point (or the radiant intensity of a luminous object point) is equal in all directions. The brightness of a bright line on the sensing surface is inversely proportional to the square of the distance between the object point and the bright line. Accordingly, when brightness values, each obtained by multiplying the brightness of one of the bright lines extracted by a straight line by the square of the corresponding distance, are not equal, the straight line is eliminated.
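When brightness falls off with the square of the object-point distance, the comparison can be made on distance-corrected values; a sketch under that inverse-square assumption (function names are illustrative):

```python
def corrected_brightness(raw_brightness, distance):
    """Undo the inverse-square falloff: an object point with
    direction-independent reflectivity yields the same corrected
    value in every focusing system."""
    return raw_brightness * distance ** 2

def consistent(raws, distances, tol=1e-6):
    """True if all distance-corrected brightness values agree, i.e.
    the candidate straight line need not be eliminated."""
    vals = [corrected_brightness(b, d) for b, d in zip(raws, distances)]
    return max(vals) - min(vals) <= tol
```

For example, raw brightnesses 4.0 and 1.0 at distances 1.0 and 2.0 are consistent (both correct to 4.0), while 4.0 and 2.0 at the same distances are not.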
In most cases, the above-mentioned conditions with respect to brightness are satisfied. Even in a case where the reflectivity of an object point varies with direction, if the extent of variations in reflectivity is previously known, the brightness of each of the bright points extracted by a straight line will be corrected by the above information, and thus it will be judged on the basis of the corrected brightness values of the bright lines whether the straight line is to be eliminated or not. Even in a case where the distance correction or the correction for variations of the reflectivity of an object point with direction is necessary, an unwanted straight line can be readily eliminated in a short time by software or hardware, as in a case where bright points of the same object point formed by a plurality of focusing systems have the same brightness.
It is to be noted that the prior art using a pattern matching technique determines corresponding points on the assumption that bright lines on the sensing surface are all equal in brightness.
After corresponding points have been determined in the above-mentioned manner, both the information on the positions, on the sensing surface, of at least two of the three bright lines indicated by the corresponding points, and the information on the geometrical positional relation between the focusing systems and the sensing surface, are used for obtaining two-dimensional distance information of an object point, that is, the distance between the object point and the sensing surface in the direction of depth and one or two horizontal azimuth angles. The two-dimensional distance information thus obtained is combined with the vertical azimuth angle information, which is obtained independently of the above two-dimensional distance information, to obtain three-dimensional distance information of the object point.
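As a rough illustration of how depth and azimuth might follow from two bright-line positions, here is a sketch under a simple pinhole model; the coordinate conventions and the names baseline and focal are assumptions for illustration, not the patent's own equations:

```python
import math

def two_dimensional_info(x1, x2, baseline, focal):
    """Depth and horizontal azimuth of one object point from two of
    its bright lines (pinhole sketch; x1, x2 measured from each
    focusing system's own optical axis)."""
    disparity = x1 - x2                   # shrinks as the point recedes
    depth = focal * baseline / disparity  # z = f * b / d
    h_azimuth = math.atan2(x1, focal)     # horizontal azimuth at system 1
    return depth, h_azimuth

def three_dimensional_info(depth, h_azimuth, v_azimuth):
    """Combine the two-dimensional information with the independently
    obtained vertical azimuth angle."""
    return (depth * math.tan(h_azimuth),  # horizontal offset
            depth * math.tan(v_azimuth),  # vertical offset
            depth)
```

Both steps are simple algebraic evaluations, consistent with the remark below that the whole processing is based upon simple algebraic equations.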
The processing for obtaining each of the two-dimensional distance information and the three-dimensional distance information is a processing based upon simple algebraic equations, and can be readily performed in a short time. The whole quantity of processing necessary for obtaining three-dimensional distance information of an object system is proportional to the number of object points included in the object system.
Now, explanation will be made of a first embodiment of a method of forming a three-dimensional stereo vision in accordance with the present invention, by reference to Fig. 4.
The first embodiment is carried out by an apparatus which, as shown in Fig. 4, includes image focusing systems 11 to 13 each formed of a lens or mirror, image sensing surfaces 21 to 23, a detected information preprocessing system 30, a detected information storage system 40, a candidate-for-corresponding-point decision system 50, a false-corresponding-point removing system 60, a three-dimensional distance information calculation system 70, and a configuration change system 80 for changing the configuration of the image focusing systems 11 to 13.
The image focusing systems 11 to 13 are not always required to lie on a straight line parallel to the sensing surfaces 21 to 23, but the positions of the image focusing systems can be changed by the configuration change system 80 so that a plane containing the image focusing systems 11 to 13 makes a desired angle with the image sensing surfaces 21 to 23.
The images of an object system which are formed on the sensing surfaces 21 to 23 by the focusing systems 11 to 13 are sent to the detected information preprocessing system 30, which corrects errors due to distortions and various aberrations in the focusing systems, and extracts bright lines which satisfy a brightness or color condition required for extracting edges or obtaining three-dimensional distance information, if necessary. The image information thus obtained is sent to the detected information storage system 40. When it is necessary to reduce false corresponding points by using four or more image focusing systems, the configuration of the focusing systems 11 to 13 and sensing surfaces 21 to 23 is changed by the configuration change system 80, and the image information obtained on the basis of the new configuration is also sent to the detected information storage system 40 through the preprocessing system 30. Next, the candidate-for-corresponding-point decision system 50 extracts candidates for corresponding points from a multiplicity of image points stored in the storage system 40, by making use of a special geometrical relation, as mentioned in "SUMMARY OF THE
INVENTION". The extracted candidates are sent to the false-corresponding-point removing system 60, in which candidates corresponding to a false (namely, non-existing) object point are removed from the extracted candidates by making use of a brightness condition or others, as mentioned in "SUMMARY OF THE INVENTION". The remaining candidates are
sent to the three-dimensional distance information calculation system 70, to obtain the three-dimensional distance information of the object system in a manner mentioned in "SUMMARY OF THE INVENTION".
According to the apparatus of Fig. 4, the configuration of the focusing systems 11 to 13 and sensing surfaces 21 to 23 can be freely changed, and moreover, the information obtained on the basis of the new configuration is also stored. Thus, the apparatus of Fig. 4 can produce the same effect as an apparatus provided with four or more focusing systems, and hence can greatly reduce the redundancy in extracting candidates for corresponding points, no matter what structure the object system may have.
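The data flow through the systems 30, 40, 50, 60 and 70 can be sketched as a simple function chain; this is a hypothetical software illustration, with placeholder stage functions rather than the patent's actual hardware systems:

```python
def stereo_pipeline(images, preprocess, store, extract_candidates,
                    remove_false, calculate_distance):
    """Data flow of the first embodiment: preprocess each sensed
    image, store the results, extract geometric candidates for
    corresponding points, prune false ones, then compute distance."""
    stored = store([preprocess(img) for img in images])
    candidates = extract_candidates(stored)
    corresponding = remove_false(candidates)
    return calculate_distance(corresponding)
```

Each argument stands in for one block of Fig. 4 (preprocessing system 30, storage system 40, decision system 50, removing system 60 and calculation system 70, respectively).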
Next, explanation will be made of a second embodiment of a method of forming a three-dimensional stereo vision in accordance with the present invention, by reference to Fig. 5.
The second embodiment is carried out by an apparatus which, as shown in Fig. 5, includes color filters 101 to 103 corresponding to three primary colors, optical path length correction systems 112 and 113, mirrors 123, 133, 121 and 131, a preprocessing system 200, an image focusing system 300, a color image sensor 400, a horizontal addressing system 500, a brightness correcting circuit group 600, a brightness correction system 700, an AND
circuit group 800, a two-dimensional distance information calculation system 900, and a three-dimensional distance information calculation system 1000.
The color filters 101, 102 and 103 transmit red light, green light and blue light, respectively, and correspond to three primary color filters mounted on the color image sensor 400. The color filters 101 to 103, the optical path length correction systems 112 and 113, and the mirrors 123, 133, 121 and 131 are disposed so that the light paths from the color filters 101 to 103 to the color image sensor 400 have the same optical path length, and the respective optical axes of three light beams from the color filters 101 to 103 lie in the same horizontal plane and are spaced apart from one another at the light receiving surface of the color image sensor 400 by an amount corresponding to the width (in a horizontal direction) of each of the color filters mounted on the color image sensor 400. As shown in Fig. 5, each of the mirrors 133 and 131 can transmit light incident upon the back surface thereof. Referring to Fig. 5, light which is emitted or reflected from an object system and passes through an optical system which includes the color filters 101 to 103, the optical path length correction systems 112 and 113 and the mirrors 123, 133, 121 and 131, is subjected to necessary brightness modification processing and polarization processing in the preprocessing system 200, and then is focused on the color image sensor 400 by the image focusing system 300. The color image sensor 400 is able to be addressed by the horizontal addressing system 500
in respective colors, so that three pixels adjacent to one another and corresponding to three colors in a horizontal direction, which are provided with the red, green and blue filters, can be separately addressed by three address lines prepared independently for each color, and pixels arranged in a vertical direction and receiving light of the same color can be simultaneously addressed. When the addressing for the color image sensor 400 is made in a horizontal direction and in a predetermined order, pixels corresponding to an address simultaneously deliver information corresponding to the color filters mounted on these pixels, and the above information is sent to the brightness correcting circuit group 600. The circuit group 600 includes three correcting circuits for each of the pixel layers arranged in a vertical direction. The output of each pixel included in the color image sensor 400 is modified in accordance with that circuit constant of each correcting circuit which is specified by the brightness correction system 700, and then three color outputs from three pixels adjacent to one another in a horizontal direction are applied to one of the AND circuits included in the AND circuit group 800. The brightness correction system 700 gives each correcting circuit of the brightness correcting circuit group 600 a circuit constant necessary for the correction of brightness balance among the three primary colors, the brightness correction resulting from the characteristics of the object system, and the classification of brightness into a plurality of levels. Three color outputs having been
subjected to the brightness correction are applied to one AND circuit of the AND circuit group 800. When the three color outputs lie in a predetermined brightness range, the AND circuit delivers ON-information. When at least one of the three color outputs does not lie in the predetermined brightness range, the AND circuit delivers OFF-information.
The ON- or OFF-information thus obtained is applied to one two-dimensional distance information calculation circuit of the two-dimensional distance information calculation system 900. When ON-information is applied to the above calculation circuit, the presence of an object point is known, and the two-dimensional distance information of the object point is obtained on the basis of the address information from the horizontal addressing system 500 by the method mentioned in "SUMMARY OF THE INVENTION". When the OFF-information is applied to the two-dimensional distance information calculation circuit, the absence of an object point is indicated, and thus no processing is carried out. The above two-dimensional distance information obtained in each of the pixel layers arranged in the vertical direction is sent to the three-dimensional distance information calculation system 1000, to be used for obtaining the three-dimensional distance information of the object system.
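The ON/OFF decision of one AND circuit and the per-address scan can be sketched as a hypothetical software analogue of the hardware just described (the names and the brightness range are assumptions):

```python
def and_circuit(outputs, low, high):
    """ON when all three corrected color outputs lie in the
    predetermined brightness range; OFF otherwise."""
    return all(low <= v <= high for v in outputs)

def scan_addresses(color_outputs, low, high, calc_2d):
    """Per horizontal address: ON-information triggers the
    two-dimensional distance calculation; OFF-information means
    no object point, so the address is skipped."""
    results = {}
    for address, triple in color_outputs.items():
        if and_circuit(triple, low, high):
            results[address] = calc_2d(address)
    return results
```

In the apparatus itself this scan runs in parallel over the pixel layers arranged in the vertical direction, which is the source of the speed-up noted below.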
As can be seen from the above explanation, according to the second embodiment, data from pixel layers arranged in the vertical direction are processed in parallel, and hence the processing time is greatly shortened.
Further, the whole processing can be carried out by hardware, and hence not only the productivity and reliability are improved but also the processing speed is increased.
Furthermore, only the light receiving surface of a single color image sensor is used as the image sensing surface, and hence the apparatus for carrying out the second embodiment is simple in structure and can be constructed at a low cost.
As has been explained in the foregoing, according to the present invention, a method of forming a three-dimensional stereo vision is provided in which the processing time is short, and the fundamental error dependent upon the spatial distribution of object points can be avoided.
Further, the above-mentioned method according to the present invention has the following advantages.
(1) Even when an object system has a special brightness pattern, precise distance information can be formed by appropriately setting the number and configuration of image focusing systems.
BACKGROUND OF THE INVENTION
The present invention relates to a visual system, and more particularly to a method of and an apparatus for forming a passive three-dimensional stereo vision which is capable of extracting three-dimensional distance information.
The above method and apparatus are applicable to various fields requiring an accurate, high-speed passive visual system, such as an automobile (for the purpose of guiding the automobile when the automobile is entered into a garage, is parked, or runs on a congested road or a highway, and for the purpose of facilitating the navigation of the automobile on an ordinary road), vehicles other than the automobile, a robot, factory automation, laboratory automation, office automation, building automation, home automation, and precise measurement.
In other words, the method and apparatus are used in operatorless repairs, an inspecting operation, an operatorless wagon, an operatorless crane, operatorless construction, civil engineering machinery, the assembly of parts, measurement of the number of queuing persons, a burglar-proof system, a disaster prevention system, a blindman guiding system, a speed detector, a distance detector, an object detector, an automatic focusing mechanism for each of a microscope, an enlarger, a projector,
a copying machine, an optical disc apparatus, an image pickup device, a camera and others, character/figure recognition, the recognition of a number plate, a stereoscopic camera (for a still or moving object) and a game/leisure machine.
In a conventional method of forming a three-dimensional stereo vision, two brightness data obtained by two eyes are caused to correspond to each other (that is, corresponding points of the two brightness data are determined) by pattern matching techniques. What is meant by pattern matching techniques is that the scale (namely, measure) of a coordinate space is enlarged or contracted at each of the points in that portion of the coordinate space which has a brightness distribution, so that the difference between the brightness distribution in the coordinate space whose scale has been changed and a reference brightness distribution becomes minimum under some criterion.
Such pattern matching techniques encounter two problems, the first one of which is as follows. The number of coordinate points where the scale of the coordinate space is to be enlarged or contracted is equal to the number of bright points (herein referred to as "bright lines") formed on a sensing surface, and the position of each of the coordinate points can be freely changed, provided that the configurational order of the bright lines is not changed. Accordingly, a vast number of combinations of scale transformation are basically allowed, and a combination capable of minimizing the difference between the brightness distribution obtained after scale transformation and the reference brightness distribution is selected from a multiplicity of combinations. Thus, it takes a lot of time to carry out pattern matching by digital processing.
The second problem of the pattern matching techniques is as follows. As mentioned above, the scale transformation is made so that the configurational order of the bright lines is not changed. In a case where the brightness distribution at an object system is such that a first group of bright lines, which is formed on a sensing surface by the first image focusing lens system, is different in the configurational order of bright lines from the second group of bright lines, which is formed on the sensing surface by the second image focusing lens system, pattern matching is attended with a fundamental error. This error occurs for an object system in which the spacing between two object points in the direction of depth is greater than the spacing between the object points in a direction corresponding to the change of horizontal azimuth angle. It is to be noted that, when the object points and the image focusing lens systems are in the same plane surface, the direction from the image focusing lens systems towards the object points in the above plane surface is herein referred to as the "direction of depth", and an angle between the straight line connecting the image focusing lens systems and the straight line connecting one of the image focusing lens systems with one of the object points is herein referred to as "a horizontal azimuth angle". Incidentally, an angle between the above plane surface and the image sensing surface is herein referred to as the "vertical azimuth angle", which will be mentioned later.
In a case where an object system is not formed of independent object points but has continuous spatial distribution, the continuous distribution is converted into discrete distribution to make it possible to use digital processing. Accordingly, this case also encounters the above-mentioned two problems.
Further, even when a large number of brightness distribution data are used for increasing the accuracy of pattern matching, these two problems are unavoidable, provided that the above-mentioned pattern matching techniques are used.
SUMMARY OF THE INVENTION
An object of the present invention is to provide a method of and an apparatus for forming three-dimensional distance information which are able to solve the problems of the prior art and to form a stereo vision in a short time without producing the fundamental error based upon the spatial distribution of object points within an object system.
In order to attain the above object, a method of and an apparatus for forming three-dimensional distance information in accordance with the present invention use three or more image focusing lens systems. In more detail, in the above-mentioned method and apparatus, special attention is paid to the fact that a special relation exists between the
geometrical positional relation among image points (namely, bright lines) of the same object point which are formed on a sensing surface by three or more image focusing lens systems, and the geometrical positional relations among the image focusing lens systems themselves and between the sensing surface and the image focusing lens systems, and bright lines of the same object point are selected from a multiplicity of bright lines on the sensing surface by using the special relation, to eliminate the two problems of the conventional pattern matching techniques.
An example of the above special relation is that, when three or more image focusing lens systems (for example, eyes) are placed in a plane, and the plane is parallel to an image sensing surface, image points (namely, bright lines) of the same object point which are formed on the sensing surface by the image focusing lens systems are spaced apart from one another so as to have a geometrical positional relation similar to the geometrical positional relation among the image focusing lens systems. In other words, a figure formed by straight lines which connect the image focusing lens systems is similar to a figure formed by straight lines which connect image points of the same object point. Accordingly, when those ones of the bright lines on the sensing surface which are formed by each of the image focusing lens systems can be known, for example, when a plurality of sensing areas each corresponding to one of the image focusing lens systems are provided on imaginary planes, bright lines which are formed by different image focusing lens systems and have a geometrical positional relation similar to the geometrical positional relation among the image focusing lens systems can be extracted, as candidates for corresponding points, from a multiplicity of bright lines. (Incidentally, even in a case where a single physical sensing surface is used, if the image focusing lens systems form the image of an object system at different time moments, a plurality of sensing areas will be formed in effect.)
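The similar-figure condition can be sketched as a check that the image points are a uniformly scaled, translated copy of the figure formed by the lens centers; a minimal illustration assuming two-dimensional coordinates in parallel planes (in which configuration no rotation occurs), with hypothetical names:

```python
def similar_figure(lens_positions, image_points, tol=1e-6):
    """True if the image points are related to the lens centers by a
    single uniform scaling plus translation, i.e. the two figures
    are similar in the sense used for candidate extraction."""
    (lx0, ly0), (ix0, iy0) = lens_positions[0], image_points[0]
    # Estimate the scale from the first displacement pair.
    dlx, dly = lens_positions[1][0] - lx0, lens_positions[1][1] - ly0
    dix, diy = image_points[1][0] - ix0, image_points[1][1] - iy0
    norm_l = (dlx * dlx + dly * dly) ** 0.5
    norm_i = (dix * dix + diy * diy) ** 0.5
    if norm_l == 0:
        return False
    s = norm_i / norm_l
    # Every displacement from the first point must scale identically.
    for (lx, ly), (ix, iy) in zip(lens_positions, image_points):
        if (abs((lx - lx0) * s - (ix - ix0)) > tol
                or abs((ly - ly0) * s - (iy - iy0)) > tol):
            return False
    return True
```

A candidate-extraction routine would apply such a test to each combination of one bright line per sensing area, keeping the combinations that pass.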
Those candidates for corresponding points which are extracted in the above-mentioned manner can indicate all of the object points, excepting an object point whose image point is not formed on a sensing surface. Moreover, even in a case where the brightness distribution at the object system is such that the configurational order of bright points which are formed on a sensing surface by one of the image focusing lens systems is different from the configurational order of bright points which are formed on the sensing surface by another image focusing lens system, the above-mentioned fundamental error is not generated in selecting the candidates.
In some cases, however, a virtual object point which does not actually exist may be extracted as an object point, since bright lines satisfying the above special relation can be found. The probability of extracting the virtual object point can be greatly reduced by increasing the number of image focusing lens systems, disposing image focusing lens systems in a plane at random, or disposing
image focusing lens systems in different planes. This is because these countermeasures make the conditions for determining corresponding points severe.
All the candidates for corresponding points thus obtained may be used as the corresponding points. In fact, the candidates can be reduced on the basis of brightness data. In general, when it is possible to detect brightness data, color data (namely, data on the frequency characteristics of incident light), data on the phase of incident light, data on the plane of polarization of incident light, and the differential value of each of these data with respect to time or space, these data and their differentiated values can be used for selecting appropriate ones from the candidates. Now, explanation will be made on a case where brightness data is used for reducing the candidates, by way of example. When the brightness ratio among bright lines of the same object point which are formed on the sensing surface by the image focusing lens systems is previously known, only corresponding points having the above brightness ratio are extracted. The brightness of each bright line is dependent upon the direction dependence of the reflectivity of the object point and the distance between the object point and the sensing surface. The distance between the object point and the sensing surface has been known when the candidates for the corresponding points are extracted. Accordingly, only corresponding points based upon realistic direction dependence of reflectivity are extracted. In a case where data other than
the brightness data is used, corresponding points can be extracted by a method corresponding to the data.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a schematic diagram for explaining a method of forming a three-dimensional stereo vision through a conventional pattern matching technique using two brightness data which are obtained by two image focusing lens systems.
Fig. 2 is a schematic diagram for explaining the generation of a fundamental error in the method of forming a three-dimensional stereo vision through the conventional pattern matching technique.
Fig. 3 is a schematic diagram for explaining the principle of a method of forming a three-dimensional stereo vision in accordance with the present invention.
Fig. 4 is a block diagram showing an apparatus for carrying out the first embodiment of a method of forming a three-dimensional stereo vision in accordance with the present invention.
Fig. 5 is a block diagram (partly pictorial and partly schematic) showing an apparatus for carrying out the second embodiment of a method of forming a three-dimensional stereo vision in accordance with the present invention.
Figs. 6 and 7 are schematic diagrams for explaining a method of reducing candidates for corresponding points in accordance with the present invention.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
In order to facilitate an understanding of the present invention, the prior art will first be explained, by reference to Figs. 1 and 2. Fig. 1 is a schematic diagram for explaining the pattern matching for two brightness data which are obtained by two image focusing lens systems, and this diagram shows a case where an object system is formed of three object points a, b and c. In a prior art, a first image focusing lens system and a second image focusing lens system are used, and a brightness distribution is formed on a sensing surface by each of these image focusing lens systems. Referring to Fig. 1, three points a, b and c which are spaced apart from one another make up the object system, and the brightness distribution which is formed on the sensing surface by the first image focusing lens system is the discrete brightness distribution made up of three bright lines a1, b1 and c1. Further, the brightness distribution which is formed on the sensing surface by the second image focusing lens system is the discrete brightness distribution made up of three bright lines a2, b2 and c2. When the bright lines a1, b1, c1, a2, b2 and c2 are obtained, it is determined by some method which of the bright lines obtained by one of the image focusing lens systems corresponds to each of the bright lines obtained by the other image focusing lens system. Then, the distance between the sensing surface and the object system in the direction of depth is determined on the basis of the distance ℓ between corresponding bright lines, and the horizontal
azimuth angle is determined on the basis of the distance ℓg indicating the geometrical positional relation between one image focusing lens system and a bright line formed by this focusing lens system. Three-dimensional distance information with respect to the object system is then obtained by using the above distance in the direction of depth, the above horizontal azimuth angle, and other factors.
In the prior art, the bright lines obtained by the first image focusing lens system are forced to correspond to the bright lines obtained by the second image focusing lens system (that is, corresponding bright lines are determined) by some pattern matching technique. Fundamentally, what is meant by pattern matching techniques is that the scale (namely, measure) of a coordinate space is enlarged or contracted at each of the points in that portion of the coordinate space which has a brightness distribution, so that the difference between the brightness distribution in the coordinate space whose scale has been changed and a reference brightness distribution becomes minimum under some criterion. In the case shown in Fig. 1, the change of the scale corresponds to the change of each of the distances between the bright lines a2 and b2 and between the bright lines b2 and c2, that is, the change of the positions of the bright lines a2, b2 and c2 without being attended with a change in the configurational order of these bright lines. It is meant by pattern matching that, by changing the positions of the bright lines a2, b2 and c2 in the above manner, the positional relation among the
bright lines a2, b2 and c2 becomes equal to the positional relation among the bright lines a1, b1 and c1 (that is, the brightness distribution including the bright lines a2, b2 and c2 becomes entirely equal to the brightness distribution including the bright lines a1, b1 and c1).
Fig. 2 is a schematic diagram for explaining the generation of the previously-mentioned fundamental error in a case where an object system includes two object points such that the spacing between the object points in the direction of depth is greater than the spacing between the object points in the direction corresponding to the change of horizontal azimuth angle. Referring to Fig. 2, the object system includes an object point a', instead of the object point a of Fig. 1, and the spacing between the points a' and b in the direction of depth is greater than the spacing between the points a' and b in the direction corresponding to the change of horizontal azimuth angle, to such an extent that the point a' lies between the two straight lines which connect the point b with the first and second image focusing lens systems. As shown in Fig. 2, the brightness distribution which is formed on the sensing surface by the first image focusing lens system is different in the configuration order of bright lines from the brightness distribution which is formed on the sensing surface by the second image focusing lens system.
That is, the bright lines formed by the first image focusing lens system are arranged in the order of c1, a'1
and b1, while the bright lines formed by the second image focusing lens system are arranged in the order of c2, b2 and a'2. When pattern matching is performed for these brightness distribution data, there is a strong probability that the bright line a'1 is forced to correspond to the bright line b2 and the bright line b1 is forced to correspond to the bright line a'2. As a result, the true object points a' and b are not detected from the brightness distribution data on the sensing surface, but the object system is judged to include the false object points m and n. That is, a fundamental error is generated.
Now, a case where pattern matching is performed for brightness distribution data obtained by three image focusing lens systems which are disposed on a straight line parallel to a sensing surface, by utilizing the principle mentioned in "SUMMARY OF THE INVENTION", will be briefly explained by reference to Fig. 3.
Fig. 3 shows a case where a third image focusing lens system is added to the first and second image focusing lens systems of Fig. 2. In Fig. 3, the first brightness distribution, which is formed on the sensing surface by the first focusing system, is disposed in a first imaginary plane which is spaced apart, by the distance ℓ12 between the first and second focusing systems, from a second imaginary plane in which there is disposed the second brightness distribution formed on the sensing surface by the second focusing system, and the third brightness distribution, which is formed on the sensing surface by the third focusing system, is disposed in a third imaginary plane which is spaced apart from the second
imaginary plane by the distance ℓ23 between the second and third focusing systems.
The above arrangement of the first, second and third brightness distributions in the respective first, second and third imaginary planes can clearly show a special relation which exists between the geometrical positional relation among the bright lines of the same object point which are formed by the focusing systems, and the geometrical positional relations among the focusing systems themselves and between the sensing surface and the focusing systems.
In the case of Fig. 3, the special relation is that the ratio of the distance between the first and second ones of three bright lines of the same object point to the distance between the second and third bright lines is equal to the ratio of the distance between the first and second focusing systems to the distance between the second and third focusing systems. Accordingly, when the first brightness distribution formed by the first focusing system, the second brightness distribution formed by the second focusing system and the third brightness distribution formed by the third focusing system are arranged so that the ratio of the distance between the first brightness distribution and the second brightness distribution to the distance between the second brightness distribution and the third brightness distribution is equal to the ratio of the distance ℓ12 to the distance ℓ23, three bright lines caused by the same object point lie on a straight line.
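As a sketch, the special relation above can be tested numerically. The following Python fragment is an illustration only (the function name, the variables l12 and l23 standing for the distances ℓ12 and ℓ23, and the tolerance are assumptions, not part of the patent); it cross-multiplies the ratio test to avoid division:

```python
def lies_on_straight_line(x1, x2, x3, l12, l23, tol=1e-9):
    """Return True when bright-line positions x1, x2, x3 (measured along
    the first, second and third imaginary planes) satisfy the special
    relation: (x2 - x1) : (x3 - x2) equals l12 : l23, the spacing of the
    focusing systems."""
    # Cross-multiplied form of (x2 - x1)/(x3 - x2) == l12/l23
    return abs((x2 - x1) * l23 - (x3 - x2) * l12) <= tol
```

For equally spaced focusing systems (l12 = l23) the test reduces to checking that the three positions are evenly spaced.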
Conversely, when a group of bright lines lying on the same straight line is found, candidates for corresponding points can be obtained. In Fig. 3, the straight lines each indicating corresponding bright lines are expressed by the broken lines A, A', B, C and P. One of the above straight lines can be determined in the following manner. That is, when a bright line in the brightness distribution formed by a focusing system (for example, the second focusing system), that is, a bright line in the second brightness distribution, is specified, and a bright line is present at each of two positions, one of which is spaced apart from the point on the first brightness distribution corresponding to the specified bright line in the leftward direction by a given distance, and the other of which is spaced apart from the point on the third brightness distribution corresponding to the specified bright line in the rightward direction by a distance equal to the product of the given distance and the ratio ℓ23/ℓ12, the above-mentioned straight line can be obtained by connecting these positions. The extraction of the positions in the first and third brightness distributions capable of satisfying the condition with respect to the ratio ℓ23/ℓ12, and the AND operation for the two pieces of brightness information at those positions, can be readily and rapidly performed by hardware or software.
The remaining straight lines are determined in such a manner that each of the bright lines in the second brightness distribution other than the specified bright line is selected, and the presence of a bright line is confirmed at those positions on the first and third brightness distributions which can satisfy the above-mentioned condition with respect to the ratio ℓ23/ℓ12, together with the further condition that the distance between the selected bright line and the above position on the first brightness distribution is different from the distance between the specified bright line and that position on the first brightness distribution which had a bright line in the preceding processing.
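The search just described can be sketched as a double loop with an AND test on presence information. This Python fragment is an assumed discretisation (integer pixel positions, set membership standing in for the hardware AND operation), not the patented circuit itself:

```python
def extract_candidates(d1, d2, d3, l12, l23, max_shift=500):
    """d1, d2, d3: sets of integer bright-line positions in the first,
    second and third brightness distributions.  For each specified bright
    line x2 in the second distribution, scan a leftward shift s on the
    first distribution and test the matching rightward shift s * l23/l12
    on the third; the AND of the two presences yields a candidate triple."""
    candidates = []
    for x2 in sorted(d2):
        for s in range(max_shift + 1):
            x1 = x2 - s                      # leftward by the given distance
            x3 = x2 + round(s * l23 / l12)   # rightward by s * (l23/l12)
            if x1 in d1 and x3 in d3:        # the AND operation
                candidates.append((x1, x2, x3))
    return candidates
```

Each returned triple names one candidate straight line; false candidates (such as the broken line P discussed below) must still be weeded out afterwards.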
In this case, a vast number of positioning operations are performed on the first, second and third brightness distributions, and the number of positioning operations to be performed is substantially equal to the product of the number of bright lines in the second brightness distribution (that is, the number of object points in a plane which is defined by the focusing systems and the bright lines on the actual sensing surface), the number of bright lines in the first or third brightness distribution which are to be combined with the bright lines in the second brightness distribution to determine the distance between the actual sensing surface and each object point in the direction of depth, and the number of bright lines which can be found on the actual sensing surface along a direction corresponding to the change of vertical azimuth angle (that is, a direction parallel to the straight line at which a plane perpendicular to the straight line connecting the first, second and third focusing systems intersects the actual sensing surface).
Now, let us consider an extreme case where the number of bright lines in each of the first, second and third brightness distributions is equal to the spatial resolving power (namely, the resolution). When it is assumed that the resolution is equal in all directions, the number of positioning operations is at most equal to the cube of the resolution.
In a case where the resolution is equal to 500, the number of positioning operations is 125 × 10⁶. If the positioning operations are performed by serial, digital processing which uses a clock signal of 1 MHz, a processing time proportional to 125 sec will be necessary. If a clock signal of 10 MHz is used, a processing time proportional to 12.5 sec will be required.
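The counts above follow from simple arithmetic; a quick check, assuming one positioning operation per clock period:

```python
resolution = 500
operations = resolution ** 3              # 125 x 10^6 positioning operations

# Serial digital processing at 1 MHz and 10 MHz clocks
for clock_hz in (1_000_000, 10_000_000):
    print(clock_hz, operations / clock_hz)   # 125.0 s and 12.5 s respectively
```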
In fact, the above positioning operations can be carried out by parallel processing, and thus the processing time can be greatly reduced. For example, the brightness distribution data corresponding to those straight lines in an actual sensing surface which are parallel to the straight line connecting the first, second and third focusing systems, and which are equal in number to the resolution, can be processed at once by parallel processing. Accordingly, the frequency of processing becomes apparently equal to 25 × 10⁴. Thus, if serial, digital processing using a clock signal of 1 MHz is carried out, a processing time proportional to 0.25 sec will be necessary. If the clock signal of 10 MHz is used, a processing time proportional to 0.025 sec will be required. Further, the positioning operations for one of the second brightness distribution
and the first or third brightness distribution can be performed in parallel when a bright line in the other brightness distribution is specified. Thus, the frequency of processing becomes apparently equal to 500. If the serial, digital processing using the clock signal of 1 MHz is used for the above processing, a processing time proportional to 0.5 msec will be necessary. If the clock signal of 10 MHz is used, a processing time proportional to 0.05 msec will be required. The pipeline method may be applied to the above 500 processing operations.
Further, all the possible combinations of posi-tions in the second brightness distribution and positions in the first or third brightness distributio~ are funda-mentally known. Accordingly, all processing can be carried lS out in parallel. In this case, the processing is apparently carried out onl~ once. ~ccordingly, if the serial, digital processing using the clock signal of 1 MHæ is utilized, the processing time will be in the order of 1 ~s. If the clock signal of 10 MHz is used, the processing time will be in the order of 0.1 ~s.
In a case where an object system is investigated in a limited range of each of the distance in the direction of depth, the horizontal azimuth angle and the vertical azimuth angle, for example, only 10% of each of the second brightness distribution, the first or third brightness distribution and the whole range of vertical azimuth angle, the number of positioning operations is as small as 125 × 10³ even when parallel processing is not used.
Accordingly, if the serial, digital processing using the clock signal of 1 MHz is utilized, the processing time will be about 0.125 sec. If the clock signal of 10 MHz is used, the processing time will be about 0.0125 sec.
Further, let us consider a case where a brightness distribution indicating edges in an object system can be used instead of the brightness distribution indicating object points. When the bright lines in the second brightness distribution which indicate edges are 20% of the bright lines in the same brightness distribution which indicate object points, the bright lines in the first or third brightness distribution which indicate edges are 20% of the bright lines in the same brightness distribution which indicate object points, and those bright lines arranged on the actual sensing surface along a direction corresponding to the change of vertical azimuth angle which indicate edges are 20% of the bright lines which are arranged along the above direction and indicate object points, the number of positioning operations is equal to 10⁶. Accordingly, if the serial, digital processing using the clock signal of 1 MHz is utilized, the processing time will be about 1 sec. If
the clock signal of 10 MHz is used, the processing time will be about 0.1 sec.
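The factor-of-125 reduction comes from keeping roughly one bright line in five along each of the three directions; in integer arithmetic (the 20% figure is the document's own example):

```python
full_count = 500 ** 3              # 125 x 10^6 operations without edge extraction
edge_count = full_count // 5 ** 3  # 20% kept per direction -> (1/5)^3 overall
print(edge_count)                  # 1,000,000 operations, i.e. about 1 sec at 1 MHz
```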
The straight lines for extracting candidates for corresponding points, that is, the broken lines A, A', B, C and P, can be determined in the above-mentioned processing time. As can be seen from Fig. 3, the broken lines A, A', B and C correspond to the actual object points
a, a', b and c, respectively, but an object point
corresponding to the broken line P does not actually exist.
The appearance of such a virtual object point is caused by the fact that the bright lines a1, a'2 and b3 are formed on the sensing surface accidentally, such that the ratio of the distance between the bright lines a1 and a'2 to the distance between the bright lines a'2 and b3 is equal to the ratio of the distance between the first and second focusing systems to the distance between the second and third focusing systems.
It can be seen from Fig. 3 that the possibility of the appearance of a virtual object point is greatly reduced by additionally disposing a fourth focusing system at a given position. If the fourth focusing system is disposed on the straight line connecting the first, second and third focusing systems, the reduction of the above possibility will be readily seen from Fig. 3.
Further, it can be supposed that the possibility of the appearance of the virtual object point is reduced by changing the geometrical arrangement of the first, second and third focusing systems, instead of adding a fourth focusing system to these focusing systems. It can be readily seen from Fig. 3 that the above possibility will be reduced if the first, second and third focusing systems are moved on the straight line connecting these focusing systems so that the ratio of the distance between the first and second focusing systems to the distance between the second and third focusing systems differs from the original
ratio ℓ12/ℓ23.

An extreme case, where a continuous straight line is used as an object system, will be explained below by reference to Fig. 6. Fig. 6 shows a case where not only a line segment ac, which is an object system, but also the first, second and third focusing systems are disposed in the same plane. In this case, the image of every point in the common area of three triangular regions, each bounded by the two straight lines connecting one of the first, second and third focusing systems with both ends of the line segment ac, and their extensions, can be formed on a sensing surface by each of these focusing systems. Accordingly, it seems as if an object system were present throughout the above common area (namely, the hatched area). In a case where an object system has a planar shape, a common portion of three conical regions, each bounded by a curved surface connecting one of the first, second and third focusing systems with the periphery of the object system, and its extension, corresponds to the common area.
The hatched area of Fig. 6 can be eliminated by placing the third focusing system at a position deviating from the plane in which the first and second focusing systems and the line segment ac lie, as shown in Fig. 7.
In Fig. 7, reference symbol 3' designates the new position of the third focusing system. In this case, the triangular region bounded by the two straight lines connecting the third focusing system with both ends of the line segment ac, and their extensions, intersects with the above plane only
at the line segment ac. Accordingly, the hatched area is eliminated.
Further, it can be readily seen that the hatched portion can be made small by additionally disposing a fourth focusing system in the plane which contains the line segment ac and the first, second and third focusing systems, instead of deviating the third focusing system from that plane.
Also, in a case where an object system has a planar shape, the hatched portion can be reduced by changing the geometrical arrangement of the focusing systems or by increasing the number of focusing systems. Further, when the planar shape is converted into a linear shape by preprocessing, such as edge extraction, or by decreasing the size of the plate, the portion can be made extremely small.
The broken line P, which appears in Fig. 3 and corresponds to the virtual object point p, can be removed by utilizing the brightness information of the bright lines.
Now, let us consider a case where the image points of the same object point formed by the three focusing systems are equal in brightness to one another, as shown in Fig. 3. When at least one of the three bright lines extracted by a straight line is different in brightness from the remaining bright lines, this straight line is eliminated. The bright lines extracted by each of the broken lines A, A', B and C have the same brightness, but two bright lines a1 and a'2 of the three bright lines a1, a'2 and b3 extracted by the broken line P are different in brightness from the remaining bright line b3. Accordingly, the broken line P is eliminated.
The processing for eliminating an unwanted straight line can be carried out in a short time by software or hardware. Further, this processing and the above-mentioned positioning processing can be combined so as to be simultaneously performed.
Next, let us consider a case where the image points (namely, bright lines) of the same object point formed by a plurality of focusing systems do not have the same brightness, but the reflectivity of an object point (or the radiant intensity of a luminous object point) is equal in all directions. The brightness of a bright line on the sensing surface is inversely proportional to the square of the distance between the object point and the bright line. Accordingly, when the brightness values, each obtained by multiplying the brightness of one of the bright lines extracted by a straight line by the square of the corresponding distance, are not equal, the straight line is eliminated.
In most cases, the above-mentioned conditions with respect to brightness are satisfied. Even in a case where the reflectivity of an object point varies with direction, if the extent of variation in reflectivity is previously known, the brightness of each of the bright lines extracted by a straight line can be corrected by the above information, and it can then be judged on the basis of the corrected brightness values of the bright lines whether the straight line is to be eliminated or not. Even in a case where the distance correction or the correction for variations of the reflectivity of an object point with direction is necessary, an unwanted straight line can be readily eliminated in a short time by software or hardware, as in the case where the bright points of the same object point formed by a plurality of focusing systems have the same brightness.
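Under the inverse-square assumption, the elimination test multiplies each brightness by the square of its distance before comparing; a sketch with hypothetical names:

```python
def keep_after_distance_correction(samples, tol=1e-9):
    """samples: (brightness, distance) pairs for the bright lines picked
    out by one candidate straight line.  Brightness falls off as
    1 / distance**2, so brightness * distance**2 must agree for a real
    object point seen through several focusing systems."""
    corrected = [b * d ** 2 for b, d in samples]
    return max(corrected) - min(corrected) <= tol
```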
It is to be noted that the prior art using a pattern matching technique determines corresponding points on the assumption that bright lines on the sensing surface are all equal in brightness.
After the corresponding points have been determined in the above-mentioned manner, both the information on the positions, on the sensing surface, of at least two of the three bright lines indicated by the corresponding points, and the information on the geometrical positional relation between the focusing systems and the sensing surface, are used for obtaining the two-dimensional distance information of an object point, that is, the distance between the object point and the sensing surface in the direction of depth and one or two horizontal azimuth angles. The two-dimensional distance information thus obtained is combined with the vertical azimuth angle information, which is obtained independently of the above two-dimensional distance information, to obtain the three-dimensional distance information of the object point.
The processing for obtaining each of the two-dimensional distance information and the three-dimensional distance information is processing based upon simple algebraic equations, and can be readily performed in a short time. The whole quantity of processing necessary for obtaining the three-dimensional distance information of an object system is proportional to the number of object points included in the object system.
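For a pinhole-style model (an assumption for illustration; the patent does not fix a lens model), the algebra reduces to similar triangles: depth from the disparity of two corresponding image points and the baseline between their focusing systems, azimuth from the position of one image point:

```python
import math

def two_dimensional_info(x1, x2, baseline, focal_length):
    """x1, x2: positions of corresponding image points on the sensing
    surface, each measured from its own focusing-system axis.  Returns
    the depth and a horizontal azimuth angle (radians); all names are
    hypothetical."""
    disparity = x1 - x2                       # shrinks as depth grows
    depth = focal_length * baseline / disparity
    azimuth = math.atan2(x1, focal_length)    # angle seen by the first system
    return depth, azimuth
```

Combining this with the independently obtained vertical azimuth angle yields the full three-dimensional distance information of the object point.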
Now, an explanation will be made of a first embodiment of a method of forming a three-dimensional stereo vision in accordance with the present invention, by reference to Fig. 4.
The first embodiment is carried out by an apparatus which, as shown in Fig. 4, includes image focusing systems 11 to 13, each formed of a lens or mirror, image sensing surfaces 21 to 23, a detected information preprocessing system 30, a detected information storage system 40, a candidate-for-corresponding-point decision system 50, a false-corresponding-point removing system 60, a three-dimensional distance information calculation system 70, and a configuration change system 80 for changing the configuration of the image focusing systems 11 to 13.
The image focusing systems 11 to 13 are not always required to lie on a straight line parallel to the sensing surfaces 21 to 23, but the positions of the image focusing systems can be changed by the configuration change system 80 so that a plane containing the image focusing systems 11 to 13 makes a desired angle with the image sensing surfaces 21 to 23.
The images of an object system which are formed on the sensing surfaces 21 to 23 by the focusing systems 11 to 13 are sent to the detected information preprocessing system 30, which corrects errors due to distortions and various aberrations in the focusing systems, and extracts bright lines which satisfy a brightness or color condition required for extracting edges or obtaining three-dimensional distance information, if necessary. The image information thus obtained is sent to the detected information storage system 40. When it is necessary to reduce false corresponding points by using four or more image focusing systems, the configuration of the focusing systems 11 to 13 and sensing surfaces 21 to 23 is changed by the configuration change system 80, and the image information obtained on the basis of the new configuration is also sent to the detected information storage system 40 through the preprocessing system 30. Next, the candidate-for-corresponding-point decision system 50 extracts candidates for corresponding points from the multiplicity of image points stored in the storage system 40, by making use of a special geometrical relation, as mentioned in "SUMMARY OF THE
INVENTION". The extracted candidates are sent to the false-corresponding-point removing system 60, in which candidates corresponding to a false (namely, non-existing) object point are removed from the extracted candidates by making use of a brightness condition or others, as mentioned in "SUMMARY OF THE INVENTION". The remaining candidates are
sent to the three-dimensional distance information calculation system 70, to obtain the three-dimensional distance information of the object system in the manner mentioned in "SUMMARY OF THE INVENTION".
According to the apparatus of Fig. 4, the configuration of the focusing systems 11 to 13 and sensing surfaces 21 to 23 can be freely changed, and moreover, the information obtained on the basis of the new configuration is also stored. Thus, the apparatus of Fig. 4 can produce the same effect as an apparatus provided with four or more focusing systems, and hence can greatly reduce the redundancy in extracting candidates for corresponding points, no matter what structure the object system may have.
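The dataflow of the Fig. 4 apparatus can be summarised as a four-stage pipeline. In this Python sketch the stage implementations are injected as functions; every name here is hypothetical, standing in for systems 30 to 70:

```python
def stereo_pipeline(raw_images, geometry, *, preprocess,
                    decide_candidates, remove_false_points, calculate_3d):
    """Fig. 4 dataflow: preprocessing (30) feeds storage (40); candidates
    are decided (50), false corresponding points are removed (60), and
    the three-dimensional distance information is calculated (70)."""
    stored = [preprocess(img) for img in raw_images]   # systems 30 and 40
    candidates = decide_candidates(stored, geometry)   # system 50
    survivors = remove_false_points(candidates)        # system 60
    return calculate_3d(survivors, geometry)           # system 70
```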
Next, explanation will be made of a second embodiment of a method of forming a three-dimensional stereo vision in accordance with the present invention, by reference to Fig. 5.
The second embodiment is carried out by an apparatus which, as shown in Fig. 5, includes color filters 101 to 103 corresponding to three primary colors, optical path length correction systems 112 and 113, mirrors 123, 133, 121 and 131, a preprocessing system 200, an image focusing system 300, a color image sensor 400, a horizontal addressing system 500, a brightness correcting circuit group 600, a brightness correction system 700, an AND
circuit group 800, a two-dimensional distance information calculation system 900, and a three-dimensional distance
The color filters 101, 102 and 103 transmit red light, green light and blue light, respectively, and correspond to three primary color filters mounted on the color image sensor 400. The color filters 101 to 103, the optical path length correction systems 112 and 113, and the mirrors 123, 133, 121 and 131 are disposed so that the light paths rom the color filters 101 to 103 to the color image sensor 400 have the same optical path length, and respective optical axes of three light beams from the color filters 101 to 103 lie in the same horizontal plane and are spaced apart from one another at the light receiving surface of the color image sensor 400 by an amount corresponding to the width (in a horizontal direc-tion) of each of the color filters mounted on the colorimage sensor 400. As shown in Fig. 5, each of the mirrors 133 and 131 can transmit light incident upon the back surface thereof ~eferring to Fig. 5, light which is emitted or ~0 re~lected from an object system and passes through an optical system which includes the color filters 101 to 103, the optical path length correction systems 112 and 113 and the mirrors 123, 133, 121 and 131, is subjected to necessary brightness modification processing and polarization processing in the preprocessing system 200, and then i5 focused on the color image sensor 400 by the image focusing system 300. The color image sensor 400 is able to be addressed by horizontal direction addressing system 500 ~5'~ ~
in respective colors, so that three pixels adjacent to one another in a horizontal direction and corresponding to three colors, which are provided with the red, green and blue filters, can be separately addressed by three address lines prepared independently for each color, and pixels arranged in a vertical direction and receiving light of the same color can be simultaneously addressed. When the addressing for the color image sensor 400 is made in a horizontal direction and in a predetermined order, the pixels corresponding to an address simultaneously deliver information corresponding to the color filters mounted on these pixels, and the above information is sent to the brightness correcting circuit group 600. The circuit group 600 includes three correcting circuits for each of the pixel layers arranged in a vertical direction. The output of each pixel included in the color image sensor 400 is modified in accordance with that circuit constant of each correcting circuit which is specified by the brightness correction system 700, and then three color outputs from three pixels adjacent to one another in a horizontal direction are applied to one of the AND circuits included in the AND circuit group 800. The brightness correction system 700 gives each correcting circuit of the brightness correcting circuit group 600 a circuit constant necessary for the correction of the brightness balance among the three primary colors, the brightness correction resulting from the characteristics of the object system, and the classification of brightness into a plurality of levels. Three color outputs having been
subjected to the brightness correction are applied to one AND circuit of the AND circuit group 800. When the three color outputs lie in a predetermined brightness range, the AND circuit delivers ON-information. When at least one of the three color outputs does not lie in the predetermined brightness range, the AND circuit delivers OFF-information.
The ON- or OFF-information thus obtained is applied to one two-dimensional distance information calculation circuit of the two-dimensional distance information calculation system 900. When ON-information is applied to the above calculation circuit, the presence of an object point is known, and the two-dimensional distance information of the object point is obtained on the basis of the address information from the horizontal addressing system 500 by the method mentioned in "SUMMARY OF THE INVENTION". When OFF-information is applied to the two-dimensional distance information calculation circuit, the absence of an object point is indicated, and thus no processing is carried out. The above two-dimensional distance information obtained in each of the pixel layers arranged in the vertical direction is sent to the three-dimensional distance information calculation system 1000, to be used for obtaining the three-dimensional distance information of the object system.
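The decision logic of one AND circuit, and its gating of the distance calculation, can be sketched as follows (the brightness range and all names are assumptions illustrating the circuit's logic, not its hardware implementation):

```python
def and_circuit(red, green, blue, lo, hi):
    """ON (True) only when every corrected colour output lies inside the
    predetermined brightness range [lo, hi]; otherwise OFF (False)."""
    return all(lo <= v <= hi for v in (red, green, blue))

def addresses_with_object_points(outputs, lo, hi):
    """Only ON pixel triples trigger a two-dimensional distance
    calculation; OFF triples are skipped entirely."""
    return [i for i, rgb in enumerate(outputs) if and_circuit(*rgb, lo, hi)]
```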
As can be seen from the above explanation, according to the second embodiment, data from pixel layers arranged in the vertical direction are processed in parallel, and hence the processing time is greatly shortened.
Further, the whole processing can be carried out by hardware, and hence not only are the productivity and reliability improved, but the processing speed is also increased.
Furthermore, only the light receiving surface of a single color image sensor is used as the image sensing surface, and hence the apparatus for carrying out the second embodiment is simple in structure and can be constructed at a low cost.
As has been explained in the foregoing, according to the present invention, a method of forming a three-dimensional stereo vision is provided in which the processing time is short, and the fundamental error dependent upon the spatial distribution of object points can be avoided.
Further, the above-mentioned method according to the present invention has the following advantages.
(1) Even when an object system has a special brightness pattern, precise distance information can be formed by appropriately setting the number and configuration of image focusing systems.
(2) When brightness information, color information and others are used together with the position information of each image point, unwanted data is removed, and thus the processing can be performed more accurately and at a higher speed.
(3) Unlike a conventional pattern matching technique in which the whole of an object system is processed at once, it is possible to specify a plurality of limited ranges of distance in the direction of depth, a
plurality of limited ranges of horizontal azimuth angle and a plurality of limited ranges of vertical azimuth angle in a desired order. Accordingly, only the three-dimensional distance information of that portion of an object system which exists in the specified, limited ranges of the above factors can be obtained, and further the presence or absence of an object point in the above portion can be judged. Thus, efficient processing can be made.
(4) Unlike the conventional pattern matching technique in which the whole of an object system is processed at once, it is possible to determine the corresponding points for an object point independently of the corresponding points for another object point. Accordingly, even when the image information contains a local error, only a portion of the processing for obtaining three-dimensional distance information is affected by the error, and the accurate three-dimensional distance information of object points which have no connection with the erroneous image information can be obtained.
(5) The processing speed can be greatly improved by using parallel processing at a portion of the whole processing.
(6) When preprocessing, such as the extraction of edges included in an object system, is carried out, the number of bright lines used is greatly reduced, and the processing speed is further improved.
(7) The number of image focusing systems and the number of image sensing areas are required to be three or more in effect. Accordingly, an apparatus for carrying out the method according to the present invention may be constructed so that a single image focusing system and a single image sensing surface can be used in a time-divisional fashion.
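The correspondence principle underlying the above advantages can be illustrated with a small sketch. Assume three pinhole-model image focusing systems spaced at equal intervals along a line parallel to the image sensing surface, with focal length f and spacing B (these parameter names are illustrative, not taken from the patent text). The three image points of a real object point then have equal successive disparities, so that, when the object images are superimposed with the spatial relationship of the focusing systems, the image points lie on one straight line; a spurious combination of image points fails this test.

```python
# Collinearity test for corresponding image points, assuming three
# pinhole cameras with centers at x = 0, B, 2B (equal spacing), focal
# length f, and sensing areas parallel to the line of centers.
# All names (project, is_corresponding, f, B) are illustrative.

def project(obj_x, obj_z, cam_x, f):
    """Image coordinate of object point (obj_x, obj_z) in the camera at cam_x."""
    return f * (obj_x - cam_x) / obj_z

def is_corresponding(u0, u1, u2, tol=1e-9):
    """True if the three image points can come from a single object point.

    For equally spaced cameras the disparities u0 - u1 and u1 - u2 are
    equal, i.e. the three points lie on a straight line when drawn on
    superimposed imaginary planes offset by the camera spacing.
    """
    return abs((u0 - u1) - (u1 - u2)) < tol

f, B = 1.0, 0.1                                       # focal length, spacing
u = [project(0.5, 2.0, i * B, f) for i in range(3)]   # one object point
print(is_corresponding(*u))                           # True: a real triple
print(is_corresponding(u[0], u[1], u[2] + 0.01))      # False: a spurious triple
```

Because the test is a local check on one candidate triple, each object point can be processed independently of the others, which is exactly what permits the localized error behaviour of item (4) and the parallelism of item (5).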
In the foregoing explanation, the image of an object system is formed by an optical system. However, the present invention is not limited to the optical system, but is applicable to various propagation systems such as electromagnetic wave systems other than the optical system, an acoustic system, an elastic wave system, and a particle beam system (for example, the electron beam of an electron microscope).
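Once corresponding image points have been found, the three-dimensional distance information follows from elementary triangulation on any two of them. The sketch below assumes a parallel-axis pinhole model with baseline B and focal length f (illustrative parameters and function names, not taken from the patent text) and recovers the depth and horizontal position of an object point from two corresponding image points:

```python
def triangulate(u_left, u_right, baseline, f):
    """Recover (x, z) of an object point from two corresponding image
    points u_left and u_right, measured on two sensing areas whose
    focusing systems are separated by `baseline` and lie in a plane
    parallel to the sensing surface at focal length `f`.
    """
    disparity = u_left - u_right      # shift between the two image points
    z = f * baseline / disparity      # depth, from similar triangles
    x = u_left * z / f                # horizontal position via the left camera
    return x, z

# A point at x = 0.5, z = 2.0 seen by cameras at x = 0 and x = 0.1
# projects to u_left = 0.25 and u_right = 0.2 (with f = 1.0):
x, z = triangulate(0.25, 0.2, baseline=0.1, f=1.0)
print(x, z)   # recovers 0.5, 2.0
```

The depth resolution degrades as disparity shrinks, which is why the limited depth ranges of item (3) translate directly into limited disparity ranges to be searched.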
Claims (28)
1. A method of obtaining three-dimensional distance information, comprising the steps of:
forming images of an object system on an image sensing surface by using at least three image focusing systems so that at least three object images of the object system are formed on respective image sensing areas;
correlating the image points of an object point which are formed on said respective image sensing areas by said image focusing systems by using a relationship which exists between the geometrical positional relationships among said image points and the geometrical positional relationships among said image focusing systems themselves and between said image focusing systems and said image sensing areas; and obtaining the three-dimensional distance information of said object point by using information relating to the positions of at least two of the corresponding image points on the image sensing areas and information relating to the geometrical positional relationship between said two corresponding image points and the image focusing systems, wherein said correlating of the image points comprises allocating the object images formed on said respective image sensing areas to respective imaginary planes which are superimposed with a spatial relationship corresponding to the spatial relationship of said image focusing systems, and detecting those image points on the imaginary planes which lie on the same straight line.
2. A method according to claim 1, wherein each of said image sensing areas is formed on the same one-dimensional image sensing surface.
3. A method according to claim 1, wherein a single image focusing system and a single image sensing surface are fixed successively at least at three different positions and the image of said object system is formed on said image sensing surface so as to produce at least three object images by using the formed image data on said image sensing surface.
4. A method according to claim 1, wherein a single image focusing system is fixed successively at different positions so that the image of said object system can be formed on each of at least three image sensing surfaces.
5. A method according to claim 1, wherein said image focusing systems are arranged on a straight line.
6. A method according to claim 1, wherein said image focusing systems are arranged in a plane surface.
7. A method according to claim 1, wherein said image focusing systems are arranged in a curved surface.
8. A method according to claim 1, wherein said image focusing systems are arranged on a straight line at regular intervals.
9. A method of obtaining three-dimensional distance information, comprising the steps of:
forming images of an object system on an image sensing surface by using at least three image focusing systems so that at least three object images of the object system are formed on respective image sensing areas;
correlating the image points of an object point which are formed on said respective image sensing areas by said image focusing systems by using a relationship which exists between the geometrical positional relationships among said image points and the geometrical positional relationships among said image focusing systems themselves and between said image focusing systems and said image sensing areas;
and obtaining the three-dimensional distance information of said object point by using information relating to the positions of at least two of the corresponding image points on the image sensing areas and information relating to the geometrical positional relationship between said two corresponding image points and the image focusing systems, wherein said image focusing systems are arranged in a plane surface so as to form a regular polygon.
10. A method of obtaining three-dimensional distance information, comprising the steps of:
forming images of an object system on an image sensing surface by using at least three image focusing systems so that at least three object images of the object system are formed on respective image sensing areas;
correlating the image points of an object point which are formed on said respective image sensing areas by said image focusing systems by using a relationship which exists between the geometrical positional relationships among said image points and the geometrical positional relationships among said image focusing systems themselves and between said image focusing systems and said image sensing areas;
and obtaining the three-dimensional distance information of said object point by using information relating to the positions of at least two of the corresponding image points on the image sensing areas and information relating to the geometrical positional relationship between said two corresponding image points and the image focusing systems, wherein said image focusing systems are arranged on a straight line at random.
11. A method of obtaining three-dimensional distance information, comprising the steps of:
forming images of an object system on an image sensing surface by using at least three image focusing systems so that at least three object images of the object system are formed on respective image sensing areas;
correlating the image points of an object point which are formed on said respective image sensing areas by said image focusing systems by using a relationship which exists between the geometrical positional relationships among said image points and the geometrical positional relationships among said image focusing systems themselves and between said image focusing systems and said image sensing areas;
and obtaining the three-dimensional distance information of said object point by using information relating to the positions of at least two of the corresponding image points on the image sensing areas and information relating to the geometrical positional relationship between said two corresponding image points and the image focusing systems, wherein said image focusing systems are arranged in a plane surface so as to form an ordinary polygon.
12. A method of obtaining three-dimensional distance information, comprising the steps of:
forming images of an object system on an image sensing surface by using at least three image focusing systems so that at least three object images of the object system are formed on respective image sensing areas;
correlating the image points of an object point which are formed on said respective image sensing areas by said image focusing systems by using a relationship which exists between the geometrical positional relationships among said image points and the geometrical positional relationships among said image focusing systems themselves and between said image focusing systems and said image sensing areas;
and obtaining the three-dimensional distance information of said object point by using information relating to the positions of at least two of the corresponding image points on the image sensing areas and information relating to the geometrical positional relationship between said two corresponding image points and the image focusing systems, wherein said image focusing systems are arranged in a plane surface parallel to said image sensing surfaces.
13. A method of obtaining three-dimensional distance information, comprising the steps of:
forming images of an object system on an image sensing surface by using at least three image focusing systems so that at least three object images of the object system are formed on respective image sensing areas;
correlating the image points of an object point which are formed on said respective image sensing areas by said image focusing systems by using a relationship which exists between the geometrical positional relationships among said image points and the geometrical positional relationships among said image focusing systems themselves and between said image focusing systems and said image sensing areas;
and obtaining the three-dimensional distance information of said object point by using information relating to the positions of at least two of the corresponding image points on the image sensing areas and information relating to the geometrical positional relationship between said two corresponding image points and the image focusing systems, wherein said image focusing systems are arranged in a plane surface which is oblique with respect to said image sensing surfaces.
14. A method of obtaining three-dimensional distance information, comprising the steps of:
forming images of an object system on an image sensing surface by using at least three image focusing systems so that at least three object images of the object system are formed on respective image sensing areas;
correlating the image points of an object point which are formed on said respective image sensing areas by said image focusing systems by using a relationship which exists between the geometrical positional relationships among said image points and the geometrical positional relationships among said image focusing systems themselves and between said image focusing systems and said image sensing areas;
and obtaining the three-dimensional distance information of said object point by using information relating to the positions of at least two of the corresponding image points on the image sensing areas and information relating to the geometrical positional relationship between said two corresponding image points and the image focusing systems, wherein a plurality of operations corresponding to a range of vertical azimuth angles for correlating points are performed in parallel.
15. A method of obtaining three-dimensional distance information, comprising the steps of:
forming images of an object system on an image sensing surface by using at least three image focusing systems so that at least three object images of the object system are formed on respective image sensing areas;
correlating the image points of an object point which are formed on said respective image sensing areas by said image focusing systems by using a relationship which exists between the geometrical positional relationships among said image points and the geometrical positional relationships among said image focusing systems themselves and between said image focusing systems and said image sensing areas;
and obtaining the three-dimensional distance information of said object point by using information relating to the positions of at least two of the corresponding image points on the image sensing areas and information relating to the geometrical positional relationship between said two corresponding image points and the image focusing systems, wherein a plurality of operations corresponding to a range of horizontal azimuth angles and a range of vertical azimuth angles for correlating points are performed in parallel.
16. A method of obtaining three-dimensional distance information, comprising the steps of:
forming images of an object system on an image sensing surface by using at least three image focusing systems so that at least three object images of the object system are formed on respective image sensing areas;
correlating the image points of an object point which are formed on said respective image sensing areas by said image focusing systems by using a relationship which exists between the geometrical positional relationships among said image points and the geometrical positional relationships among said image focusing systems themselves and between said image focusing systems and said image sensing areas;
and obtaining the three-dimensional distance information of said object point by using information relating to the positions of at least two of the corresponding image points on the image sensing areas and information relating to the geometrical positional relationship between said two corresponding image points and the image focusing systems, wherein corresponding points are determined in a limited range of vertical azimuth angles and in a limited range of horizontal azimuth angles.
17. A method of obtaining three-dimensional distance information, comprising the steps of:
forming images of an object system on an image sensing surface by using at least three image focusing systems so that at least three object images of the object system are formed on respective image sensing areas;
correlating the image points of an object point which are formed on said respective image sensing areas by said image focusing systems by using a relationship which exists between the geometrical positional relationships among said image points and the geometrical positional relationships among said image focusing systems themselves and between said image focusing systems and said image sensing areas;
and obtaining the three-dimensional distance information of said object point by using information relating to the positions of at least two of the corresponding image points on the image sensing areas and information relating to the geometrical positional relationship between said two corresponding image points and the image focusing systems, wherein a plurality of limited ranges of vertical azimuth angles and a plurality of limited ranges of horizontal azimuth angles are specified in a desired order, and corresponding points are obtained in each specified, limited range of vertical azimuth angles and in each specified, limited range of horizontal azimuth angles.
18. A method of obtaining three-dimensional distance information, comprising the steps of:
forming images of an object system on an image sensing surface by using at least three image focusing systems so that at least three object images of the object system are formed on respective image sensing areas;
correlating the image points of an object point which are formed on said respective image sensing areas by said image focusing systems by using a relationship which exists between the geometrical positional relationships among said image points and the geometrical positional relationships among said image focusing systems themselves and between said image focusing systems and said image sensing areas;
and obtaining the three-dimensional distance information of said object point by using information relating to the positions of at least two of the corresponding image points on the image sensing areas and information relating to the geometrical positional relationship between said two corresponding image points and the image focusing systems, wherein image points corresponding to a non-existing object point are removed on the basis of brightness information obtained on said image sensing areas.
19. An apparatus for obtaining three-dimensional distance information comprising:
three color filtering means corresponding to three primary colors and spaced apart from one another, for dividing light from an object system into three light beams;
an image focusing system for forming the image of an object system by using said light beams of said object system received from said color filtering means;
image sensing means formed of a color image sensor, said color image sensor being provided with three kinds of color filters corresponding to said three color filtering means in such a manner that the three color filters are repeatedly mounted on pixels which form said color image sensor, in a horizontal direction, to detect said three light beams separately, said pixels being scanned in a horizontal direction;
optical deflection means for making the optical axes of said light beams from said three color filtering means parallel to the axis of said image focusing system, for juxtaposing said optical axes of said light beams in a direction parallel to a horizontal pixel layer of said color image sensor so that said optical axes deviate from one another, and for providing the same optical path length between each of said three color filtering means and the image sensing surface of said image sensing means;
horizontal addressing means for taking out information from pixels included in a horizontal pixel layer of said color image sensor in such a manner that a plurality of pixel groups each including three pixels which are provided with said three color filters, are successively addressed in a predetermined order, and the outputs of said three pixels of each pixel group are simultaneously taken out;
detection means for delivering ON-information when said three light beams from said three color filtering means have a brightness value within a predetermined range, at those three pixels of a horizontal pixel layer which are addressed by said horizontal addressing means, and for delivering OFF-information when at least one of said three light beams does not have a brightness value within said predetermined range at one of said three pixels;
two-dimensional distance information calculation means for obtaining two-dimensional distance information which includes the distance from said image sensing surface to said object system in the direction of depth and at least one horizontal azimuth angle, from the horizontal address information on said color image sensor; and three-dimensional distance information calculation means for obtaining three-dimensional distance information which includes said two-dimensional distance information of said object system and a vertical azimuth angle, by using the vertical address information in said color image sensor.
20. An apparatus according to claim 19, further comprising a brightness correction system for effecting brightness correction for outputs of three pixels which are included in one pixel group and provided with said three color filters.
21. An apparatus according to claim 19, wherein a contour extracting element is disposed in front of said image sensing surface.
22. An apparatus according to claim 19, wherein three polarizers and three analyzers for detecting said three light beams separately perform substantially the same function as said three color filtering means and said three color filters, respectively.
23. A method of obtaining three-dimensional distance information, comprising the steps of:
forming images of an object system on an image sensing surface by using at least three image focusing systems so that at least three object images of the object system are formed on respective image sensing areas;
correlating the image points of an object point which are formed on said respective image sensing areas by said image focusing systems by using a relationship which exists between the geometrical positional relationships among said image points and the geometrical positional relationships among said image focusing systems themselves and between said image focusing systems and said image sensing areas;
and obtaining the three-dimensional distance information of said object point by using information relating to the positions of at least two of the corresponding image points on the image sensing areas and information relating to the geometrical positional relationship between said two corresponding image points and the image focusing systems, wherein correlating of the image points comprises detecting image points in respective image sensing areas which satisfy the relationship that the ratio of the distance between first and second image points of the same object point in respective first and second image sensing areas to the distance between said second and a third image point of the same object point in respective second and third image sensing areas is equal to the ratio of the distance between first and second focusing systems to the distance between said second and a third focusing system.
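The ratio condition recited in claim 23 can be sketched in code. This is a simplified model with illustrative names and coordinates (not from the patent), assuming three collinear, identical focusing systems and image-point positions expressed along one common horizontal axis:

```python
def correlate_by_ratio(p1, p2, p3, lens_x, tol=1e-6):
    """Return True when three candidate image points (x-coordinates in a
    common frame on the sensing surface) can be images of one object point.

    For a lens at L and an object point at (X, Z), the image lies at
    x = L * (1 + f/Z) - X * f/Z, so differences between image points are
    the lens baselines scaled by (1 + f/Z); the depth-dependent factor
    cancels in the ratio, which is the relationship the claim exploits.
    """
    d12 = p1 - p2                  # image-point separation, areas 1-2
    d23 = p2 - p3                  # image-point separation, areas 2-3
    b12 = lens_x[0] - lens_x[1]    # baseline, focusing systems 1-2
    b23 = lens_x[1] - lens_x[2]    # baseline, focusing systems 2-3
    # cross-multiplied form of d12/d23 == b12/b23 avoids division by zero
    return abs(d12 * b23 - d23 * b12) < tol
```

Because the depth term cancels, the test holds for any object distance, which is what makes it usable for correspondence matching before depth is known.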
24. A method of obtaining three-dimensional distance information, comprising the steps of:
forming images of an object on at least one image sensing surface by using at least three image focusing systems so that at least three object images of the object are formed on said image sensing surface;
determining correspondence between image points of said images formed on said image sensing surface and object points on said object by using a geometrical relationship in which a shape is formed by joining said at least three image focusing systems, which shape moves toward said image sensing surface in parallel while maintaining a similar form so as to form a similar shape disposed between said shape and said image sensing surface, each of the image points corresponding to the object points existing on points of intersection of said image sensing surface and straight lines passing through vertices of said shape and the corresponding vertices of said similar shape; and obtaining three-dimensional distance information of said object points by using information relating to the positions of at least two of said image points formed on said image sensing surface corresponding to a certain object point and information relating to the geometrical positional relationship between said two corresponding image points and said image focusing systems.
25. A method according to claim 24, wherein said at least three image focusing systems are arranged on a line parallel with said image sensing surface, and wherein determination of said correspondence is made by using the geometrical relationship that the ratio between distances of adjacent image focusing systems is the same as the ratio of distances between the image points corresponding to the certain object point.
26. A method according to claim 24, wherein said at least three image focusing systems are arranged on a common plane and said plane is perpendicular to said image sensing surface.
27. A method according to claim 24, wherein said at least three image focusing systems are arranged on a common plane parallel with said image sensing surface so that all of the image points corresponding to a certain object point exist on vertices of a figure on said image sensing surface, said figure being similar to said shape formed on said plane.
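The similar-figure condition of claim 27 admits a comparable sketch. Hypothetical names and coordinates; the assumption is that the focusing systems lie in a plane parallel to the sensing surface, so the image points of one object point form a translated, uniformly scaled copy of the figure formed by the lenses:

```python
import math

def is_similar_figure(img_pts, lens_pts, tol=1e-6):
    """Return True when the image points form a translated, uniformly
    scaled (same-orientation) copy of the figure formed by the lenses.

    img_pts, lens_pts: lists of (x, y) tuples of equal length >= 3.
    """
    # edge vectors of each figure, taken from its first vertex
    iv = [(x - img_pts[0][0], y - img_pts[0][1]) for x, y in img_pts[1:]]
    lv = [(x - lens_pts[0][0], y - lens_pts[0][1]) for x, y in lens_pts[1:]]
    s_lens = math.hypot(*lv[0])
    if s_lens == 0:                  # degenerate lens figure
        return False
    k = math.hypot(*iv[0]) / s_lens  # uniform scale factor
    # every image edge must equal the matching lens edge scaled by k
    return all(abs(ix - k * lx) < tol and abs(iy - k * ly) < tol
               for (ix, iy), (lx, ly) in zip(iv, lv))
```

Only same-orientation similarity is tested, matching the claim's requirement that the lens shape moves toward the sensing surface "in parallel while maintaining a similar form".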
28. An apparatus for obtaining three-dimensional distance information comprising:
three color filtering means corresponding to three primary colors and spaced apart from one another, for dividing light from an object system into three light beams;
an image focusing system for forming the image of an object system by using said light beams of said object system received from said color filtering means;
image sensing means formed of a color image sensor, said color image sensor being provided with three kinds of color filters corresponding to said three color filtering means in such a manner that the three color filters are repeatedly mounted on pixels which form said color image sensor, in a horizontal direction, to detect said three light beams separately, said pixels being scanned in a horizontal direction;
optical deflection means for making the optical axes of said light beams from said three color filtering means parallel to the axis of said image focusing system, for juxtaposing said optical axes of said light beams in a direction parallel to a horizontal pixel layer of said color image sensor so that said optical axes deviate from one another, and for providing the same optical path length between each of said three color filtering means and the image sensing surface of said image sensing means;
horizontal addressing means for taking out information from pixels included in a horizontal pixel layer of said color image sensor in such a manner that a plurality of pixel groups each including three pixels which are provided with said three color filters, are successively addressed in a predetermined order, and the outputs of said three pixels of each pixel group are simultaneously taken out;
detection means for delivering ON-information when said three light beams from said three color filtering means have a brightness value within a predetermined range, at those three pixels of a horizontal pixel layer which are addressed by said horizontal addressing means, and for delivering OFF-information when at least one of said three light beams does not have a brightness value within said predetermined range at one of said three pixels;
two-dimensional distance information calculation means for obtaining two-dimensional distance information which includes two horizontal azimuth angles, from the horizontal address information on said color image sensor;
and three dimensional distance information calculation means for obtaining three-dimensional distance information which includes said two-dimensional distance information of said object system and a vertical azimuth angle, by using the vertical address information on said color image sensor.
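The step from two horizontal azimuth angles to depth is ordinary triangulation. A minimal sketch with illustrative names (not from the patent); angles are in radians, measured from the common sensor normal at two viewpoints a known baseline apart:

```python
import math

def depth_from_azimuths(theta1, theta2, baseline):
    """Triangulate depth from two horizontal azimuth angles measured at
    viewpoints separated by `baseline` along the horizontal axis.

    With the object at (X, Z): tan(theta1) = X / Z and
    tan(theta2) = (X - baseline) / Z, so by subtraction
    Z = baseline / (tan(theta1) - tan(theta2)).
    """
    disparity = math.tan(theta1) - math.tan(theta2)
    if disparity == 0:
        raise ValueError("zero angular disparity: object at infinity")
    return baseline / disparity
```

The vertical azimuth angle then extends the result to full three-dimensional coordinates in the same way, using the vertical address on the sensor.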
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP60080159A JPH07109625B2 (en) | 1985-04-17 | 1985-04-17 | 3D stereoscopic method |
| JP80159/85 | 1985-04-17 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CA1255788A true CA1255788A (en) | 1989-06-13 |
Family
ID=13710522
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CA000506640A Expired CA1255788A (en) | 1985-04-17 | 1986-04-15 | Method and apparatus for 3-dimensional stereo vision |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US4792694A (en) |
| EP (1) | EP0199269B2 (en) |
| JP (1) | JPH07109625B2 (en) |
| KR (1) | KR950001578B1 (en) |
| CA (1) | CA1255788A (en) |
| DE (1) | DE3683524D1 (en) |
Families Citing this family (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPS6473468A (en) * | 1987-09-14 | 1989-03-17 | Sony Corp | Image processor |
| JP2681941B2 (en) * | 1987-09-14 | 1997-11-26 | ソニー株式会社 | Image processing device |
| US4965840A (en) * | 1987-11-27 | 1990-10-23 | State University Of New York | Method and apparatus for determining the distances between surface-patches of a three-dimensional spatial scene and a camera system |
| US5193124A (en) * | 1989-06-29 | 1993-03-09 | The Research Foundation Of State University Of New York | Computational methods and electronic camera apparatus for determining distance of objects, rapid autofocusing, and obtaining improved focus images |
| US5231443A (en) * | 1991-12-16 | 1993-07-27 | The Research Foundation Of State University Of New York | Automatic ranging and automatic focusing |
| IL115971A (en) * | 1995-11-14 | 1997-01-10 | Razon Moshe | Computer stereo vision system and method |
| JP3852070B2 (en) * | 1999-11-11 | 2006-11-29 | 富士通株式会社 | Optical path simulation CAD apparatus and method |
| US7170677B1 (en) | 2002-01-25 | 2007-01-30 | Everest Vit | Stereo-measurement borescope with 3-D viewing |
| EP1706702A2 (en) * | 2003-12-21 | 2006-10-04 | KREMEN, Stanley H. | System and apparatus for recording, transmitting, and projecting digital three-dimensional images |
| JP2007124088A (en) * | 2005-10-26 | 2007-05-17 | Olympus Corp | Image photographing device |
| US20080298674A1 (en) * | 2007-05-29 | 2008-12-04 | Image Masters Inc. | Stereoscopic Panoramic imaging system |
| CN103026171B (en) * | 2011-05-27 | 2016-03-16 | 松下电器产业株式会社 | Image processing apparatus and image processing method |
| US9185391B1 (en) | 2014-06-17 | 2015-11-10 | Actality, Inc. | Adjustable parallax distance, wide field of view, stereoscopic imaging system |
| JP6084192B2 (en) * | 2014-10-15 | 2017-02-22 | 本田技研工業株式会社 | Object recognition device |
| CN109842758B (en) * | 2017-11-29 | 2022-06-07 | 超威半导体公司 | computing sensor |
| FR3080937B1 (en) | 2018-05-03 | 2021-06-04 | Commissariat Energie Atomique | REAL-TIME DISTANCE RECOGNITION METHOD AND DEVICE |
Family Cites Families (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US3770887A (en) * | 1972-01-31 | 1973-11-06 | Us Navy | Color stereo television |
| JPS51142212A (en) * | 1975-06-02 | 1976-12-07 | Hokkaido Daigaku | Tridimensional television system |
| JPS52119818A (en) * | 1976-04-01 | 1977-10-07 | Kiyoshi Nagata | Stereoscopic color televisition transmitter*receiver system and apparatus therefor |
| US4142138A (en) * | 1977-05-02 | 1979-02-27 | E. I. Du Pont De Nemours And Company | Motor control |
| NL7708399A (en) * | 1977-07-28 | 1979-01-30 | Univ Moskovsk | METHOD FOR STEREOSCOPIC COLOR TV. |
| US4217602A (en) * | 1979-02-12 | 1980-08-12 | Lady Bea Enterprises, Inc. | Method and apparatus for generating and processing television signals for viewing in three dimensions |
| US4259589A (en) * | 1979-07-20 | 1981-03-31 | Solid Photography, Inc. | Generation of contiguous data files of three-dimensional information |
| FR2498402A1 (en) * | 1981-01-16 | 1982-07-23 | Centre Nat Rech Scient | METHOD AND DEVICE FOR TRI-DIMENSIONAL VISUALIZATION FROM VIDEO-SIGNALS, IN PARTICULAR FOR ELECTRON MICROSCOPY |
| NL8202934A (en) * | 1982-07-21 | 1984-02-16 | Philips Nv | DEVICE FOR DISPLAYING THREE-DIMENSIONAL IMAGES. |
| US4654872A (en) * | 1983-07-25 | 1987-03-31 | Omron Tateisi Electronics Co. | System for recognizing three-dimensional objects |
| JPS6054081A (en) * | 1983-09-03 | 1985-03-28 | Omron Tateisi Electronics Co | Device for recognizing characteristic point of picture in binocular eyesight |
-
1985
- 1985-04-17 JP JP60080159A patent/JPH07109625B2/en not_active Expired - Lifetime
-
1986
- 1986-04-15 CA CA000506640A patent/CA1255788A/en not_active Expired
- 1986-04-15 DE DE8686105212T patent/DE3683524D1/en not_active Expired - Lifetime
- 1986-04-15 EP EP86105212A patent/EP0199269B2/en not_active Expired - Lifetime
- 1986-04-16 KR KR1019860002914A patent/KR950001578B1/en not_active Expired - Fee Related
- 1986-04-17 US US06/853,231 patent/US4792694A/en not_active Expired - Lifetime
Also Published As
| Publication number | Publication date |
|---|---|
| EP0199269A3 (en) | 1988-07-20 |
| KR860008462A (en) | 1986-11-15 |
| JPH07109625B2 (en) | 1995-11-22 |
| EP0199269B2 (en) | 1996-03-13 |
| US4792694A (en) | 1988-12-20 |
| DE3683524D1 (en) | 1992-03-05 |
| KR950001578B1 (en) | 1995-02-25 |
| JPS61240376A (en) | 1986-10-25 |
| EP0199269B1 (en) | 1992-01-22 |
| EP0199269A2 (en) | 1986-10-29 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CA1255788A (en) | Method and apparatus for 3-dimensional stereo vision | |
| CA2101996C (en) | Validation of optical ranging of a target surface in a cluttered environment | |
| CA1233234A (en) | Optical three-dimensional digital data acquisition system | |
| US6304298B1 (en) | Method and apparatus for determining the position of a TV camera for use in a virtual studio | |
| US5559322A (en) | Imaging optical tracker | |
| EP0523152B1 (en) | Real time three dimensional sensing system | |
| JP3220179B2 (en) | Three-dimensional color imaging method and apparatus | |
| US4253112A (en) | Process for automatic alignment of two objects to be adjusted with respect to one another | |
| Goshtasby | Correction of image deformation from lens distortion using bezier patches | |
| Magee et al. | Determining the position of a robot using a single calibration object | |
| US5703677A (en) | Single lens range imaging method and apparatus | |
| JPH0666241B2 (en) | Position detection method | |
| Ahmadabadian et al. | Image selection in photogrammetric multi-view stereo methods for metric and complete 3D reconstruction | |
| CN114742898B (en) | A laser radar and camera joint calibration method and system | |
| CA1290057C (en) | Method and apparatus for displaying moving objects | |
| JP2001356010A (en) | Three-dimensional shape measuring apparatus | |
| US5023724A (en) | Focus enhancing method | |
| JPH024030B2 (en) | ||
| JPH01311207A (en) | Three-dimensional shape measuring method | |
| Varderkooy et al. | Projective invariants and the correspondence problem | |
| US4822995A (en) | Linear array mirror system | |
| Tournas et al. | Orthophoto generation from unorganized point clouds | |
| Robson et al. | Image selection in photogrammetric multi-view stereo methods for metric and complete 3D reconstruction | |
| JPH03134778A (en) | Stereo picture processing method | |
| JPH0241581A (en) | Decision device for curved surface shape |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| MKEX | Expiry |