
EP2568253B1 - Structured-light measuring method and system - Google Patents


Info

Publication number
EP2568253B1
EP2568253B1 (application EP10850965.4A)
Authority
EP
European Patent Office
Prior art keywords
camera
matching
laser light
light spot
image position
Prior art date
Legal status
Active
Application number
EP10850965.4A
Other languages
German (de)
French (fr)
Other versions
EP2568253A4 (en)
EP2568253A1 (en)
Inventor
Danwei Shi
Di Wu
Wenchuang Zhao
Qi XIE
Current Assignee
Shenzhen Taishan Online Tech Co Ltd
Original Assignee
Shenzhen Taishan Online Tech Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Taishan Online Tech Co Ltd
Publication of EP2568253A1
Publication of EP2568253A4
Application granted
Publication of EP2568253B1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2545: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery
    • G06T7/521: Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10004: Still image; Photographic image
    • G06T2207/10012: Stereo images


Description

    TECHNICAL FIELD
  • The present application relates to a structured-light based measuring method and a structured-light based measuring system employing such a method, and particularly to a structured-light based measuring method and a structured-light based measuring system employing such a method with a high measurement precision.
  • TECHNICAL BACKGROUND
  • For a structured-light based measuring system, if a light beam spot and its corresponding coordinate Z are known, the coordinate of the cross point of the light beam on the surface of an object may be obtained. For this purpose, during the demarcation of the structured-light based measuring system, the light beams of the structured light are designed in such a way that the image distribution of one beam does not overlap that of a neighboring beam within a camera, and the respective image positions are recorded for various depths (i.e. the coordinates Z). During the practical measurement, the light beam corresponding to a known image point can then be determined quickly by finding the demarcated image point that is closest to the known image point, and an interpolating method may be used for determining the 3D coordinates of the known image point. In an existing structured-light based measuring system, to ensure that the image distributions of any two light beams at different depths do not overlap with each other, it is necessary to concentrate the image distribution of each light beam, so that a small image displacement corresponds to a large physical depth difference, which results in a low measuring precision.
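  • As a toy illustration of this demarcation-table lookup, the Python sketch below interpolates a depth from the recorded (image position, depth) samples of a single beam. All numbers are invented, and a real system demarcates both image coordinates, not just one; this is a minimal sketch of the prior-art principle, not the invention's method.

```python
import numpy as np

# Demarcated samples for one beam: image u-coordinate vs. physical depth z.
# Note how a 2-pixel displacement spans 500 mm of depth; this concentrated
# image distribution is exactly the precision limitation discussed above.
u_demarcated = np.array([100.0, 102.0, 104.0, 106.0])      # pixels
z_demarcated = np.array([1000.0, 1500.0, 2000.0, 2500.0])  # mm

def depth_from_image(u_observed: float) -> float:
    """Interpolate the depth of an observed image point from the
    demarcated (image position, depth) table of its beam."""
    return float(np.interp(u_observed, u_demarcated, z_demarcated))

print(depth_from_image(103.0))  # -> 1750.0 mm
```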
  • WO 2004/011876 discloses a stereoscopic optical scanner which includes a marker generator. A plurality of marker generators may be disposed along a circumference of an object in order to project the optical markers on the surface of the entire object. The pattern projector projects predetermined patterns so that 3D scan data of the object can be obtained. Namely, space-encoded beams are projected on the surface of the object. That is, each marker is identified by relative position information of markers computed by processors based on 3D scans.
  • WO 2008/120457 discloses a 3D image measurement apparatus that estimates the movement state of a non-static object from a captured full-illumination reflected image and a captured reflected pattern image.
  • US 5,852,672 describes a 3D measuring system in which six cameras are placed around the measured object and laser projection is used. Each set of stereo image pairs is processed by the computer to generate three-dimensional coordinates of the object surface at grid points, with the processing software establishing a correspondence between the grid points seen from the two camera perspectives. In said document, a long-coherence-length laser is divided into a pair of beams by a splitter, and a pair of spatial filters directs the resulting pair of beam fans to cross; the maximum intensity and minimum intensity at an intersection are then used to determine information about the object.
  • US 2005/018209 A1 discloses an optical 3D digitizing method with an enlarged non-ambiguity zone, comprising: controllably projecting a fringe pattern having a shiftable position over a target area; capturing images obtained by high depth resolution sensing and low depth resolution sensing from respective measurement fields at least partially overlapping each other over the target area; determining absolute pixel 3D positions in the images obtained by low depth resolution sensing and high depth resolution sensing as a function of relations depending on the fringe pattern in the captured images and correspondence between the absolute pixel 3D positions in the images; extracting chromatic texture from the captured images; and building a complete textured 3D model from the absolute pixel 3D positions and the chromatic texture.
  • US 5,061,062 discloses a focus spot size controller for a variable-depth triangulation ranging system. The ranging system includes an apparatus for emitting a light beam to be focused onto an object, a light-sensitive apparatus, a lens apparatus for imaging reflected light onto said light-sensitive apparatus, and an apparatus for calculating system geometry and range from signals received from the light-sensitive apparatus.
  • SUMMARY OF THE INVENTION
  • In view of the defect of a low measurement precision in the above structured-light based measuring system in the prior art, the technical problem to be addressed by the present invention is to provide a structured-light based measuring method and a structured-light based measuring system using such a method as defined by the independent claims 1 and 6.
  • To address the technical problem, the present invention provides a technical solution that includes a structured-light based measuring method including:
    • a matching process, which includes obtaining a sequence number and a low-precision depth of a laser light spot based on an image position of the laser light spot within a first camera according to a first mapping relationship in a demarcation database, searching for image positions of the laser light spot within a second camera according to the sequence number and the low-precision depth of the laser light spot to obtain candidates of matching point, and conducting matching according to the image position of the laser light spot within the first camera and the respective candidates of matching point within the second camera, to obtain a result of the matching; and
    • a calculating process, which includes obtaining an image position within the second camera that matches with the image position within the first camera according to the result of the matching, and determining a precise position of the laser light spot according to a second mapping relationship in the demarcation database.
  • In the inventive structured-light based measuring method, the demarcation database is obtained by a demarcating process including:
    demarcating the first mapping relationship between an image position of each laser light spot within the first camera and the sequence number as well as the low-precision depth of the laser light spot, demarcating the second mapping relationship between an image position of each laser light spot within the second camera and the sequence number as well as the high-precision depth of the laser light spot, and storing the demarcated first and second mapping relationships in a memory to form the demarcation database for the use by the matching process and the calculating process.
  • In the inventive structured-light based measuring method, during the demarcating process, a position of a laser output port relative to the first camera is adjusted to prevent image positions of any two laser light spots within the first camera from overlapping with each other.
  • In the inventive structured-light based measuring method, the distance between the second camera and the laser output port is larger than the distance between the first camera and the laser output port.
  • In the inventive structured-light based measuring method, the demarcating process and the matching process are performed in a condition that image positions at different depths of the same laser light spot are surrounded by a geometric region.
  • In the inventive structured-light based measuring method, the precise position of the laser light spot is obtained by an interpolating method applied on the image position in the second camera and the high-precision depth during the calculating process.
  • In the inventive structured-light based measuring method, conducting matching according to the image position of the laser light spot within the first camera and the respective candidates of matching point within the second camera during the matching process includes: searching for a reference matching pair according to a luminance difference of images of the laser light spot; and determining the optimal matching point using the reference matching pair.
  • In the inventive structured-light based measuring method, when conducting matching according to the image position of the laser light spot within the first camera and the respective candidates of matching point within the second camera in the matching process, before searching for a reference matching pair according to a luminance difference of images of the laser light spot, the method further includes: conducting a 3D reconstruction of the candidates of matching point, to obtain a depth of each of the candidates of matching point; and conducting an initial selection among the candidates of matching point according to the depths of the candidates of matching point.
  • There is provided a structured-light based measuring system, including a processing system, an imaging system and a projecting system, where the imaging system includes a first camera and a second camera, the projecting system includes a laser generator for generating laser light, and the processing system includes a matching module and a calculating module,
    the matching module is adapted for obtaining a sequence number and a low-precision depth of a laser light spot based on an image position of the laser light spot within a first camera according to a first mapping relationship in a demarcation database, searching for image positions of the laser light spot within a second camera according to the sequence number and the low-precision depth of the laser light spot to obtain candidates of matching point, and conducting matching according to the image position of the laser light spot within the first camera and the respective candidates of matching point within the second camera, to obtain a result of the matching; and
    the calculating module is adapted for obtaining an image position within the second camera that matches with the image position within the first camera according to the result of the matching, and determining a precise position of the laser light spot according to a second mapping relationship in the demarcation database.
  • In the inventive structured-light based measuring system, the demarcation database is obtained by a demarcating module through a demarcating process including: demarcating the first mapping relationship between an image position of each laser light spot within the first camera and the sequence number as well as the low-precision depth of the laser light spot, demarcating the second mapping relationship between an image position of each laser light spot within the second camera and the sequence number as well as the high-precision depth of the laser light spot, and storing the demarcated first and second mapping relationships in a memory to form the demarcation database for the use by the matching process and the calculating process.
  • In the inventive structured-light based measuring system, during the demarcating process by the demarcating module, a position of a laser output port relative to the first camera is adjusted to prevent image positions of any two laser light spots within the first camera from overlapping with each other.
  • In the inventive structured-light based measuring system, the distance between the second camera and the laser output port is larger than the distance between the first camera and the laser output port.
  • In the inventive structured-light based measuring system, functions of the demarcating module and the matching module are implemented in a condition that image positions at different depths of the same laser light spot are surrounded by a geometric region.
  • In the inventive structured-light based measuring system, the precise position of the laser light spot is obtained by an interpolating method applied on the image position in the second camera and the high-precision depth during the calculating process.
  • In the inventive structured-light based measuring system, conducting matching according to the image position of the laser light spot within the first camera and the respective candidates of matching point within the second camera by the matching module includes: searching for a reference matching pair according to a luminance difference of images of the laser light spot; and determining the optimal matching point using the reference matching pair.
  • In the inventive structured-light based measuring system, when conducting matching according to the image position of the laser light spot within the first camera and the respective candidates of matching point within the second camera by the matching module, before searching for a reference matching pair according to a luminance difference of images of the laser light spot, the matching module is further adapted for conducting a 3D reconstruction of the candidates of matching point, to obtain a depth of each of the candidates of matching point, and conducting an initial selection among the candidates of matching point according to the depths of the candidates of matching point.
  • The structured-light based measuring method and the structured-light based measuring system of the present invention are advantageous in that the measurement precision of the system is greatly improved by adding the second camera, for fine measurement, to the existing structured-light measuring system.
  • In the invention, a demarcation database is established through the demarcating process, so that the structured-light measuring process is simplified. The image positions of any two laser light spots within the first camera do not overlap with each other, which ensures the accuracy of the mapping relationship, demarcated within the first camera, between the image position and the depth of each laser light spot formed on the measured object. The distance between the second camera and the laser output port is larger than the distance between the first camera and the laser output port, so that the mapping relationship between the image position and the depth provided by the second camera is more precise than that provided by the first camera. A geometric region is used to surround the image positions of the same laser light spot at various depths in the demarcating process and the matching process, which accelerates the matching. During the calculating process, the precise position of the laser light spot is obtained by an interpolating method applied on the sequence of image positions in the second camera and the depth, so that multiple precise depths of the measured object may be obtained. The matching is conducted according to the image position of the laser light spot within the first camera and the respective candidates of matching point within the second camera during the matching process, so that the matching result may be obtained more easily, simply and quickly.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention is now further described below with reference to the drawings and embodiments, in which,
    • Figure 1 is a schematic structural diagram of a structured-light based measuring system according to a preferred embodiment of the invention.
    • Figure 2 is a schematic diagram showing the images of laser light spots that are formed within a first camera of a structured-light based measuring system according to a preferred embodiment of the invention.
    • Figure 3 is a schematic diagram showing the division of image regions of laser light spots that are formed within the first camera of the structured-light based measuring system according to the preferred embodiment of the invention.
    • Figure 4 is a schematic diagram showing the images of neighboring laser light spots that are formed within a second camera of a structured-light based measuring system according to a preferred embodiment of the invention.
    • Figure 5 is a schematic diagram showing the division of image regions of laser light spots that are formed within the second camera of the structured-light based measuring system according to a preferred embodiment of the invention.
    • Figure 6 is a schematic structural diagram showing a structured-light based measuring system according to a preferred embodiment of the invention.
    • Figure 7 is a schematic structural diagram showing a structured-light based measuring system according to a preferred embodiment of the invention.
    DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
  • Preferred embodiments of the present invention are described in detail below in connection with the attached drawings.
  • As shown in Figure 1, a structured-light based measuring method of the invention includes a matching process and a calculating process, in which the demarcation database used is obtained through a demarcating process.
  • During the demarcating process, a first mapping relationship between an image position of each laser light spot within a first camera 21 and the sequence number as well as a low-precision depth (i.e. the scene depth) of the laser light spot is determined. Particularly, in the case of the first camera 21, it is assumed that the image position of a laser light spot $i$ at a depth $z_j^i$ within the first camera 21 is denoted by $(u_j^i, v_j^i)$, and the image distributions of any two laser light spots are prevented from overlapping with each other by adjusting the position of the laser output port 31 (as shown in Figure 6) with respect to the first camera 21, as shown in the schematic image in Figure 2. Each of the separated, non-overlapping point sets in Figure 2 represents the image distribution of one laser light spot at various depths, and each point in a point set represents the image position of the corresponding laser light spot at one depth. In this case, the data recorded for the laser light spot $i$ may be denoted by $$\{(u_1^i, v_1^i, z_1^i), \ldots, (u_j^i, v_j^i, z_j^i), \ldots, (u_{N_i}^i, v_{N_i}^i, z_{N_i}^i)\},$$ where $N_i$ denotes the number of the demarcated images of the laser light spot $i$ at various depths. In practice, to accelerate the matching, the point set may be surrounded by a regular geometric region such as a rectangle or an ellipse. The data recorded for the laser light spot $i$ is then denoted by $$\mathrm{Param}_i,\ \{(u_1^i, v_1^i, z_1^i), \ldots, (u_j^i, v_j^i, z_j^i), \ldots, (u_{N_i}^i, v_{N_i}^i, z_{N_i}^i)\},$$ where $\mathrm{Param}_i$ denotes a parameter for the region surrounding the point set, which may be the minimum and maximum horizontal and vertical coordinates of the corners for a rectangular region, or the center point and the major and minor axes for an elliptic region. Figure 3 shows a schematic division by rectangular surrounding regions.
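  • To make the record layout concrete, here is a minimal Python sketch of the per-spot demarcation record described above. The class name `SpotRecord` and its fields are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class SpotRecord:
    """Demarcation record for one laser light spot: the (u, v, z)
    samples at each demarcated depth, plus the surrounding-region
    parameter Param_i derived from them."""
    spot_id: int                               # sequence number i
    samples: list[tuple[float, float, float]]  # (u_j, v_j, z_j) per depth

    @property
    def bounding_rect(self) -> tuple[float, float, float, float]:
        """Param_i for a rectangular surrounding region:
        (min_u, max_u, min_v, max_v) over all demarcated samples."""
        us = [u for u, _, _ in self.samples]
        vs = [v for _, v, _ in self.samples]
        return min(us), max(us), min(vs), max(vs)

# Example: spot 7 imaged at three demarcation depths (values invented).
record = SpotRecord(7, [(102.5, 240.1, 1000.0),
                        (104.8, 240.3, 1500.0),
                        (107.2, 240.6, 2000.0)])
print(record.bounding_rect)  # -> (102.5, 107.2, 240.1, 240.6)
```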
  • Further, during the demarcating process, a second mapping relationship between an image position of each laser light spot within a second camera 22 and the sequence number as well as a high-precision depth of the laser light spot is determined. Particularly, in the case of the second camera 22, it is assumed that the image position of a laser light spot $i$ at a depth $z_j^i$ within the second camera 22 is denoted by $(u_j^i, v_j^i)$; in this case, the data recorded for the laser light spot $i$ may be denoted by $$\{(u_1^i, v_1^i, z_1^i), \ldots, (u_j^i, v_j^i, z_j^i), \ldots, (u_{N_i}^i, v_{N_i}^i, z_{N_i}^i)\},$$ where $N_i$ denotes the number of experimental data of the laser light spot $i$. It shall be noted that the image positions of two laser light spots might overlap here, as schematically shown in Figure 4. In practice, to accelerate the matching, the point set may be surrounded by a regular geometric region such as a rectangle or an ellipse, and the data recorded for the laser light spot $i$ is then denoted by $$\mathrm{Param}_i,\ \{(u_1^i, v_1^i, z_1^i), \ldots, (u_j^i, v_j^i, z_j^i), \ldots, (u_{N_i}^i, v_{N_i}^i, z_{N_i}^i)\},$$ where $\mathrm{Param}_i$ denotes a parameter for the region surrounding the point set, which may be the minimum and maximum horizontal and vertical coordinates of the corners for a rectangular region, or the center point and the major and minor axes for an elliptic region. Figure 5 shows schematic rectangular surrounding regions.
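  • A possible layout for the demarcation database that stores both mapping relationships, reusing the illustrative `SpotRecord` class from the previous sketch (the dict structure is an assumption, not mandated by the patent). Note how, in these invented numbers, the same depth step moves the spot image farther in the second camera, reflecting its higher depth precision.

```python
# First mapping: first camera 21 (low-precision depth).
records_camera_a = [SpotRecord(7, [(102.5, 240.1, 1000.0),
                                   (104.8, 240.3, 1500.0)])]
# Second mapping: second camera 22 (high-precision depth); the larger
# camera-to-laser baseline yields a larger image shift per depth step.
records_camera_b = [SpotRecord(7, [(310.2, 241.0, 1000.0),
                                   (326.9, 241.4, 1500.0)])]

demarcation_db = {
    "camera_A": {rec.spot_id: rec for rec in records_camera_a},
    "camera_B": {rec.spot_id: rec for rec in records_camera_b},
}
```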
  • During the matching process, for example, the image position sequence within the first camera 21 is denoted by $\{(u_1^A, v_1^A), \ldots, (u_i^A, v_i^A), \ldots, (u_M^A, v_M^A)\}$ and the image position sequence within the second camera 22 is denoted by $\{(u_1^B, v_1^B), \ldots, (u_j^B, v_j^B), \ldots, (u_N^B, v_N^B)\}$. Any possible point among the image position sequence within the second camera 22 that matches the image position $(u_i^A, v_i^A)$ within the first camera 21 may be determined by the following steps (1), (2) and (3).
  • At step (1), the sequence number and a low-precision depth of the laser light spot are determined according to an image position within the first camera 21 and the recorded table (i.e. the demarcation database) for the first camera 21. Because the image point sets of the various laser light spots do not overlap with each other within the first camera 21, according to the image position $(u_i^A, v_i^A)$ within the first camera 21, the sequence number and a low-precision depth of the laser light spot corresponding to that image position may be determined directly (i.e. based on the first mapping relationship).
  • The determining of the sequence number and the low-precision depth may proceed as follows, depending on the form of the data recorded during the demarcation; a code sketch of this lookup follows the list.
    1. (A) If the point sets are not surrounded by geometric regions, then for an image position $(u_i^A, v_i^A)$ within the first camera 21, the recorded table is searched for the item $\{(u_1^i, v_1^i, z_1^i), \ldots, (u_j^i, v_j^i, z_j^i), \ldots, (u_{N_i}^i, v_{N_i}^i, z_{N_i}^i)\}$ containing the image position $(u_j^i, v_j^i)$ that is most similar to $(u_i^A, v_i^A)$. The similarity may be calculated as the distance $R = \sqrt{(u_j^i - u_i^A)^2 + (v_j^i - v_i^A)^2}$.
    2. (B) If the point sets are surrounded by geometric regions, the recorded table is searched for the item $\mathrm{Param}_i,\ \{(u_1^i, v_1^i, z_1^i), \ldots, (u_{N_i}^i, v_{N_i}^i, z_{N_i}^i)\}$ whose geometric region contains the image position $(u_i^A, v_i^A)$. For example, for a rectangular region with $\mathrm{Param}_i = \{\min\_u^i, \max\_u^i, \min\_v^i, \max\_v^i\}$, the conditions $\min\_u^i \le u_i^A \le \max\_u^i$ and $\min\_v^i \le v_i^A \le \max\_v^i$ shall be satisfied.
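  • A minimal sketch of step (1), assuming the illustrative `SpotRecord` records sketched earlier: it uses the bounding-rectangle test of method (B) to find the spot, then the distance similarity of method (A) to pick the closest demarcated sample and its low-precision depth.

```python
import math

def find_spot(u_a: float, v_a: float, records: list):
    """Step (1): from an image position (u_a, v_a) in the first camera,
    return (sequence number, low-precision depth), or None if the
    position lies outside every demarcated region."""
    for rec in records:
        min_u, max_u, min_v, max_v = rec.bounding_rect
        if min_u <= u_a <= max_u and min_v <= v_a <= max_v:
            # Depth z of the demarcated sample closest to (u_a, v_a),
            # using the distance similarity R from method (A).
            _, _, z = min(rec.samples,
                          key=lambda s: math.hypot(s[0] - u_a, s[1] - v_a))
            return rec.spot_id, z
    return None

print(find_spot(103.0, 240.2, records_camera_a))  # -> (7, 1000.0)
```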
  • At step (2), according to the obtained sequence number of the laser light spot and the sequence number of the image position of the laser light spot within the first camera 21, the image distribution of the laser light spot within the second camera 22 is searched out, and candidates of matching point are obtained based on that image distribution.
  • If the obtained sequence number of the laser light spot and the sequence number of its image position within the first camera 21 are denoted by $Index$ and $T$ respectively, the demarcated point data obtained from the demarcation data of the first camera 21 may be denoted by $(u_T^{Index}, v_T^{Index}, z_T^{Index})$.
  • The obtained sequence of the demarcated image distribution of the laser light spot within the second camera 22 may be denoted by $\mathrm{Param}_{Index},\ \{(u_1^{Index}, v_1^{Index}, z_1^{Index}), \ldots, (u_j^{Index}, v_j^{Index}, z_j^{Index}), \ldots, (u_{N_{Index}}^{Index}, v_{N_{Index}}^{Index}, z_{N_{Index}}^{Index})\}$.
  • Candidates of demarcated point are searched out from the sequence of the demarcated image distribution of the laser light spot $Index$ within the second camera 22 by retaining every demarcated point within a range centered at $z_T^{Index}$, i.e. $z_T^{Index} - d \le z_j^{Index} \le z_T^{Index} + d$, where $d$ denotes a manually defined matching search range.
  • If the number sequence of the satisfying demarcated points within the second camera 22 is denoted by $\{index_1, \ldots, index_P, \ldots, index_C\}$, then for each element of the number sequence, any image point of satisfying similarity is determined among the image points $\{(u_1^B, v_1^B), \ldots, (u_j^B, v_j^B), \ldots, (u_N^B, v_N^B)\}$ in the second camera 22 by method (A) or (B) below, depending on the form of the data recorded during the demarcation; a code sketch of this candidate search follows.
  • (A) In the case that the point sets are not surrounded by geometric regions, for each demarcated image point $(u_{index_P}^{Index}, v_{index_P}^{Index}, z_{index_P}^{Index})$ from the sequence of satisfying demarcated points within the second camera 22, any of the image points $\{(u_1^B, v_1^B), \ldots, (u_N^B, v_N^B)\}$ that has satisfying similarity is determined as a candidate. The similarity may be based on the distance $R = \sqrt{(u_{index_P}^{Index} - u_j^B)^2 + (v_{index_P}^{Index} - v_j^B)^2}$ with $R \le Threshold$, where $Threshold$ is a predefined value (the similarity is satisfying when the distance does not exceed the threshold).
  • (B) In the case that the point sets are surrounded by geometric regions, any of the image points $\{(u_1^B, v_1^B), \ldots, (u_N^B, v_N^B)\}$ that falls within the surrounding region $\mathrm{Param}_{Index}$ is determined as a candidate.
  • For example, in the case of a rectangular surrounding region defined by $\mathrm{Param}_{Index} = \{\min\_u^{Index}, \max\_u^{Index}, \min\_v^{Index}, \max\_v^{Index}\}$, the conditions $\min\_u^{Index} \le u_j^B \le \max\_u^{Index}$ and $\min\_v^{Index} \le v_j^B \le \max\_v^{Index}$ shall be met.
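  • A minimal sketch of step (2), again assuming the illustrative `SpotRecord` layout and `demarcation_db` sketched earlier: it keeps the demarcated samples of the spot whose depth lies within the search range $d$ of the low-precision depth, then applies the rectangular-region test of method (B) to the points detected in the second camera.

```python
def find_candidates(spot_id: int, z_low: float, d: float,
                    records_b: dict, points_b: list) -> list:
    """Step (2): return the candidate matching points in the second
    camera for the spot `spot_id` whose first-camera depth is z_low.
    `points_b` holds the (u, v) image positions detected in the
    second camera's current frame."""
    rec = records_b[spot_id]
    # Keep demarcated samples with z in [z_low - d, z_low + d].
    window = [s for s in rec.samples if z_low - d <= s[2] <= z_low + d]
    if not window:
        return []
    # Method (B): accept detected points inside the surrounding rectangle.
    min_u, max_u, min_v, max_v = rec.bounding_rect
    return [(u, v) for (u, v) in points_b
            if min_u <= u <= max_u and min_v <= v <= max_v]

detected = [(311.0, 241.1), (500.0, 100.0)]
print(find_candidates(7, 1000.0, 300.0, demarcation_db["camera_B"], detected))
# -> [(311.0, 241.1)]
```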
  • At step (3), the matching process is completed based on the known sequence of image positions $\{(u_1^A, v_1^A), \ldots, (u_i^A, v_i^A), \ldots, (u_M^A, v_M^A)\}$ within the first camera 21 and the candidates of matching point therefor within the second camera 22.
  • As described above, the initial measurement of the depths of the laser light spot array imaged within the first camera 21 has been completed, and the candidates of matching point among the point array within the second camera 22 have been found. For a certain point $(u_i^A, v_i^A)$ within the first camera 21, if the depth of that point in the first camera 21 is determined as $z_i^A$ and its candidates of matching point in the second camera 22 are $(u_j^B, v_j^B)$, $(u_k^B, v_k^B)$ and $(u_l^B, v_l^B)$ (three candidates are used here for the description, but the present invention is not limited to this), the object at this stage is to find the best matching point for $(u_i^A, v_i^A)$ among the candidates of matching point, e.g. by steps (A), (B), (C) and (D) below; a brief code sketch of this selection follows the list.
    1. (A) 3D reconstruction for the candidates of matching point is conducted. Here, Binocular Stereo Vision technologies may be used. The 3D reconstruction with a high precision may be completed once the binocular demarcating of the first and second cameras 21 and 22 is conducted. If the candidates of matching point (uB j, vB j ), (uB k, vB k ) and (uB l, vB l ) are respectively paired with the point u i A v i A
      Figure imgb0031
      for the purpose of the 3D reconstruction, 3 depths z j AB ,
      Figure imgb0032
      z k AB ,
      Figure imgb0033
      and z I AB
      Figure imgb0034
      may be obtained.
    2. (B) Selection is made based on the depths. Particularly, an initial selection among the 3 depths may be made using z i A .
      Figure imgb0035
      If a measurement precision of z i A
      Figure imgb0036
      measured by the first camera 21 is d, any candidate of matching point that corresponds to a depth exceeding a range of z i A d 2 , z i A + d 2
      Figure imgb0037
      is discarded. Further, depending on various applications, if the shot scene depth range is limited, an allowable variation range such as a range from 1 to 5 meters may be provided for the reconstructed depth, for the purpose of the selection. For the ease of description, the candidate of matching point (uB l, vB l ) is discarded at this step, for example.
    3. (C) The searching for a reference matching pair is conducted. Particularly, the optimal matching point for the point u i A v i A
      Figure imgb0038
      shall be searched out from the remaining candidates of matching point. For example, if the candidate of matching point (uB l, vB l ) is discarded as above, an optimal matching point shall be searched out from the candidates of matching point of (uB j , vB j ) and (uB k, vB k ), which is relatively difficult because both of the candidates (uB j , vB j ) and (uB k, vB k ), which have a precision better than that of z i A ,
      Figure imgb0039
      satisfy the precision requirement. Therefore, the selection can be made merely based on the image positions of the laser light spot in the camera. In this case, it is necessary to find a reference matching pair from the first and second cameras 21 and 22.
      If the laser output port 31 is arranged between the first and second cameras, the common visual field is relatively large, which is preferable. Typically, a regular laser light spot array is obtained by the effects of both interference and diffraction of a point light source, and is characterized in that the luminance of spots close to the center of the light source is higher due to more interference than spots at the periphery of the light source, as a result, the areas of images of the spots close to the center of the light source are larger than those of the spots at the periphery of the light source. With such a character and the large common imaging visual field of the first and second cameras, a statistical method is used to search for a reliable reference matching pair as described below.
      In the two pictures within the first and second cameras 21 and 22, the images of the light spots are sorted according to their areas, N light spot images with the largest areas are selected, a geometrical center position of the selected light spot images is calculated, and light spot images closest to the calculated geometrical center position are searched out, to obtain two light spot images within the first and second cameras 21 and 22 as the reliable reference matching pair.
      It shall be noted that the above method for selecting reference matching points cannot ensure the reliability in the direction Y of the coordinate system, because of the comparable luminance of light spots along the same vertical line close to the central region. However, the reliability in the direction X of the coordinate system is high due to the relatively large luminance difference in the horizontal direction. Therefore, the X coordinate of the center region with high luminance can be obtained through the statistics above. Fortunately, the epipolar constraint can be used for the direction Y. Further in practice, a relatively regular laser light spot array is provided, and the candidates of matching point selected by the demarcation are typically around the same horizontal level and thus have very close Y coordinates, hence it is not necessary to conduct selection in the direction Y.
4. (D) The optimal matching point is obtained through the reference matching pair. If the obtained reference matching pair is $(x_{\max}^A, y_{\max}^A)$, $(x_{\max}^B, y_{\max}^B)$, only the comparison in the X direction is required, as described above. That is, only $\left|x_{\max}^B - u_j^B\right|$ and $\left|x_{\max}^B - u_k^B\right|$ need to be compared with $\left|x_{\max}^A - u_i^A\right|$. The candidate for which $\left|x_{\max}^B - u_j^B\right|$ or $\left|x_{\max}^B - u_k^B\right|$ is closer to $\left|x_{\max}^A - u_i^A\right|$ is the one whose X-direction distance to the reference matching point in the second camera 22 is closest to the distance from the point $(u_i^A, v_i^A)$ to the same reference matching point in the first camera 21, and it is therefore selected as the optimal matching point, assumed here to be $(u_j^B, v_j^B)$. As such, the matching between the image position in the first camera 21 and the image position in the second camera 22 is finished, and $z_j^{AB}$ represents the depth of the point $(u_i^A, v_i^A)$ in the first camera 21 with the higher precision. A sketch of this selection rule is given after this step.
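The X-direction comparison in step (D) can be expressed as a one-line selection rule. This is a minimal sketch assuming the candidate list and the reference coordinates are already available; all names are placeholders.

    def select_optimal_match(u_i_a, x_max_a, x_max_b, candidates_b):
        """Among candidates_b (pairs (u_B, v_B) in the second camera),
        pick the one whose X-distance to the reference matching point
        is closest to the X-distance measured in the first camera."""
        d_a = abs(x_max_a - u_i_a)       # reference distance, first camera
        return min(candidates_b,
                   key=lambda c: abs(abs(x_max_b - c[0]) - d_a))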
  • Steps (A) and (B) above relate to coarse selection and may optionally be performed as desired.
  • In addition, the following general assignment algorithm may be used for implementing the matching between the sequence $(u_1^A, v_1^A), \ldots, (u_i^A, v_i^A), \ldots, (u_M^A, v_M^A)$ of known image positions in the first camera 21 and the respective candidates of matching point in the second camera 22.
  • If $\{p_1, \cdots, p_i, \cdots, p_M\}$ and $\{q_1, \cdots, q_j, \cdots, q_N\}$ denote two 2D labeling sequences, a corresponding similarity matrix $C = \{c_{i,j}\}$ may be calculated, where $c_{i,j}$ represents the similarity of $p_i$ and $q_j$. The total similarity $F$ may be defined as
    $$F = \sum_{i=1}^{M} \sum_{j=1}^{N} c_{ij} x_{ij},$$
    subject to $\sum_{i=1}^{M} x_{ij} = 1$ for $j = 1, 2, \cdots, N$ and $\sum_{j=1}^{N} x_{ij} = 1$ for $i = 1, 2, \cdots, M$, where each $x_{ij}$ is equal to 0 or 1, so that $(x_{ij})$ is an $M \times N$ matrix.
  • The maximum of the total similarity $F$ may be obtained by varying $x_{ij}$. If $x_{ij} = 1$, it may be determined that $p_i$ and $q_j$ form a matching pair; otherwise, $p_i$ and $q_j$ do not form a matching pair.
  • The matching process above involves a relatively large amount of calculation, owing to the assumption that any $p_i$ and $q_j$ might form a matching pair; this is a Non-deterministic Polynomial (N-P) problem. Many optimized algorithms have been proposed for the N-P problem, but their calculation amount still increases dramatically with the size of the similarity matrix $C$.
  • If the similarity matrix $C$ is characterized by
    $$C = \begin{pmatrix} A & 0 \\ 0 & B \end{pmatrix},$$
    the N-P problem may be simplified by applying the above calculation to the matrix $A$ and the matrix $B$ separately, so that the calculation amount involved is decreased greatly. Therefore, the form $C = \begin{pmatrix} A & 0 \\ 0 & B \end{pmatrix}$ is preferably achieved in practice through various constraint conditions, such as the epipolar constraint of a stereo vision system and image similarity.
  • The similarity matrix $C$ is calculated in terms of the similarity of properties such as the image area, the image major axis, and the image minor axis; information such as the epipolar constraint of a stereo vision system, which is known to those skilled in the art, may also be taken into consideration.
  • After obtaining the matrix $C$, if $C = \begin{pmatrix} A & 0 \\ 0 & B \end{pmatrix}$, the matrices $A$ and $B$ are processed through known methods such as the Hungarian algorithm and the Branch-and-Bound method; a sketch using a standard solver is given below.
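As a sketch of the simplified calculation, each diagonal block can be handed to a standard Hungarian-style solver; SciPy's linear_sum_assignment is one such implementation, and with maximize=True it maximizes the total similarity directly. The block list with its row and column offsets is an assumed representation of how constraints such as the epipolar constraint have partitioned C, not something prescribed by the method.

    from scipy.optimize import linear_sum_assignment

    def match_block(c_block):
        """Solve max F = sum c[i, j] * x[i, j] for one similarity block,
        each row and column being used at most once."""
        rows, cols = linear_sum_assignment(c_block, maximize=True)
        return list(zip(rows, cols))

    def match_block_diagonal(blocks):
        """blocks: iterable of (row_offset, col_offset, sub_matrix);
        returns matched (i, j) index pairs in the full matrix C."""
        pairs = []
        for r0, c0, sub in blocks:
            pairs.extend((r0 + i, c0 + j) for i, j in match_block(sub))
        return pairs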
  • During the calculating process, the precise position of the laser light spot is determined according to the result of the matching.
  • Based on the obtained sequence number and low-precision depth of the laser light spot, the image position of the light spot within the second camera 22 is obtained, and a precise position of the light spot may then be obtained through an interpolating method, according to the second mapping relationship between the image position of a light spot within the second camera 22 and the sequence number as well as the high-precision depth of the light spot; a sketch of such an interpolation is given below.
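One simple realisation of the interpolating method, assuming the demarcation database stores, for each sequence number, sampled image positions within the second camera together with the corresponding high-precision depths, is linear interpolation between the stored samples; the helper below is hypothetical, not the prescribed implementation.

    import numpy as np

    def precise_depth(u_b, demarcated_u, demarcated_z):
        """Linearly interpolate the high-precision depth of a laser light
        spot from the second mapping relationship: demarcated image
        positions (demarcated_u) and depths (demarcated_z) stored for the
        spot's sequence number within the second camera."""
        u = np.asarray(demarcated_u, dtype=float)
        z = np.asarray(demarcated_z, dtype=float)
        order = np.argsort(u)            # np.interp needs increasing xs
        return float(np.interp(u_b, u[order], z[order]))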
  • As shown in Figures 6 and 7, the distance between the second camera 22 and the laser output port 31 is larger than that between the first camera 21 and the laser output port 31. In the arrangement of Figure 6, the laser output port 31 is located at a side of the first camera 21 opposite to the second camera 22; in the arrangement of Figure 7, the laser output port 31 is arranged between the first and second cameras 21 and 22. In either case, the mapping relationship between the image position and the depth provided by the second camera 22 is more precise than that provided by the first camera 21, and the precision of the second camera 22 may be adjusted as actually desired. The position of the laser output port 31 may be adjusted as desired, as long as the distance between the second camera 22 and the laser output port 31 remains larger than the distance between the first camera 21 and the laser output port 31.
  • The embodiments of the present invention are described above in conjunction with the drawings. However, the present invention is not limited to the above detailed embodiments, since the full scope of protection of the present invention is defined only by the appended claims.

Claims (10)

  1. A structured-light based measuring method, comprising:
    a demarcating process which includes, demarcating a first mapping relationship between an image position of each laser light spot numbered with sequence numbers within a first camera (21) and the sequence number as well as a low-precision depth of the laser light spot, demarcating a second mapping relationship between an image position of each laser light spot within a second camera (22) and the sequence number as well as a high-precision depth of the laser light spot, and storing the demarcated first and second mapping relationships in a memory to form a demarcation database for the use by a matching process and a calculating process;
    the matching process, which includes obtaining the sequence number and the low-precision depth of the laser light spot based on an image position of the laser light spot within the first camera (21) according to the first mapping relationship in the demarcation database, searching for image positions of the laser light spot within the second camera (22) according to the sequence number and the low-precision depth of the laser light spot to obtain candidates of matching point, and conducting matching according to the image position of the laser light spot within the first camera (21) and the respective candidates of matching point within the second camera (22), to obtain a result of the matching; and
    the calculating process, which includes obtaining an image position within the second camera (22) that matches with the image position within the first camera (21) according to the result of the matching, and determining a precise position of the laser light spot according to the second mapping relationship in the demarcation database;
    wherein conducting matching according to the image position of the laser light spot within the first camera (21) and the respective candidates of matching point within the second camera (22) during the matching process comprises:
    searching for a reference matching pair according to a luminance difference of images of the laser light spot; and
    determining the optimal matching point using the reference matching pair;
    wherein during the conducting matching according to the image position of the laser light spot within the first camera (21) and the respective candidates of matching point within the second camera (22) in the matching process, before searching for a reference matching pair according to a luminance difference of images of the laser light spot, the method further comprises:
    conducting a 3D reconstruction of the candidates of matching point, to obtain a depth of each of the candidates of matching point; and
    conducting initial selection among the candidates of matching point according to the depths of the candidates of matching point.
  2. The method of claim 1, wherein during the demarcating process, a position of a laser output port (31) relative to the first camera (21) is adjusted to prevent image positions of any two laser light spots within the first camera (21) from overlapping with each other.
  3. The method of claim 2, wherein a distance between the second camera (22) and the laser output port (31) is larger than a distance between the first camera (21) and the laser output port (31).
  4. The method of one of the preceding claims, wherein the demarcating process and the matching process are performed in a condition that image positions at different depths of the same laser light spot are surrounded by a geometric region.
  5. The method of one of the preceding claims, wherein the precise position of the laser light spot is obtained by an interpolating method applied on the image position in the second camera (22) and the high-precision depth during the calculating process.
  6. A structured-light based measuring system, comprising a processing system (1), an imaging system (2) and a projecting system (3), wherein the imaging system (2) comprises a first camera (21) and a second camera (22), the projecting system (3) comprises a laser generator for generating laser light, and the processing system (1) comprises a demarcating module, a matching module and a calculating module,
    the demarcating module is adapted for demarcating a first mapping relationship between an image position of each laser light spot numbered with sequence numbers within the first camera (21) and the sequence number as well as a low-precision depth of the laser light spot, demarcating a second mapping relationship between an image position of each laser light spot within the second camera (22) and the sequence number as well as a high-precision depth of the laser light spot, and storing the demarcated first and second mapping relationships in a memory to form the demarcation database for the use by the matching module and the calculating module;
    the matching module is adapted for obtaining the sequence number and the low-precision depth of the laser light spot based on an image position of the laser light spot within the first camera (21) according to the first mapping relationship in the demarcation database, searching for image positions of the laser light spot within the second camera (22) according to the sequence number and the low-precision depth of the laser light spot to obtain candidates of matching point, and conducting matching according to the image position of the laser light spot within the first camera (21) and the respective candidates of matching point within the second camera (22), to obtain a result of the matching, wherein conducting matching according to the image position of the laser light spot within the first camera (21) and the respective candidates of matching point within the second camera (22) by the matching module comprises:
    searching for a reference matching pair according to a luminance difference of images of the laser light spot and determining the optimal matching point using the reference matching pair; and
    the calculating module is adapted for obtaining an image position within the second camera (22) that matches with the image position within the first camera (21) according to the result of the matching, and determining a precise position of the laser light spot according to the second mapping relationship in the demarcation database;
    whereby the matching module is further adapted for conducting a 3D reconstruction of the candidates of matching point, to obtain a depth of each of the candidates of matching point, and for conducting initial selection among the candidates of matching point according to the depths of the candidates of matching point during the conducting matching according to the image position of the laser light spot within the first camera (21) and the respective candidates of matching point within the second camera (22) by the matching module, before searching for a reference matching pair according to a luminance difference of images of the laser light spot.
  7. The system of claim 6, wherein the calculating module is adapted for obtaining the precise position of the laser light spot by an interpolating method applied on the image position in the second camera (22) and the high-precision depth during a calculating process.
  8. The system of claim 6, wherein the system is adapted for adjusting, during the demarcating process by the demarcating module, a position of a laser output port (31) relative to the first camera (21) to prevent image positions of any two laser light spots within the first camera (21) from overlapping with each other.
  9. The system of claim 8, wherein a distance between the second camera (22) and the laser output port (31) is larger than a distance between the first camera (21) and the laser output port (31).
  10. The system of claim 8 or 9, wherein functions of the demarcating module and the matching module are implemented in a condition that image positions at different depths of the same laser light spot are surrounded by a geometric region.
