WO2012013486A1 - A method and a system for calibrating a multi-view three dimensional camera - Google Patents

A method and a system for calibrating a multi-view three dimensional camera

Info

Publication number
WO2012013486A1
WO2012013486A1 (PCT/EP2011/061819)
Authority
WO
WIPO (PCT)
Prior art keywords
depth map
camera
hollow body
inner shape
correction parameters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2011/061819
Other languages
French (fr)
Inventor
Varun Akur Venkatesan
Antony Louis Piriyakumar Douglas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Siemens Corp
Original Assignee
Siemens AG
Siemens Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens AG, Siemens Corp filed Critical Siemens AG
Publication of WO2012013486A1 publication Critical patent/WO2012013486A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/8943D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/246Calibration of cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/271Image signal generators wherein the generated image signals comprise depth maps or disparity maps

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A method for calibrating a multi-view three dimensional camera (4) comprising a time-of-flight camera (4) for generating a three dimensional depth map (6) through a 360 degree lens system (8). The method includes placing the camera (4) relative to a surrounding hollow body (10) of a known inner shape (12), generating the depth map (6) of the known inner shape (12) of the hollow body (10), and comparing the depth map (6) with the known inner shape (12) to determine correction parameters (14) of the depth map (6).

Description

A method and a system for calibrating a multi-view three dimensional camera
This invention relates to a method and a system for calibrating a multi-view three dimensional camera including a time-of-flight camera.

Cameras based on the time-of-flight principle use an electromagnetic pulse to generate a depth map of a surface of an object in a scene when the electromagnetic pulse is reflected by that surface. A lens of the camera gathers the reflected electromagnetic pulses from the various surfaces of the object in the scene and generates a depth map of the whole scene. Depending on the distance travelled by each pulse, the electromagnetic pulse experiences a delay when it is captured by the lens of the camera. A 360 degree lens system combines refraction via lenses and reflection via curved mirrors to gather the reflected electromagnetic pulses from all angles of the scene. To generate a multi-view depth map of the scene representing views from various angles, a multi-view three dimensional camera based on the time-of-flight principle is used. The multi-view three dimensional camera uses the 360 degree lens system to gather the reflected electromagnetic pulses from various angles of the scene to generate the depth map. However, the 360 degree lens system tends to distort the path length of the reflected electromagnetic pulse as the pulse passes through it. Thus the depth map generated by the time-of-flight camera using the 360 degree lens system gets distorted.
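As a purely illustrative aside (not from the patent text), the round-trip delay translates into distance as d = c·Δt/2; the delay value below is an assumed example:

```python
# Illustrative only: converting a round-trip time-of-flight delay into a
# distance, d = c * dt / 2. The delay value is an assumed example.
C = 299_792_458.0      # speed of light in m/s
dt = 13.34e-9          # assumed round-trip delay in seconds
print(C * dt / 2.0)    # ~2.0 metres to the reflecting surface
```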
It is an object of the present invention to calibrate a multi-view three dimensional camera based on the time-of-flight principle so as to compensate for the distortion of electromagnetic pulse path lengths due to the 360 degree lens system. The object of the invention is achieved by a method according to claim 1 and a system according to claim 10.
The underlying idea of the invention is to generate, using a 360 degree lens system, a depth map representing a multi-view of a hollow body with a known inner shape, and to compare this depth map with the known inner shape to determine correction parameters for a set of undetermined parameters of the depth map that compensate the distortion of the electromagnetic pulse path length caused by the 360 degree lens system. This allows the multi-view three dimensional camera to be calibrated easily.

According to an exemplary embodiment, the correction parameters are determined such that the depth map corrected with the correction parameters corresponds to the known inner shape, so that the correction parameters follow directly from the comparison of the depth map with the known inner shape. Such correction parameters calibrate the multi-view three dimensional camera more precisely.

According to one embodiment, the correction parameters compensate a difference offset of a path length of an electromagnetic pulse for each pixel of the depth map caused by generating the depth map through the 360 degree lens system. This corrects the path length of a pulse that has been shortened or lengthened by the 360 degree lens system, so that the pulse appears non-distorted. The depth map generated from the electromagnetic pulse after correction will then correspond to the known inner shape.
According to another embodiment, the method includes determining the correction parameters for every pixel of the depth map, so that correction parameters are available for the entire depth map and the entire depth map can be corrected at once, yielding a corrected depth map that corresponds to the inner shape of the hollow body.

According to yet another embodiment, the correction parameters are determined by matching the depth map to the known inner shape with a matching algorithm. Using a matching algorithm provides a simple way of obtaining the correction parameters.

According to an exemplary embodiment, the matching algorithm is based on a linear shift or a spline transformation of the depth map, or a combination thereof. Such matching algorithms are easy to implement, as the concepts of linear shift and spline transformation are generally known.

According to one embodiment, the inner shape of the hollow body is a cylinder or a hemisphere. This makes the method easy to implement and use, as hollow bodies with such inner shapes are readily available.

According to another embodiment, the hollow body is a room of known shape, so that the room itself serves as the hollow body for calibrating the camera. This removes the need for a separate hollow body and also provides a solution for calibration when the separate hollow body normally used for calibrating the camera is lost.

According to another embodiment, the camera is placed in a predetermined position relative to the hollow body. This helps to determine the correction parameters easily and quickly, as the time taken by the processor to calculate the position of the camera with respect to the surrounding body is saved.

According to yet another embodiment, the inner shape of the hollow body has a symmetry axis and the hollow body is placed coaxially with an optical axis of the camera. As the symmetry axis is easy to locate, a user can easily place the camera relative to the surrounding hollow body.

According to an exemplary embodiment, the system includes a holder for receiving the hollow body, so that the hollow body can be supported during the calibration of the camera.
FIG 1 shows a schematic diagram of a system for calibrating a multi-view three dimensional camera.
FIG 2 shows a flowchart for a matching algorithm used for determining the correction parameters by matching the depth map to the known inner shape.
A camera includes an electromagnetic pulse source which emits an electromagnetic pulse that is captured back by the camera after being reflected by a surface of a known inner shape, so as to generate a multi-view depth map of the known inner shape representing views from various angles of that shape. But, as the electromagnetic pulse passes through the 360 degree lens system, the path length of the electromagnetic pulse gets distorted, resulting in a change in the path length. To calibrate the camera so that the generated depth map corresponds to the known inner shape once the path-length distortion is compensated, a system is illustrated in FIG 1.

In reference to FIG 1, a system 2 is exemplified showing a multi-view three dimensional camera 4 with a 360 degree lens system 8 for generating a multi-view depth map 6, a surrounding hollow body 10 of known inner shape 12 surrounding the camera 4, wherein the camera 4 generates the multi-view depth map of the inner shape 12, and a processor 22 receiving the depth map from the camera 4 to calibrate the camera 4 by determining correction parameters 14 for a set of undetermined parameters 26 for each pixel 16 of the depth map 6 such that the depth map 6, on being corrected using the correction parameters 14, corresponds to the known inner shape 12.

The undetermined parameters 26 are based on various physical properties of the camera 4, the hollow body 10, the known inner shape 12 of the hollow body 10, the interrelation between the hollow body 10 and the camera 4, etc., or a combination thereof.

The camera 4 also includes a source of electromagnetic pulses which emits an electromagnetic pulse such as a light pulse, a laser pulse or any such pulse having electromagnetic properties, and a shutter to cut off the electromagnetic pulse when the pulse returns after being reflected by the surface of the known inner shape 12 through the 360 degree lens system 8. The 360 degree lens system 8 distorts the path length of the electromagnetic pulse by elongating or shortening the path length or by changing the geometry of the path of the electromagnetic pulse, so the depth map 6 generated by the electromagnetic pulse does not map onto the known inner shape 12. The processor 22 takes the depth map 6 as an input, processes the depth map 6 and determines the correction parameters 14. The depth map 6, on correction using the correction parameters 14, represents the known inner shape 12.
The correction parameters 14, when applied to the depth map 6, compensate a difference offset of the path length of the electromagnetic pulse for each pixel 16 of the depth map 6, wherein the difference offset is defined as the difference between the path lengths of the electromagnetic pulse with and without the 360 degree lens system 8. The correction parameters 14 compensate this offset for each pixel 16 by compensating the intensity of each pixel 16 of the depth map 6. Alternatively, the correction parameters 14 can compensate the offset by compensating the frequency or the wavelength, or a combination of intensity, wavelength and frequency, of each pixel 16 of the depth map 6.

The depth map 6 comprises various pixels 16, and each pixel 16 is generated by a different electromagnetic pulse. In general, the electromagnetic pulse source emits electromagnetic pulses in different directions to reach different parts of the known inner shape 12. Each part of the inner shape 12 lies at a different distance, so each electromagnetic pulse travels a different distance and generates its pixel 16 on the depth map 6 on the basis of that distance. When passing through the 360 degree lens system 8, each electromagnetic pulse gets distorted differently, as the pulses arrive from different directions after travelling different distances. The correction parameters are therefore determined for each of the pixels 16 on the depth map 6, since the electromagnetic pulses corresponding to the individual pixels 16 have been distorted differently by the 360 degree lens system 8. Alternatively, the correction parameters 14 can be determined as a function that represents the correction parameters 14 for each of the pixels 16 generated from the differently distorted electromagnetic pulses. The function for the correction parameters 14 can be an algebraic function, a vector function, a trigonometric function or any such mathematical function which can represent all the correction parameters 14 for each of the pixels 16.
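A hedged sketch of representing the correction parameters as a function of pixel position, rather than as one value per pixel, could look as follows; the radial polynomial model and all names are illustrative assumptions, not the patent's prescribed function:

```python
# Sketch: representing the correction parameters as a smooth function of pixel
# position instead of one value per pixel. The radial polynomial
# c0 + c1*r + c2*r**2 and all names are illustrative assumptions.
import numpy as np

def radial_offset_map(coeffs, height, width):
    """Evaluate the radial offset model at every pixel of a height x width map."""
    cy, cx = (height - 1) / 2.0, (width - 1) / 2.0
    y, x = np.mgrid[0:height, 0:width]
    r = np.hypot(y - cy, x - cx)              # pixel distance from the image centre
    c0, c1, c2 = coeffs
    return c0 + c1 * r + c2 * r ** 2          # one offset per pixel

# Usage: corrected = depth_map - radial_offset_map(coeffs, *depth_map.shape)
```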
The correction parameters 14 are determined by matching the depth map 6 to the known inner shape 12 with a matching algorithm. The matching algorithm takes the depth map 6 and spatial coordinates referring to the known inner shape 12 as inputs and matches those spatial coordinates with the corresponding pixels 16 in the depth map 6. On the basis of the match, the processor 22 determines the correction parameters 14. The matching algorithm can be based on a linear shift or a spline transformation of the depth map 6, on any such spatial-transform-based algorithm which determines the correction parameters 14, on any other matching algorithm that determines the correction parameters 14, or on combinations thereof.
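One way such a matching step could be sketched, assuming a simple "linear shift" style model in which the offset varies smoothly with pixel radius and is fitted by least squares against reference depths predicted from the known inner shape; function and parameter names are illustrative assumptions:

```python
# Sketch of a linear-shift style matching step, under the assumption that the
# per-pixel offset varies smoothly with pixel radius and can be fitted by
# least squares against reference depths from the known inner shape.
import numpy as np

def fit_radial_correction(measured, reference, degree=2):
    """Fit offset(r) as a polynomial in pixel radius r (least squares)."""
    h, w = measured.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    y, x = np.mgrid[0:h, 0:w]
    r = np.hypot(y - cy, x - cx).ravel()
    residual = (measured - reference).ravel()   # lens-induced distortion per pixel
    return np.polyfit(r, residual, degree)      # correction parameters

# Usage: coeffs = fit_radial_correction(measured, reference)
#        corrected = measured - np.polyval(coeffs, r).reshape(measured.shape)
```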
The known inner shape 12 of the hollow body 10 is a cylinder. The inner shape 12 is known through the dimensions of the cylinder 10, i.e., a length and a radius of the cylinder 10. The cylinder 10 has a regular shape, which makes the calculations performed by the processor 22 while determining the correction parameters 14 fast. Once the dimensions of the cylinder 10 are known, it becomes easy for the processor 22 to calculate the orientation and location of the camera 4 within the cylinder 10. The location and orientation of the camera 4 can be calculated by measuring the spatial distances travelled by the electromagnetic pulse and using these spatial distances together with the dimensions of the cylinder 10. Furthermore, from the dimensions of the cylinder 10, the location and orientation of the camera 4 within the cylinder 10 and the spatial distance travelled by the electromagnetic pulse, the processor 22 calculates the correction parameters 14.

Alternatively, the known inner shape 12 can be a hemisphere. The shape of the hemisphere is known through a single dimension, i.e., the radius of the hemisphere. The processor 22 takes the radius into consideration to determine the correction parameters 14, using the spatial distance travelled by the electromagnetic pulse together with the radius of the hemisphere. In an intermediate step, the orientation and location of the camera 4 within the hemisphere can also be determined using the spatial distance travelled by the electromagnetic pulse and the radius of the hemisphere. Alternatively, the known inner shape 12 can be a cuboid, a cube, a trapezium or any other known shape for which the dimensions of the inner shape 12 are known, and the correction parameters 14 can be determined by the processor 22 using the dimensions of the inner shape 12 and the spatial distance travelled by the electromagnetic pulse to the various parts of the inner shape 12 of the hollow body 10.

In an alternate embodiment, the known inner shape 12 can be a room of known dimensions and the camera 4 can be placed inside the room to determine the correction parameters 14. The dimensions of the room can be made available by an architectural map of the room, and the data relating to the dimensions of the room can be fed into the processor 22. When the electromagnetic pulse returns after being reflected from various parts of the inner shape 12 of the room, the processor 22 determines the correction parameters 14 taking the dimensions of the room into consideration. In an intermediate step, the orientation and location of the camera 4 within the room can also be determined using the spatial distance travelled by the electromagnetic pulse to various parts of the room and the dimensions of the room.
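For a cylinder the reference depths are particularly easy to compute. The following hedged sketch assumes the camera sits at the mid-point of the cylinder axis and gives the expected time-of-flight range along an arbitrary ray direction; the coordinate convention and names are assumptions for illustration only:

```python
# Geometric sketch: expected time-of-flight range to the inner wall of a hollow
# cylinder, assuming the camera sits at the mid-point of the cylinder axis
# (taken as the z axis). Names are illustrative assumptions.
import numpy as np

def expected_range_cylinder(direction, radius, length):
    """Distance from the cylinder centre to the inner surface along a unit ray."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    dx, dy, dz = d
    hits = []
    s = np.hypot(dx, dy)
    if s > 1e-12:                          # curved side wall at radial distance `radius`
        hits.append(radius / s)
    if abs(dz) > 1e-12:                    # flat end cap at z = +/- length / 2
        hits.append((length / 2.0) / abs(dz))
    return min(hits)                       # first surface the ray actually meets

# A ray perpendicular to the axis simply returns the cylinder radius:
print(expected_range_cylinder([1.0, 0.0, 0.0], radius=0.5, length=2.0))  # 0.5
```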
The camera 4 is placed in a predetermined position relative to the hollow body 10. If the predetermined position is known, the determination of the correction parameters 14 by the processor 22 becomes even faster, because the orientation and location of the camera 4 do not have to be determined; they are already available from the knowledge of the predetermined position. Alternatively, the camera 4 need not be placed in a predetermined position; rather, it can be placed arbitrarily with respect to the surrounding hollow body 10, and the correction parameters 14 are then calculated by the processor 22 using the matching algorithm.

The hollow body 10 has a symmetry axis 18 and the camera 4 is placed inside the hollow body 10 so that the optical axis 20 of the camera 4 and the symmetry axis 18 are coaxial. Placing the camera 4 in this way helps to bring the camera 4 into the predetermined position, because when the camera 4 is placed coaxially to the hollow body 10 its orientation and location are easily known due to the symmetry of the hollow body 10. Alternatively, the hollow body 10 need not have a symmetry axis 18, and the camera 4 is placed in a predetermined position with respect to the hollow body 10 in such a way that it is surrounded by the hollow body 10.
The hollow body 10 and the camera 4 are both movable, so that a desired position of the camera 4 and the hollow body 10 relative to each other can be attained. In an alternate embodiment, either the camera 4 or the hollow body 10 is movable to attain the desired relative position of the camera 4 and the hollow body 10.

The hollow body 10 is placed on a holder 24, so that the hollow body 10 can be easily placed and retained in a position relative to the camera 4. Alternatively, the holder 24 can be used to hold the camera 4 when the camera 4 is kept fixed and the hollow body 10 is movable. Yet alternatively, the holder 24 can be provided to keep both the hollow body 10 and the camera 4 in the desired position relative to each other.

The holder 24 provides the flexibility to move the hollow body 10 rotationally and translationally in three dimensional space. While the hollow body 10 is being moved, once it has attained the desired position with respect to the camera 4, its movement is locked using a locking mechanism. Alternatively, the holder 24 can provide resistive movement of the hollow body 10 by means of a resistive movement mechanism, so that the hollow body 10 can be moved into the desired position easily, quickly and with greater precision.

The 360 degree lens system 8 includes a combination of lenses and curved mirrors to provide a multi-view depth map 6 of the known inner shape 12, so that a sectional view of a part of the known inner shape 12 is generated. In an alternate embodiment, the 360 degree lens system 8 generates a 360 degree view depth map 6 of the known inner shape 12 representing a complete view of each part of the known inner shape 12.

The processor 22 receives the multi-view depth map 6 generated by the camera 4 and compares the depth map 6 with the known inner shape 12 to determine the correction parameters 14. While generating the correction parameters 14, the processor 22 uses the matching algorithm. The processor 22 can be a general purpose computer such as the central processing unit of a personal computer, or any calculating device able to perform arithmetic and logical functions, adapted to generate the correction parameters 14 using the depth map 6.
FIG 2 illustrates a matching algorithm used for determining the correction parameters by matching a depth map to a known inner shape while calibrating a multi-view three dimensional camera.

According to the matching algorithm, the intensities of pixels on the depth map are compensated using a linear shift, a spline transform or any other such transform, on the basis of the path length of the electromagnetic pulse.

The matching algorithm comprises the following steps. In step 102, a set of undetermined parameters based on various physical properties of the camera, the hollow body, the known inner shape of the hollow body, the interrelation between the hollow body and the camera, etc., or a combination thereof, is chosen for each of the pixels on the depth map of the known inner shape of the hollow body. In step 104, the intensities of the pixels are transformed radially for the set of undetermined parameters. A transformed depth map is produced in step 106 using the intensities transformed in step 104. In step 108, the depth maps of the known inner shape of the hollow body are compared at various relative positions of the camera with respect to the hollow body to obtain the best approximation of the position of the hollow body with respect to the camera and to identify the best match of the transformed depth map and the known inner shape of the hollow body, so that the transformed depth map exactly fits the known inner shape. In step 110, the undetermined parameters are changed if the match between the depth map and the known inner shape of the hollow body is not appropriate, and steps 104 to 110 are iterated until an appropriate match is found. In step 112, on finding the appropriate match, the undetermined parameters are saved as correction parameters used to transform the intensities of the pixels on the depth map to obtain an accurate depth map.
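A minimal sketch of this iterative loop (steps 102 to 112) is given below. It assumes that the correction is modelled as a radial polynomial, that an expected depth map of the known inner shape can be rendered for each candidate camera pose, and that scipy is available; none of the function or parameter names come from the patent:

```python
# Minimal sketch of the iterative matching loop of FIG 2 (steps 102-112),
# under the assumptions stated above. All names are illustrative.
import numpy as np
from scipy.optimize import least_squares

def calibrate(measured, render_expected, candidate_poses, degree=2):
    """Return (best_pose, correction_coefficients) for a distorted depth map."""
    h, w = measured.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    y, x = np.mgrid[0:h, 0:w]
    r = np.hypot(y - cy, x - cx).ravel()        # pixel radius from the optical centre

    def residuals(coeffs, expected):
        # Steps 104-106: radially transform the measured map with the current
        # undetermined parameters, then compare with the expected map (step 108).
        corrected = measured.ravel() - np.polyval(coeffs, r)
        return corrected - expected.ravel()

    best = None
    for pose in candidate_poses:                # step 108: try relative positions
        expected = render_expected(pose)
        start = np.zeros(degree + 1)            # step 102: initial undetermined parameters
        fit = least_squares(residuals, x0=start, args=(expected,))
        if best is None or fit.cost < best[0]:  # steps 110-112: keep the best match
            best = (fit.cost, pose, fit.x)
    return best[1], best[2]
```

In this sketch the returned coefficients play the role of the saved correction parameters: they are determined once against the known inner shape and can then be applied to every depth map the camera subsequently produces.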

Claims

1. A method for calibrating a multi-view three dimensional camera (4), comprising a time-of-flight camera (4) for generating a three dimensional depth map (6) through a 360 degree lens system (8), wherein the method comprises:
- placing the camera (4) relative to a surrounding hollow body (10) of a known inner shape (12),
- generating the depth map (6) of the known inner shape (12) of the hollow body (10),
- comparing the depth map (6) with the known inner shape (12) to determine correction parameters (14) for a set of undetermined parameters (26) of the depth map (6).
2. The method according to claim 1, wherein the correction parameters (14) are determined such that the depth map (6) corrected with the correction parameters (14) corresponds to the known inner shape (12).
3. The method according to any of the claims 1 or 2, wherein the correction parameters (14) compensate an offset of a path length of an electromagnetic pulse for each pixel of the depth map due to generating the depth map through the 360 degree lens system (8).
4. The method according to claim 3, wherein the correction parameters (14) compensate the offset of the path length of the electromagnetic pulse for each pixel of the depth map by compensating the intensity of each pixel of the depth map.
5. The method according to any of the claims from 1 to 4, wherein the correction parameters (14) are determined for every pixel (16) of the depth map (6).
6. The method according to any of the claims from 1 to 5, wherein the correction parameters (14) are determined by matching the depth map (6) to the known inner shape (12) by a matching algorithm.
7. The method according to claim 6, wherein the matching algorithm is based on linear shift or spline transformation or a combination thereof.
8. The method according to any of the claims from 1 to 7, wherein the inner shape (12) of the hollow body (10) is a cylinder or a hemisphere.
9. The method according to any of the claims from 1 to 8, wherein the hollow body (10) is a room of known shape.
10. The method according to any of the claims from 1 to 9, wherein the camera (4) is placed in a predetermined position relative to the hollow body (10).
11. The method according to claim 10, wherein the inner shape of the hollow body (10) has a symmetry axis (18) and the hollow body (10) is placed coaxially to an optical axis (20) of the camera (4).
12. A system (2) comprising:
- a multi-view three dimensional camera (4) for generating the three dimensional depth map (6) through the 360 degree lens system (8),
- a processor (22) adapted to calibrate the camera (4) according to the method of any of the claims from 1 to 11.
13. The system (2) according to claim 12, comprising: - a holder (24) for receiving the hollow body (10) in a predetermined position relative to the camera (4).
PCT/EP2011/061819 2010-07-27 2011-07-12 A method and a system for calibrating a multi-view three dimensional camera Ceased WO2012013486A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN828KO2010 2010-07-27
IN828/KOL/2010 2010-07-27

Publications (1)

Publication Number Publication Date
WO2012013486A1 true WO2012013486A1 (en) 2012-02-02

Family

ID=44318180

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2011/061819 Ceased WO2012013486A1 (en) 2010-07-27 2011-07-12 A method and a system for calibrating a multi-view three dimensional camera

Country Status (1)

Country Link
WO (1) WO2012013486A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070061040A1 (en) * 2005-09-02 2007-03-15 Home Robots, Inc. Multi-function robotic device
EP2073035A1 (en) * 2007-12-18 2009-06-24 IEE INTERNATIONAL ELECTRONICS & ENGINEERING S.A. Recording of 3D images of a scene

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
ALLESSANDRO BEVILACQUA ET AL: "People Tracking Using a Time-of-Flight Depth Sensor", IEEE INTERNATIONAL CONFERENCE ON VIDEO AND SIGNAL BASED SURVEILLANCE, 2006. AVSS '06, IEEE, 1 November 2006 (2006-11-01), pages 1 - 5, XP002509695, ISBN: 978-0-7695-2688-1 *
J.A. BERALDIN, S.F. EL HAKIM, L. COURNOYER: "practical range camera calibration", SPIE VIDEOMETRICS II, 1993, pages 21 - 31, XP002656448 *
KAHLMANN T ET AL: "Calibration of the fast range imaging camera SwissRanger(TM) for the use in the surveillance of the environment", PROCEEDINGS OF SPIE, THE INTERNATIONAL SOCIETY FOR OPTICAL ENGINEERING SPIE, USA, vol. 6396, 1 January 2006 (2006-01-01), pages 639605 - 1, XP002539116, ISSN: 0277-786X, DOI: 10.1117/12.684458 *
STEFAN MAY, DAVID DROESCHEL,DIRK HOLZ,CHRISTOPH WIESEN: "3D pose estimation and mapping with time-of-flight cameras", IEEE/RSJ INT. CONF. ON INTELLIGENT ROBOTS AND SYSTEMS IROS, 2008 - 2008, Nice, France, XP002656447, Retrieved from the Internet <URL:www.robotic.de/fileadmin/robotic/fuchs/iros08_3dcam.pdf> [retrieved on 20110804] *
XIAOFENG LIAN ET AL: "Reconstructing indoor environmental 3D model using laser range scanners and omnidirectional camera", INTELLIGENT CONTROL AND AUTOMATION, 2008. WCICA 2008. 7TH WORLD CONGRESS ON, IEEE, PISCATAWAY, NJ, USA, 25 June 2008 (2008-06-25), pages 1640 - 1644, XP031302428, ISBN: 978-1-4244-2113-8 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104284173A (en) * 2013-07-10 2015-01-14 宏达国际电子股份有限公司 Method and electronic device for generating multi-view video
US10141022B2 (en) 2013-07-10 2018-11-27 Htc Corporation Method and electronic device for generating multiple point of view video
CN110506297A (en) * 2017-04-17 2019-11-26 康耐视公司 Pinpoint accuracy calibration system and method
CN110506297B (en) * 2017-04-17 2023-08-11 康耐视公司 High accuracy calibration system and method
US11682131B2 (en) * 2017-10-27 2023-06-20 Canon Kabushiki Kaisha Image capturing apparatus and method of controlling image capturing apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11735624

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11735624

Country of ref document: EP

Kind code of ref document: A1