US20160094840A1 - Calibration target for video processing - Google Patents
- Publication number
- US20160094840A1 (application US14/502,647)
- Authority
- US
- United States
- Prior art keywords
- calibration target
- hollow body
- planar surfaces
- fiducial markers
- camera
- Prior art date
- Legal status
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- G06T7/0018—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/2224—Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
Definitions
- a camera creates a record of a three-dimensional (3D) physical scene with a two-dimensional (2D) image.
- the image may be recorded on a film or as a digital 2D array of pixel values.
- Computer-based animation techniques often involve capturing a series of images of an actor (or other object) with one or more cameras, which may have different viewing perspectives. The images from these cameras can be combined to generate a three-dimensional (3D) graphical representation of the actor that can be applied to an animated character and placed in a computer-generated 3D scene.
- each camera needs to be calibrated to the 3D graphical representation of the scene.
- Calibration of a camera to the scene includes determining the intrinsic parameters of the camera and the location of the camera within the scene.
- Current systems for imaging calibration are relatively slow and inaccurate.
- an image of a known object (often referred to as a calibration target, calibration apparatus, or calibration tool) is captured and an animator manually maps the object's features to the corresponding computer graphics model to set the orientation of a virtual camera in the 3D model.
- a calibration target may include a known pattern, image or markers formed on one or more surfaces or edges of the target, such as a black and white checkerboard pattern on one or more surfaces of the target or edges that are painted different colors.
- a calibration target may also serve as a reference for configuring a virtual camera in the 3D representation of the scene in order, in some examples, to create further images of the scene.
- Embodiments of the invention pertain to a calibration target with a series of distinguishable fiducial markers on each of multiple surfaces that enable methods and systems of the invention to automatically identify the precise position of the target in a scene without manual input from a user.
- a calibration target includes a hollow body having an interior surface and an exterior surface. At least one window is formed in the hollow body through which the interior surface is visible, and a plurality of distinguishable fiducial markers are arranged in a predetermined pattern along the interior and exterior surfaces of the hollow body.
- the fiducial markers are distinguishable such that, for any given image of the calibration target captured by a camera, based on the position of the calibration target in the image there is only one position, rotation, distortion value, and focal length for the camera.
- the calibration target is approximately the size of a human bust, and/or the calibration target can further include one or more focus-assist patterns interspersed with the fiducial markers.
- the hollow body can include a plurality of planar surfaces joined together to define a perimeter of the body, with each planar surface in the plurality having a planar interior surface that is part of the interior surface of the hollow body and a planar exterior surface that is part of the exterior surface of the hollow body.
- the plurality of distinguishable fiducial markers includes a first plurality of fiducial markers arranged on the exterior surface in a first grid pattern and a second plurality of fiducial markers arranged on the interior surface in a second grid pattern.
- Each of the first and second grid patterns can include a plurality of similarly sized cells, and in some embodiments, each cell contains either one fiducial marker or one focus-assist marker.
- a calibration target includes a hollow body having a plurality of planar surfaces joined together to define a perimeter of the hollow body.
- Each of the planar surfaces includes a planar interior surface that is part of an interior surface of the hollow body and a planar exterior surface that is part of an exterior surface of the hollow body.
- a first set of distinguishable fiducial markers is arranged in a predetermined pattern along the exterior surface of the hollow body, and a second set of distinguishable fiducial markers is arranged in a predetermined pattern along the interior surface of the hollow body, where no two fiducial markers in the first and second sets of fiducial markers are identical.
- At least one window is formed in the hollow body through which the interior surface of the hollow body and at least some of the fiducial markers in the second set are visible.
- the hollow body is formed from at least five planar surfaces arranged in different planes, at least eight planar surfaces arranged in different planes or from sixteen planar surfaces, each arranged in different planes.
- FIG. 1 is a simplified front perspective view of a calibration target according to an embodiment of the invention that can be used to automatically calibrate a camera;
- FIG. 2 is a simplified rear perspective view of the calibration target shown in FIG. 1 ;
- FIGS. 3A and 3B are right and left side plan views of the calibration target shown in FIG. 1 , respectively;
- FIG. 4 is a top plan view of the calibration target shown in FIG. 1 ;
- FIGS. 5A-F are top plan views of calibration targets according to other embodiments of the invention.
- FIG. 6 is a simplified front perspective view of a calibration target according to another embodiment of the invention.
- FIGS. 7A-7C are simplified illustrations of fiducial markers that can be incorporated onto a calibration target according to embodiments of the invention.
- FIG. 8 is a simplified view of the calibration target located in a three dimensional space being imaged from multiple angles by different cameras;
- FIG. 9A is a flowchart of a method for calibrating a camera according to an embodiment of the invention.
- FIG. 9B is a flowchart of a method for calibrating a virtual camera according to an embodiment of the invention.
- FIG. 10 is a schematic diagram of a computing system that can be used in connection with computer-implemented methods described in this document.
- Embodiments of the invention are directed to devices, methods and systems for automatically calibrating a camera.
- Calibration of a camera entails, in part, determining parameters of a camera related to its focal length, principal point, and other values that affect how the camera produces a two-dimensional (2D) image from a view of points in a three-dimensional (3D) space.
- the parameters of the camera may be used in forming or adjusting the images that the camera produces.
- calibrating a camera involves producing one or more images or pictures of a test object with the camera, locating components of the image that correspond to particular parts of the test object, and calculating the camera parameters based on the image components, the geometry of the test object, its position relative to the camera when the image was taken, and the physical assumptions about the camera.
- the test object is sometimes referred to as a calibration target or a calibration tool.
- Some embodiments of the invention pertain to a calibration target that can be used as part of a system that can automatically perform such a calibration process without manual input from a user.
- FIG. 1 is a simplified front perspective view of a calibration target 100 according to an embodiment of the invention that can be used to automatically calibrate one or more cameras
- FIG. 2 is a simplified rear perspective view of calibration target 100
- calibration target 100 includes a hollow body 102 having multiple planar surfaces 104 joined together to define a perimeter of the hollow body.
- Each planar surface 104 includes an exterior surface and an interior surface where the sum of the exterior surfaces for all planar surfaces 104 generally defines an exterior surface 106 of body 102 and the sum of the interior surfaces of all planar surfaces 104 generally defines an interior surface 108 of hollow body 102 .
- planar surface 104 lies between two adjacent planar surfaces.
- the planar surface 104 labeled in FIG. 1 , which extends from a bottom 111 of calibration target 100 to a top 113 of the target, is positioned between two additional surfaces 104 , which are referred to herein as surfaces 104 ( l ) and 104 ( r ) for reference.
- Surface 104 is joined to surface 104 ( l ) at a left edge of surface 104 and a right edge of surface 104 ( l ).
- Surface 104 is also joined to surface 104 ( r ) at a right edge of surface 104 and a left edge of surface 104 ( r ).
- each additional planar surface 104 includes two similar surfaces 104 ( l ) and 104 ( r ) such that any individual planar surface 104 is itself a surface 104 ( l ) to another planar surface 104 and a surface 104 ( r ) to still a different planar surface 104 .
- the set of planar surfaces 104 form a closed shape that enables calibration target 100 to be a free standing device that can be placed on a table or other support surface and oriented in an upright position without requiring tools or additional support.
- a plurality of fiducial markers 120 are provided on both exterior surface 106 and interior surface 108 to provide patterns for recognition by a camera system as explained below.
- Fiducial markers 120 can be arranged in predetermined locations across some or all of external surface 106 and across some or all of internal surface 108 .
- fiducial markers 120 are arranged in a grid pattern on each of exterior surface 106 and interior surface 108 where each grid includes an array of generally square-shaped cells 112 .
- for ease of illustration, only some of fiducial markers 120 are shown in FIG. 1 and no fiducial markers are shown in FIG. 2 . Thus, many cells 112 are shown without a fiducial marker.
- in some embodiments, each and every cell 112 includes a fiducial marker 120 . In other embodiments, however, only a subset of cells 112 include fiducial markers.
- FIGS. 3A and 3B , which are right and left side views of calibration target 100 , more clearly show that markers 120 of calibration target 100 are arranged in a grid, four markers high, around the entire exterior surface 106 of the calibration target.
- each individual fiducial marker 120 is generally square in shape and has a unique pattern that enables any given fiducial marker to be distinguished from other fiducial markers in the set of markers included on target 100 .
- fiducial markers 120 can be arranged in predetermined patterns other than a grid.
- each individual cell 112 in the grid need not be a square.
- the grid may include rectangular, hexagonal, octagonal or other appropriate shaped cells 112 , and in some embodiments, individual cells in the grid may be sized or shaped differently from other cells in the grid.
- the fiducial markers may have other geometric shapes, and/or vary in other parameters such as color or size.
- Calibration target 100 can also include one or more focus-assist patterns 122 on either or both exterior surface 106 and interior surface 108 .
- Each focus-assist pattern 122 can be positioned in a cell 112 instead of a fiducial marker 120 being positioned in the respective cell(s).
- target 100 includes multiple focus-assist patterns 122 (six are shown in the figures).
- focus-assist pattern 122 is a wheel-and-spoke pattern but other patterns can be used.
- Hollow body 102 includes a window 110 that enables fiducial markers 120 and/or focus-assist patterns 122 on interior surface 108 behind the window to be visible to a particular camera view through the window for calibration purposes when such fiducial markers would otherwise be blocked to the camera by exterior surface 106 .
- the combination of window 110 and fiducial markers 120 and/or focus-assist patterns 122 on interior surface 108 provides additional depth perception to certain cameras positioned around calibration target 100 thus enabling more accurate calibration and focus control.
- Window 110 may further be advantageous in calibrating a camera initially and then for orienting the camera and determining imaging parameters for the camera to be applied to a virtual camera.
- Window 110 may be either a transparent material, such as cellophane, acrylic or glass or may be an area void of material.
- window 110 is void of material, is rectangular in shape and is centered between first and second bands 114 and 116 of hollow body 102 .
- Each of bands 114 and 116 completely surrounds an interior space of body 102 in the width and length dimensions, and has a height that is approximately equal to the height of each cell 112 .
- Window 110 has a height that is approximately equal to the height of two cells, but calibration targets according to the present invention are not limited to the precise dimensions of window 110 and/or bands 114 , 116 shown in FIG. 1 .
- window 110 may have different shapes, body 102 may include multiple windows and fewer or more bands, and individual bands, such as bands 114 , 116 , can have different heights.
- Calibration target 100 may be made from a material having sufficient rigidity, such as plastic, cardboard or metal, to enable the device to maintain its shape and readily stand without exterior support.
- calibration tool 100 is approximately the same size as a life-size bust of a human head. This can be useful for calibrating cameras, or determining their imaging parameters, in order to obtain accurate images of a human actor, in which inaccuracies would be quickly apparent to a human viewer.
- calibration targets according to the present invention may be larger or smaller as needed to accurately calibrate a camera to a given scene.
- calibration target 100 may include one or more markers that indicate the bottom 111 and/or top 113 of the target.
- Such markers can be used to ensure that, when target 100 is used to calibrate one or more cameras, it is positioned in a known orientation within the scene which assists the software program in identifying the various fiducial markers 120 on target 100 and calculating the relationship between a camera and the target.
- calibration target 100 further includes a spine 130 , which is shown in FIG. 2 .
- spine 130 provides additional support for target 100 .
- spine 130 has angled edges 132 , 134 that are sized and shaped to be useable with a standard quick-release clamp on a tripod. This enables target 100 to be easily attached to and detached from a tripod and placed in almost any desired position within a scene.
- Calibration target 100 can be made from a single piece of material or can be made from multiple parts that can be easily assembled and disassembled to facilitate transportation of target 100 from one scene (or movie set) to another. Such an assembly method may also make for efficient application of the fiducial markers on the planar surfaces prior to assembly.
- body 102 can be made from a flat sheet of plastic having angled grooves cut out from the interior side such that the sheet can be bent into the shape shown in FIGS. 1 and 2 with the two opposing ends of the sheet abutting each other.
- the ends can be attached using a variety of techniques, and in one embodiment, spine 130 helps secure the ends together.
- body 102 can be printed using a 3D printer.
- body 102 can be printed as two separate components where a first component includes band 114 and the row of cells 112 directly beneath band 114 and a second component includes band 116 and the row of cells 112 directly above band 116 .
- the two components can be printed with small holes at the top and bottom ends of each planar surface that allow small dowels to be placed in the holes to secure the two components together using a dowel joint.
- Spine 130 can be printed as a separate component and joined to body 102 to provide additional support and strength.
- a person of skill in the art will recognize other techniques, including other approaches to 3D printing various components, that can be used to fabricate body 102 and target 100 .
- FIG. 4 is a top plan view of calibration target 100 .
- target 100 includes 16 separate sides 401 - 416 of the same length.
- Sides 401 - 416 form a closed polygon (a regular hexadecagon) that enables calibration target 100 to be a free standing device so that it can be placed on a table or other support surface in a scene.
- Each side 401 - 416 corresponds to one of planar surfaces 104 discussed with respect to FIG. 1 , and thus may include one or more fiducial markers on both the interior and exterior surfaces of the side 401 - 416 .
- the planar nature of sides 401 - 416 enables software to more readily recognize fiducial markers 120 positioned on each side, as the software does not need to correct for curvature or irregularities of a nonplanar surface in performing its recognition routine. Additionally, having sixteen sides 401 - 416 , each in a different plane, ensures that any camera directed towards target 100 will capture multiple sides when imaging the target and thus capture multiple fiducial markers at different depths.
- Embodiments of the invention are not limited to any particular shaped body 102 , however, and in other embodiments, body 102 may have different cross-sectional shapes.
- FIG. 5A depicts an embodiment of the invention where a body 502 includes five sides that form a body having a regular pentagon cross-sectional shape (i.e., all angles between adjacent sides are equal in measure and all sides are equal in length).
- FIG. 5B depicts an embodiment of the invention where a body 504 includes six equal length sides having equal angles and a cross-sectional shape in the form of a regular hexagon.
- FIG. 5C depicts an embodiment of the invention where a body 506 includes eight equal length sides having equal angles and a cross-sectional shape in the form of a regular octagon.
- FIG. 5D depicts an embodiment of the invention where a body 508 includes ten equal length sides having equal angles and a cross-sectional shape in the form of a regular decagon.
- the body of a calibration target includes more than five sides (planar surfaces 104 ) aligned in different planes, such as each of the bodies 502 , 504 , 506 and 508 shown in FIGS. 5A-5D .
- the interior angle between each adjacent planar side is greater than 90 degrees.
- the interior angle between each side in body 502 is 108 degrees; the interior angle between each side in body 504 is 120 degrees; the interior angle between each side in body 506 is 135 degrees; the interior angle between each side in body 508 is 144 degrees; and the interior angle between each side in body 102 is 157.5 degrees.
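The interior angles listed above follow from the standard regular-polygon relationship, interior angle = (n − 2) · 180 / n degrees. A minimal Python check of those figures (illustrative only, not part of the patent):

```python
def interior_angle(n_sides: int) -> float:
    """Interior angle, in degrees, of a regular polygon with n_sides sides."""
    return (n_sides - 2) * 180.0 / n_sides

# Bodies described above: pentagon 502, hexagon 504, octagon 506,
# decagon 508, and the sixteen-sided body 102.
angles = {n: interior_angle(n) for n in (5, 6, 8, 10, 16)}
# -> {5: 108.0, 6: 120.0, 8: 135.0, 10: 144.0, 16: 157.5}
```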
- each of bodies 502 , 504 , 506 and 508 have a regular polygon cross-section, in other embodiments of the invention, the body may have an irregular closed polygon shape.
- the body of a calibration target according to the invention can have a curved shape, such as the circular shape of body 510 shown in FIG. 5E or an oval shape or other curved shape.
- the software algorithm that identifies the fiducial markers can be adapted to the particular curvature of the body to facilitate pattern recognition.
- some embodiments of the invention may include one or more wings or extensions that extend from the body, such as the four extensions 520 shown as extending from octagonal shaped body 508 in FIG. 5F .
- Each extension 520 may include additional fiducial markers 120 and/or focus-assist patterns 122 .
- FIG. 6 is a simplified front perspective view of a calibration target 600 according to another embodiment of the invention.
- Calibration target 600 includes a body 602 and is essentially two separate calibration targets 100 stacked upon each other. In other embodiments, three, four or more calibration targets 100 can be stacked together.
- body 602 includes a predetermined pattern of multiple fiducial markers 120 and one or more focus-assist patterns 122 arranged in a grid of cells 112 on both the interior and exterior surfaces of the target.
- target 600 includes two separate windows 610 a and 610 b .
- Window 610 a is bordered by a band 114 on the top and a band 620 below, each of which completely surround an interior space of body 602 .
- window 610 b is bordered by a band 116 below and by band 620 on top.
- bands 114 and 116 each have a height approximately equal to the height of each cell 112
- windows 610 a and 610 b and band 620 each have a height that is approximately equal to the height of two cells.
- body 602 has a hexadecagon cross-sectional shape.
- body 602 and target 600 can have any of the shapes and/or features described with respect to other embodiments of calibration targets according to the present invention.
- fiducial marker 120 is square in shape and is configured to have a 10-by-10 grid of smaller squares, which can be differently colored.
- an outer peripheral band 702 of squares is colored black or a dark color
- a second inner band of squares 704 interior to but touching the outermost band has a light color to contrast with the black or dark outer band.
- band 702 is black and band 704 is white.
- interior to the two bands 702 and 704 is a 6-by-6 grid 706 of smaller squares in which a unique pattern of dark (or black) colored and light (or white) colored smaller squares is created.
- some individual fiducial markers 120 included on a particular calibration target 100 have an outer band 702 that is dark and an inner band 704 that is light, while other individual fiducial markers 120 have an outer band 702 that is light and an inner band 704 that is dark. This additional level of variation can help further distinguish individual markers from each other.
- the pattern formed in the 6×6 grid of each and every fiducial marker on calibration tool 100 or 600 is unique. This enables software to distinguish each individual marker from others and identify the exact location of a particular fiducial marker visible to a given camera.
- the 6×6 pattern shown in FIG. 7A allows for significantly more than enough possible fiducial markers to be able to use all distinct fiducial markers on the apparatus. Further, various selection processes may be used to choose a set of fiducial markers for the apparatus such that the fiducial markers in the set are not only distinct, but also such that any two differ in a large number of the smaller subsquares.
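One way such a selection process could work is a greedy search that keeps only candidate 6×6 patterns whose pairwise Hamming distance exceeds a threshold. The sketch below is a hypothetical illustration; the function names and the `min_distance` threshold are assumptions, not the patent's method:

```python
import random

def hamming(a, b):
    """Number of subsquares in which two marker patterns differ."""
    return sum(x != y for x, y in zip(a, b))

def select_markers(count, min_distance, size=6, seed=0):
    """Greedily collect `count` random size-by-size binary patterns
    (1 = dark subsquare) whose pairwise Hamming distance is at least
    `min_distance`. Illustrative only; thresholds are assumed."""
    rng = random.Random(seed)
    chosen = []
    while len(chosen) < count:
        candidate = tuple(rng.randint(0, 1) for _ in range(size * size))
        if all(hamming(candidate, m) >= min_distance for m in chosen):
            chosen.append(candidate)
    return chosen

markers = select_markers(count=64, min_distance=10)
```

With 36 subsquares per marker there are 2^36 possible patterns, so finding a few dozen that are mutually far apart is fast, and markers chosen this way remain distinguishable even when a few subsquares are misread.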
- the fiducial markers may have a rectangular, triangular, circular or other shapes, and/or may include a pattern made up of components other than a grid of similarly sized squares.
- the fiducial markers may include a plurality of hexagons, pentagons, rectangles, triangles, dots or other shapes within an outer border of the fiducial.
- the fiducial markers may contain information within the pattern of dark and light subsquares.
- the particular sequence of fiducial markers around the border area of a quadrant of a planar surface may also contain information to assist in identification of the fiducial markers, and/or to assist in the calibration of the camera.
- Particular fiducials which are known to be easily recognized in a camera may be positioned at particular locations on the calibration target to aid in identifying which surface is being viewed.
- Calibration target 100 or 600 and their equivalents may be used to calibrate a camera and/or to determine imaging properties of a camera from images taken by the camera. Methods according to the invention use the information available via a captured image of calibration target 100 to increase automation of calibration and other operations.
- the size of the apparatus, including the boundary area surrounding a window, and the size of the fiducial markers and their location on the apparatus may be recorded or known before the calibration target is used.
- FIG. 8 is a simplified schematic diagram showing a scene 800 in which a calibration target 810 according to the invention can be used for calibrating one or more cameras according to an embodiment of the invention.
- three physical cameras 802 a , 802 b and 802 c are in use to “shoot” scene 800 which may include one or more actors 806 and/or props 808 .
- Each of the cameras 802 a , 802 b and 802 c are directed to take images of scene 800 while calibration target 810 is positioned in the scene so that it is within the field of view of each camera.
- each camera 802 a , 802 b and 802 c will capture multiple sides of target 810 whenever the camera is positioned to capture at least some of the exterior and/or interior surface of the target, regardless of the angle between the camera and the target.
- Each camera 802 a , 802 b , 802 c is connected to a processor 804 , such as the computer processing system described in FIG. 10 .
- processor 804 stores a computer graphics model of calibration target 810 including the positions and unique patterns of the fiducial markers on the target.
- Calibration target 810 includes unique fiducial markers, such that, for any given image there is only one position, rotation, distortion value, and focal length for the camera of interest.
- processor 804 can compare information from captured images of each of cameras 802 a , 802 b , 802 c to the computer graphics model to identify fiducial markers on calibration target 810 , and based on the comparison, determine, among other parameters, the position, rotation, distortion and focal length of each camera with respect to the calibration target 810 and scene as described below with respect to FIG. 9A .
- Processor 804 can also be used to place a virtual camera in a computer generated 3D scene including setting the virtual camera's position, rotation, distortion and focal length according to the intrinsic and imaging parameters of camera 802 a , 802 b , 802 c as described with respect to FIG. 9B .
- FIG. 9A shows a flow chart for a method 900 for calibrating a camera using the calibration target 100 of FIG. 1 , or an alternate embodiment thereof.
- the calibration target 100 may be placed in a physical set depicting a scene. Information regarding the location of the calibration target 100 in the scene, such as which surface of the apparatus is facing the camera, the distance from the camera to a particular point on the apparatus, or other data, may optionally be recorded to assist in the camera calibration.
- the camera is directed so as to include the calibration target 100 in an image, and at least one image that includes the calibration target 100 is obtained using the camera.
- the images are digitally generated, such as by a camera using a charge-coupled device (CCD).
- the images may be captured on film, and then scanned into digital form. In digital form the pixel coordinates of an image point can be determined.
- the image is then received at a computational device or system capable of analyzing images and image data (block 905 ).
- Each image captured for calibrating a camera may include a 2D array of pixels, and may be enumerated using pixel coordinates. Pixel coordinates may be normalized or converted to homogeneous coordinate form.
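The coordinate conversions mentioned above can be sketched in a few lines of NumPy; the unit-square normalization convention here is an assumption for illustration:

```python
import numpy as np

def to_homogeneous(pixels):
    """Append a 1 to each 2D pixel coordinate: (u, v) -> [u, v, 1]."""
    pts = np.asarray(pixels, dtype=float)
    return np.hstack([pts, np.ones((pts.shape[0], 1))])

def normalize_pixels(pixels, width, height):
    """Scale pixel coordinates into the unit square [0, 1] x [0, 1]."""
    return np.asarray(pixels, dtype=float) / np.array([width, height], dtype=float)
```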
- An identification procedure is performed on an image in order to identify parts of the image that correspond to particular fiducial markers (block 910 ). That is, the parts or segments of the image are identified for which a fiducial marker on the calibration target was the source in the physical scene.
- Such a procedure may involve one or more pattern recognition algorithms.
- the pattern recognition algorithm may use the known information about the bands to assist in identifying the image of fiducial markers within the captured image.
- the pattern recognition algorithm may determine a plurality of parts of the image corresponding to fiducial markers, but with varying levels of certainty that a fiducial marker is the source.
- the identification procedure may choose a sufficient number of the image parts having sufficient certainty of corresponding to a fiducial marker.
- only fiducial markers that are fully visible in an image are chosen for analysis by the pattern recognition algorithm.
- a fiducial marker that is only partially visible through a window 110 may be excluded from the pattern recognition algorithm in order to increase the effectiveness of the algorithm.
- the pattern recognition algorithm may identify the location of such focus-assist patterns and use that information in identifying fiducial markers and/or in calculating the position and orientation of target 100 in the set.
- pattern matching algorithms or pattern recognition operations that compare the image to a computer model of the calibration target including its fiducial markers may be used to uniquely determine which of the fiducial markers was the source for that part of the image (block 915 ).
- pattern matching algorithms include the discrete Fourier transform, the Hough transform and wavelet transforms.
- the fiducial markers may be chosen to have easily recognized and distinguished transforms. Known relative locations of the fiducial markers on the calibration target may be used as part of the pattern recognition operations.
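As a concrete, deliberately simplified example of comparing image content to a known marker pattern, the sketch below scores candidate patches against a template with normalized cross-correlation. Real systems would more likely use FFT-based correlation or the transform methods named above, so treat this brute-force version as illustrative only:

```python
import numpy as np

def ncc_match(image, template):
    """Slide `template` over `image` and return the top-left offset with the
    highest normalized cross-correlation, plus that score. Brute force."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    best, best_score = (0, 0), -np.inf
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.linalg.norm(p) * t_norm
            if denom == 0:
                continue  # flat patch carries no correlation information
            score = float((p * t).sum() / denom)
            if score > best_score:
                best, best_score = (r, c), score
    return best, best_score
```

An exact copy of the template inside the image scores 1.0, and a distorted or partially occluded marker scores lower, which mirrors the varying certainty levels described above.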
- one or more specific features of the fiducial marker may be located in the image.
- the specific features may be one or more of the corners of the fiducial marker.
- subimages of multiple fiducial markers are identified within the image, the corresponding unique fiducial markers are determined, and specific features of each fiducial marker are selected.
- multiple components of the image corresponding to fiducial markers may be identified, the respective source fiducial marker uniquely determined, and the corners of those fiducial markers selected as source points.
- a set of 2D image coordinates, such as either pixel coordinates or homogeneous coordinates, of the source points is then obtained.
- reprojection of the set of 2D image coordinates of the source points is performed to determine a corresponding set of estimated 3D coordinates for the locations in the 3D scene of the source points of the selected features (block 920 ).
- the known sizes and orientations of the fiducial markers, both relative to each other and relative to the calibration target 100 as a whole, may be used to determine an estimated configuration of the calibration target 100 in the 3D scene.
- Other information may also be used, such as the overall outline and dimension of the calibration target 100 , and information independently determined regarding the image, for example a known distance between the calibration target 100 and the camera when the image was generated.
- an error minimization operation is performed to determine an estimate for the intrinsic parameters of the camera (block 925 ).
- a known relationship connecting world coordinates of a point in the 3D scene and the corresponding point in the 2D pixel coordinate space is:

$$z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = A \, [R \mid T] \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix}$$
- [u,v,1] T denotes the pixel coordinates on the image plane of an image point using homogeneous coordinates
- [x w ,y w ,z w ,1] T are the 3D world coordinates of the original source point in homogeneous coordinates
- R and T represent extrinsic parameters which transform a point's 3D world coordinates to the camera's 3D coordinates, with R being a rotation matrix.
- the parameter z c is a constant of proportionality.
- the matrix A comprises the intrinsic parameters of the camera. It is given by:
- $$A = \begin{pmatrix} a_x & \gamma & u_0 \\ 0 & a_y & v_0 \\ 0 & 0 & 1 \end{pmatrix}.$$
- a x and a y are related to the camera's focal length and scale factors that relate distance to pixels.
- the intrinsic parameter ⁇ is a skew coefficient.
- the values u 0 and v 0 represent the principal point.
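Putting the pieces together, the relationship z_c [u, v, 1]^T = A [R | T] [x_w, y_w, z_w, 1]^T can be evaluated directly. The numeric intrinsics below (800 px focal scales, zero skew, principal point at (320, 240)) are illustrative assumptions, not values from the text:

```python
import numpy as np

# Illustrative intrinsics: a_x = a_y = 800 px, skew gamma = 0,
# principal point (u0, v0) = (320, 240).
A = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(world_pt, R, T, A):
    """Apply z_c [u, v, 1]^T = A [R | T] [x_w, y_w, z_w, 1]^T and
    divide out the proportionality constant z_c."""
    RT = np.hstack([R, T.reshape(3, 1)])     # 3x4 extrinsic matrix [R | T]
    uvw = A @ RT @ np.append(world_pt, 1.0)  # z_c * [u, v, 1]
    return uvw[:2] / uvw[2]

# A point 2 m straight ahead of an unrotated, untranslated camera
# lands on the principal point:
print(project(np.array([0.0, 0.0, 2.0]), np.eye(3), np.zeros(3), A))
# prints [320. 240.]
```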
- Nonlimiting examples of error minimization operations include gradient descent, the Levenberg-Marquardt algorithm, and the Gauss-Newton algorithm.
- the error minimization operation may use the known relative positions of the uniquely determined fiducial markers as criteria for error minimization.
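A minimal sketch of such an error minimization, refining only the focal length of a zero-skew pinhole model by gradient descent on the total reprojection error (a real calibrator would refine all intrinsics and extrinsics at once, e.g. with Levenberg-Marquardt; the principal point and step size here are tuned to this toy example):

```python
def reprojection_error(f, world_pts, image_pts, principal=(320.0, 240.0)):
    """Sum of squared pixel distances between observed image points and
    the points predicted by a zero-skew pinhole model with focal length f."""
    err = 0.0
    for (x, y, z), (u, v) in zip(world_pts, image_pts):
        pred_u = f * x / z + principal[0]
        pred_v = f * y / z + principal[1]
        err += (pred_u - u) ** 2 + (pred_v - v) ** 2
    return err

def estimate_focal(world_pts, image_pts, f0=500.0, lr=0.5, steps=200, h=1e-3):
    """Numeric gradient descent on the reprojection error, over f only."""
    f = f0
    for _ in range(steps):
        grad = (reprojection_error(f + h, world_pts, image_pts)
                - reprojection_error(f - h, world_pts, image_pts)) / (2.0 * h)
        f -= lr * grad
    return f
```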
- the method 900 may be iterated using an initial estimate of the intrinsic parameters to improve the reprojection estimates and the estimation of the intrinsic parameters.
- the method 900 may be applied to multiple images to obtain improved estimates.
- the exemplary steps shown in method 900 are capable of being performed within a computing system without a user once a digital image is obtained by the system. A user is not needed to view the image on a display and enter identification of particular 2D coordinates and corresponding 3D locations.
- the uniqueness of the fiducial markers and the pattern recognition algorithms, together with error minimization algorithms allow a computing system to proceed without needing user input.
- the method 900 may be implemented in conjunction with user input at any stage to improve overall performance.
- the methods just described refer to only one image, but a person of skill in the art will recognize that using a sequence of different images of the calibration target, and proceeding as above to generate successive estimates for the parameters of the camera, allows further refinement of the values for the camera parameters.
- different images may be used which show the calibration target from different orientations and/or from different distances.
- the successive estimates for parameters of the camera may be weighted and averaged to obtain a single estimate for the camera's parameters.
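The weighting step can be as simple as a weighted mean; the weights below are hypothetical (e.g. proportional to how many fiducial markers each image resolved):

```python
def combine_estimates(estimates, weights):
    """Weighted mean of per-image estimates of one camera parameter."""
    return sum(e * w for e, w in zip(estimates, weights)) / sum(weights)

# Three per-image focal-length estimates; the middle image, which
# resolved twice as many markers, gets twice the weight:
combine_estimates([800.0, 820.0, 790.0], [1, 2, 1])  # 807.5
```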
- a calibration target 100 may also be used to determine camera imaging parameters used in the capture of subsequent images.
- the imaging parameters may then be used by a virtual camera within a computing system.
- One embodiment according to the invention for placing a virtual camera within a computer generated graphics scene is set forth in FIG. 9B as method 950 .
- In FIG. 9B, at least one digital image of a 3D scene containing the calibration target 100 is obtained from a camera and the camera's intrinsic parameters are determined from the at least one image (block 955 ).
- the procedure for determining the camera's intrinsic parameters includes, as described above with respect to FIG.
- method 950 further includes determining an estimated distance from the camera to the calibration target 100 (block 960 ).
- the overall dimensions of the calibration target 100 , in addition to the estimated 3D locations of the source points on the identified fiducial markers, can be used in the determination.
- Well known operations such as triangulation based on known geometric values of the calibration target 100 and its fiducial markers may be used.
- the method further includes determining the field of view of the camera that produced the received 2D image (block 965 ).
- an estimated distance between the camera and the calibration target 100 may be obtained, as described, and used in conjunction with an observed height or width of the calibration target 100 or its fiducial markers within the 2D image to determine a vertical, horizontal and/or diagonal viewing angle of the camera.
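A sketch of that vertical viewing-angle computation under pinhole assumptions (function name and numbers are illustrative):

```python
import math

def vertical_fov_deg(target_height_m: float, target_px: float,
                     image_px: float, distance_m: float) -> float:
    """Scale the target's known height up to the full image frame, then
    take the angle that frame subtends at the estimated distance."""
    frame_height_m = target_height_m * image_px / target_px
    return math.degrees(2.0 * math.atan(frame_height_m / (2.0 * distance_m)))

# A 0.5 m target covering 240 of 480 image rows, seen from 2 m:
vertical_fov_deg(0.5, 240.0, 480.0, 2.0)  # ~28.1 degrees
```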
- the focal length of the camera may also be calculated (block 970 ).
- the intrinsic parameters of the camera contain information from which the focal length may be calculated.
- the focal length may be obtained using the field of view and the relationship between the field of view and the focal length.
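That relationship between field of view and focal length (in pixel units) can be inverted directly; a sketch under the same pinhole assumptions:

```python
import math

def focal_px_from_fov(image_height_px: float, vfov_deg: float) -> float:
    """Invert tan(vfov / 2) = (image_height / 2) / f for f in pixels."""
    return (image_height_px / 2.0) / math.tan(math.radians(vfov_deg) / 2.0)

# A 480-row image with a ~33.4 degree vertical FOV implies f ~ 800 px:
focal_px_from_fov(480.0, 33.4)  # ~800
```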
- the imaging parameters of the camera obtained from a digital image of a 3D scene may be used to implement in a computer system a virtual camera that replicates the performance of the physical camera that obtained the original image or images (block 975 ).
- the virtual camera may be used to create animation scenes based on the original physical 3D scene.
- FIG. 10 is a schematic diagram of a generic computer system 1000 .
- the system 1000 can be used for the operations described in association with any of the computer-implemented methods described previously, according to one implementation.
- the system 1000 includes a processor 1010 , a memory 1020 , a storage device 1030 , and an input/output device 1040 .
- The components 1010 , 1020 , 1030 , and 1040 are interconnected using a system bus 1050 .
- the processor 1010 is capable of processing instructions for execution within the system 1000 .
- the processor 1010 is a single-threaded processor.
- the processor 1010 is a multi-threaded processor.
- the processor 1010 is capable of processing instructions stored in the memory 1020 or on the storage device 1030 to display graphical information for a user interface on the input/output device 1040 .
- the memory 1020 stores information within the system 1000 .
- the memory 1020 is a computer-readable medium.
- the memory 1020 is a volatile memory unit.
- the memory 1020 is a non-volatile memory unit.
- the storage device 1030 is capable of providing mass storage for the system 1000 .
- the storage device 1030 is a computer-readable medium.
- the storage device 1030 may be a floppy disk device, a hard disk device, an optical disk device, or a tape device.
- the input/output device 1040 provides input/output operations for the system 1000 .
- the input/output device 1040 includes a keyboard and/or pointing device.
- the input/output device 1040 includes a display unit for displaying graphical user interfaces.
- the features described can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them.
- the apparatus can be implemented in a computer program product tangibly embodied in an information carrier, e.g., in a machine-readable storage device or in a propagated signal, for execution by a programmable processor; and method steps can be performed by a programmable processor executing a program of instructions to perform functions of the described implementations by operating on input data and generating output.
- the described features can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device.
- a computer program is a set of instructions that can be used, directly or indirectly, in a computer to perform a certain activity or bring about a certain result.
- a computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
- Suitable processors for the execution of a program of instructions include, by way of example, both general and special purpose microprocessors, and the sole processor or one of multiple processors of any kind of computer.
- a processor will receive instructions and data from a read-only memory or a random access memory or both.
- the essential elements of a computer are a processor for executing instructions and one or more memories for storing instructions and data.
- a computer will also include, or be operatively coupled to communicate with, one or more mass storage devices for storing data files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks.
- Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
- the processor and the memory can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
- the features can be implemented on a computer having a display device such as a CRT (cathode ray tube) or LCD (liquid crystal display) monitor for displaying information to the user and a keyboard and a pointing device such as a mouse or a trackball by which the user can provide input to the computer.
- the features can be implemented in a computer system that includes a back-end component, such as a data server, or that includes a middleware component, such as an application server or an Internet server, or that includes a front-end component, such as a client computer having a graphical user interface or an Internet browser, or any combination of them.
- the components of the system can be connected by any form or medium of digital data communication such as a communication network. Examples of communication networks include, e.g., a LAN, a WAN, and the computers and networks forming the Internet.
- the computer system can include clients and servers.
- a client and server are generally remote from each other and typically interact through a network, such as the described one.
- the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
Description
- This application is related to commonly assigned U.S. patent application Ser. No. 14/248,124, filed on Apr. 8, 2014, and entitled “AUTOMATED CAMERA CALIBRATION METHOD AND SYSTEM,” which is incorporated by reference herein in its entirety.
- A camera creates a record of a three-dimensional (3D) physical scene with a two-dimensional (2D) image. The image may be recorded on a film or as a digital 2D array of pixel values. Computer-based animation techniques often involve capturing a series of images of an actor (or other object) with one or more cameras, which may have different viewing perspectives. The images from these cameras can be combined to generate a three-dimensional (3D) graphical representation of the actor that can be applied to an animated character and placed in a computer-generated 3D scene.
- In order for the 3D representation and location of the character in the 3D scene to be accurate, the location of the camera must be able to be accurately reproduced. Towards this end, each camera needs to be calibrated to the 3D graphical representation of the scene. Calibration of a camera to the scene includes determining the intrinsic parameters of the camera and the location of the camera within the scene. Current systems for imaging calibration are relatively slow and inaccurate. Typically an image of a known object (often referred to as a calibration target, calibration apparatus, or calibration tool) is captured and an animator manually maps the object's features to the corresponding computer graphics model to set the orientation of a virtual camera in the 3D model.
- Currently known calibration targets may include a known pattern, image or markers formed on one or more surfaces or edges of the target, such as a black and white checkerboard pattern on one or more surfaces of the target or edges that are painted different colors. Once a camera's parameters have been determined by a calibration operation, a calibration target may also serve as a reference for configuring a virtual camera in the 3D representation of the scene in order, in some examples, to create further images of the scene. Despite the availability of a various calibration tools or targets that have been used to calibrate cameras in the past, improvements in the design and modeling of calibration targets are desirable.
- Embodiments of the invention pertain to a calibration target with a series of distinguishable fiducial markers on each of multiple surfaces that enable methods and systems of the invention to automatically identify the precise position of the target in a scene without manual input from a user.
- One embodiment of a calibration target according to the invention includes a hollow body having an interior surface and an exterior surface. At least one window is formed in the hollow body through which the interior surface is visible, and a plurality of distinguishable fiducial markers are arranged in a predetermined pattern along the interior and exterior surfaces of the hollow body. The fiducial markers are distinguishable such that, for any given image of the calibration target captured by a camera, based on the position of the calibration target in the image there is only one position, rotation, distortion value, and focal length for the camera. In some instances, the calibration target is approximately the size of a human bust, and/or the calibration target can further include one or more focus-assist patterns interspersed with the fiducial markers. In certain embodiments, the hollow body can include a plurality of planar surfaces joined together to define a perimeter of the body, with each planar surface in the plurality having a planar interior surface that is part of the interior surface of the hollow body and a planar exterior surface that is part of the exterior surface of the hollow body. Further, in some embodiments, the plurality of distinguishable fiducial markers includes a first plurality of fiducial markers arranged on the exterior surface in a first grid pattern and a second plurality of fiducial markers arranged on the interior surface in a second grid pattern. Each of the first and second grid patterns can include a plurality of similarly sized cells, and in some embodiments, each cell contains either one fiducial marker or one focus-assist marker.
- Another embodiment of a calibration target according to the invention includes a hollow body having a plurality of planar surfaces joined together to define a perimeter of the hollow body. Each of the planar surfaces includes a planar interior surface that is part of an interior surface of the hollow body and a planar exterior surface that is part of an exterior surface of the hollow body. A first set of distinguishable fiducial markers is arranged in a predetermined pattern along the exterior surface of the hollow body, and a second set of distinguishable fiducial markers is arranged in a predetermined pattern along the interior surface of the hollow body, where no two fiducial markers in the first and second sets of fiducial markers are identical. At least one window is formed in the hollow body through which the interior surface of the hollow body and at least some of the fiducial markers in the second set are visible. In some embodiments, the hollow body is formed from at least five planar surfaces arranged in different planes, at least eight planar surfaces arranged in different planes or from sixteen planar surfaces, each arranged in different planes.
- To better understand the nature and advantages of these and other embodiments of the invention, reference should be made to the following description and the accompanying figures. It is to be understood, however, that each of the figures is provided for the purpose of illustration only and is not intended as a definition of the limits of the scope of the present invention. It is to be further understood that, while numerous specific details are set forth in the description below in order to provide a thorough understanding of the invention, a person of skill in the art will recognize that the invention may be practiced without some or all of these specific details.
-
FIG. 1 is a simplified front perspective view of a calibration target according to an embodiment of the invention that can be used to automatically calibrate a camera; -
FIG. 2 is a simplified rear perspective view of the calibration target shown in FIG. 1; -
FIGS. 3A and 3B are right and left side plan views of the calibration target shown in FIG. 1, respectively; -
FIG. 4 is a top plan view of the calibration target shown in FIG. 1; -
FIGS. 5A-F are top plan views of calibration targets according to other embodiments of the invention; -
FIG. 6 is a simplified front perspective view of a calibration target according to another embodiment of the invention; -
FIGS. 7A-7C are simplified illustrations of fiducial markers that can be incorporated onto a calibration target according to embodiments of the invention; -
FIG. 8 is a simplified view of the calibration target located in a three dimensional space being imaged from multiple angles by different cameras; -
FIG. 9A is a flowchart of a method for calibrating a camera according to an embodiment of the invention; -
FIG. 9B is a flowchart of a method for calibrating a virtual camera according to an embodiment of the invention; and -
FIG. 10 is a schematic diagram of a computing system that can be used in connection with computer-implemented methods described in this document. - Embodiments of the invention are directed to devices, methods and systems for automatically calibrating a camera. Calibration of a camera entails, in part, determining parameters of a camera related to its focal length, principal point, and other values that affect how the camera produces a two-dimensional (2D) image from a view of points in a three-dimensional (3D) space. Once known, the parameters of the camera may be used in forming or adjusting the images that the camera produces.
- Often, calibrating a camera involves producing one or more images or pictures of a test object with the camera, locating components of the image that correspond to particular parts of the test object, and calculating the camera parameters based on the image components, the geometry of the test object, its position relative to the camera when the image was taken, and the physical assumptions about the camera. The test object is sometimes referred to as a calibration target or a calibration tool. Some embodiments of the invention pertain to a calibration target that can be used as part of a system that can automatically perform such a calibration process without manual input from a user.
- In order to better appreciate and understand embodiments of the invention, reference is now made to
FIGS. 1 and 2. FIG. 1 is a simplified front perspective view of a calibration target 100 according to an embodiment of the invention that can be used to automatically calibrate one or more cameras, and FIG. 2 is a simplified rear perspective view of calibration target 100. As shown in FIGS. 1 and 2, calibration target 100 includes a hollow body 102 having multiple planar surfaces 104 joined together to define a perimeter of the hollow body. Each planar surface 104 includes an exterior surface and an interior surface where the sum of the exterior surfaces for all planar surfaces 104 generally defines an exterior surface 106 of body 102 and the sum of the interior surfaces of all planar surfaces 104 generally defines an interior surface 108 of hollow body 102. - Each
planar surface 104 lies between two adjacent planar surfaces. For example, planar surface 104 labeled in FIG. 1, which extends from a bottom 111 of calibration target 100 to a top 113 of the target, is positioned between two additional surfaces 104, which are referred to herein as surfaces 104(l) and 104(r) for reference. Surface 104 is joined to surface 104(l) at a left edge of surface 104 and a right edge of surface 104(l). Surface 104 is also joined to surface 104(r) at a right edge of surface 104 and a left edge of surface 104(r). Similarly, each additional planar surface 104 includes two similar surfaces 104(l) and 104(r) such that any individual planar surface 104 is itself a surface 104(l) to another planar surface 104 and a surface 104(r) to still a different planar surface 104. The set of planar surfaces 104 form a closed shape that enables calibration target 100 to be a free standing device that can be placed on a table or other support surface and oriented in an upright position without requiring tools or additional support. - A plurality of
fiducial markers 120 are provided on both exterior surface 106 and interior surface 108 to provide patterns for recognition by a camera system as explained below. Fiducial markers 120 can be arranged in predetermined locations across some or all of external surface 106 and across some or all of internal surface 108. As shown in FIG. 1, fiducial markers 120 are arranged in a grid pattern on each of exterior surface 106 and interior surface 108 where each grid includes an array of generally square-shaped cells 112. For ease of illustration, only some of fiducial markers 120 are shown in FIG. 1 and no fiducial markers are shown in FIG. 2. Thus, many cells 112 are shown without a fiducial marker. In some embodiments, each and every cell 112 includes a fiducial marker 120. In other embodiments, however, only a subset of cells 112 include fiducial markers. -
FIGS. 3A and 3B, which are left and right side views of calibration target 100, more clearly show that markers 120 of calibration target 100 are arranged in a grid, four markers high, around the entire exterior surface 106 of the calibration target. As shown, each individual fiducial marker 120 is generally square in shape and has a unique pattern that enables any given fiducial marker to be distinguished from other fiducial markers in the set of markers included on target 100. In other embodiments, fiducial markers 120 can be arranged in predetermined patterns other than a grid. Additionally, in some embodiments where cells 112 are arranged in a grid, each individual cell 112 in the grid need not be a square. For example, the grid may include rectangular, hexagonal, octagonal or other appropriately shaped cells 112, and in some embodiments, individual cells in the grid may be sized or shaped differently from other cells in the grid. In additional and/or alternative embodiments, the fiducial markers may have other geometric shapes, and/or vary in other parameters such as color or size. -
Calibration target 100 can also include one or more focus-assist patterns 122 on either or both exterior surface 106 and interior surface 108. Each focus-assist pattern 122 can be positioned in a cell 112 instead of a fiducial marker 120 being positioned in the respective cell(s). When shooting a particular scene there are often multiple cameras pointed towards the scene from multiple angles. Determining accurate focus for each camera, some of which may have a very shallow depth of field, can be very important. Thus, in some embodiments, target 100 includes multiple focus-assist patterns 122 (six are shown in FIGS. 3A and 3B) spaced apart on the target along different planar surfaces 104 such that at least one focus-assist pattern will be within the image capture field of view for each individual camera shooting the scene with calibration target 100 regardless of its position. As shown in FIG. 1, focus-assist pattern 122 is a wheel-and-spoke pattern but other patterns can be used. -
Hollow body 102 includes a window 110 that enables fiducial markers 120 and/or focus-assist patterns 122 on interior surface 108 behind the window to be visible to a particular camera view through the window for calibration purposes when such fiducial markers would otherwise be blocked to the camera by exterior surface 106. The combination of window 110 and fiducial markers 120 and/or focus-assist patterns 122 on interior surface 108 provides additional depth perception to certain cameras positioned around calibration target 100, thus enabling more accurate calibration and focus control. Window 110 may further be advantageous in calibrating a camera initially and then for orienting the camera and determining imaging parameters for the camera to be applied to a virtual camera. -
Window 110 may be either a transparent material, such as cellophane, acrylic or glass, or may be an area void of material. In the particular embodiment of calibration target 100 shown in FIG. 1, window 110 is void of material, is rectangular in shape and is centered between first and second bands 114 and 116 of hollow body 102. Each of bands 114 and 116 completely surrounds an interior space of body 102 in the width and length dimensions, and has a height that is approximately equal to the height of each cell 112. Window 110 has a height that is approximately equal to the height of two cells, but calibration targets according to the present invention are not limited to the precise dimensions of window 110 and/or bands 114, 116 shown in FIG. 1. In other embodiments, window 110 may have different shapes, body 102 may include multiple windows and fewer or more bands, and individual bands, such as bands 114, 116, can have different heights. -
Calibration target 100 may be made from a material having sufficient rigidity, such as plastic, cardboard or metal, to enable the device to maintain its shape and readily stand without exterior support. In one embodiment, calibration tool 100 is approximately the same size as a life-size bust of a human head. This can be useful for calibrating cameras, or determining their imaging parameters, in order to obtain accurate images of a human actor, in which inaccuracies would be quickly apparent to a human viewer. In alternative embodiments, calibration targets according to the present invention may be larger or smaller as needed to accurately calibrate a camera to a given scene. Further, calibration target 100 may include one or more markers that indicate the bottom 111 and/or top 113 of the target. Such markers can be used to ensure that, when target 100 is used to calibrate one or more cameras, it is positioned in a known orientation within the scene, which assists the software program in identifying the various fiducial markers 120 on target 100 and calculating the relationship between a camera and the target. - In some embodiments,
calibration target 100 further includes a spine 130, which is shown in FIG. 2. For ease of illustration, fiducial markers 120 and focus-assist patterns 122 have been intentionally left out of FIG. 2 but can be present in each of the cells 112 as described above with respect to FIG. 1. Spine 130 provides additional support for target 100. Additionally, in some embodiments, spine 130 has angled edges 132, 134 that are sized and shaped to be useable with a standard quick-release clamp on a tripod. This enables target 100 to be easily attached to and detached from a tripod and placed in almost any desired position within a scene. -
Calibration target 100 can be made from a single piece of material or can be made from multiple parts that can be easily assembled and disassembled to facilitate transportation of target 100 from one scene (or movie set) to another. Such an assembly method may also make for efficient application of the fiducial markers on the planar surfaces prior to assembly. For example, body 102 can be made from a flat sheet of plastic having angled grooves cut out from the interior side such that the sheet can be bent into the shape shown in FIGS. 1 and 2 with the two opposing ends of the sheet abutting each other. The ends can be attached using a variety of techniques, and in one embodiment, spine 130 helps secure the ends together. As another example, body 102 can be printed using a 3D printer. In one instance body 102 can be printed as two separate components where a first component includes band 114 and the row of cells 112 directly beneath band 114 and a second component includes band 116 and the row of cells 112 directly above band 116. The two components can be printed with small holes at the top and bottom ends of each planar surface that allow small dowels to be placed in the holes to secure the two components together using a dowel joint. Spine 130 can be printed as a separate component and joined to body 102 to provide additional support and strength. A person of skill in the art will recognize other techniques, including other approaches to 3D printing various components, that can be used to fabricate body 102 and target 100. -
FIG. 4 is a top plan view of calibration target 100. As shown in FIG. 4, target 100 includes 16 separate sides 401-416 of the same length. Sides 401-416 form a closed polygon (a regular hexadecagon) that enables calibration target 100 to be a free standing device so that it can be placed on a table or other support surface in a scene. Each side 401-416 corresponds to one of planar surfaces 104 discussed with respect to FIG. 1, and thus may include one or more fiducial markers on both the interior and exterior surfaces of the side 401-416. The planar nature of sides 401-416 enables software to more readily recognize fiducial markers 120 positioned on each side as the software does not need to correct for curvature or irregularities of a nonplanar surface in performing its recognition routine. Additionally, having sixteen sides 401-416, each in a different plane, ensures that any camera directed towards target 100 will capture multiple sides when imaging the target and thus capture multiple fiducial markers at different depths. - Embodiments of the invention are not limited to any particular shaped
body 102, however, and in other embodiments,body 102 may have different cross-sectional shapes. For example,FIG. 5A depicts and embodiment of the invention where abody 502 includes five sides that form a body having a regular pentagon cross-sectional shape (i.e., all angles between adjacent sides are equal in measure and all sides are equal in length).FIG. 5B depicts and embodiment of the invention where abody 504 includes six equal length sides having equal angles and a cross-sectional shape in the form of a regular hexagon.FIG. 5C depicts and embodiment of the invention where abody 504 includes eight equal length sides having equal angles and a cross-sectional shape in the form of a regular octagon, andFIG. 5D depicts and embodiment of the invention where abody 504 includes ten equal length sides having equal angles and a cross-sectional shape in the form of a regular decagon. In some embodiments of the invention, the body of a calibration target includes more than five sides (planar surfaces 104) aligned in different planes, such as each of the 502, 504, 506 and 508 shown inbodies FIGS. 5A-5D . Such embodiments help ensure that when an individual camera is directed towards the calibration target, the camera will capture multiple angles and multiple fiducial markers at different depths regardless of its position with respect to the target (unless it is directly above and pointed directly down towards target), which in turn, makes the algorithm that calculates camera position and properties described below more accurate. In some embodiments, the interior angle between each adjacent planar side is greater than 90 degrees. For example, the interior angle between each side inbody 502 is 108 degrees; the interior angle between each side inbody 504 is 120 degrees, the interior angle between each side inbody 506 is 135 degrees, the interior angle between each side inbody 508 is 144 degrees; and the interior angle between each side inbody 102 is 157.5 degrees. 
- While each of bodies 502, 504, 506 and 508 has a regular polygon cross-section, in other embodiments of the invention the body may have an irregular closed polygon shape. In still other embodiments, the body of a calibration target according to the invention can have a curved shape, such as the circular shape of body 510 shown in FIG. 5E, an oval shape, or another curved shape. In embodiments where the body of a calibration target includes fiducial markers on one or more curved surfaces, the software algorithm that identifies the fiducial markers can be adapted to the particular curvature of the body to facilitate pattern recognition. Additionally, some embodiments of the invention may include one or more wings or extensions that extend from the body, such as the four extensions 520 shown extending from octagonal-shaped body 506 in FIG. 5F. Each extension 520 may include additional fiducial markers 120 and/or focus-assist patterns 122. -
FIG. 6 is a simplified front perspective view of a calibration target 600 according to another embodiment of the invention. Calibration target 600 includes a body 602 and is essentially two separate calibration targets 100 stacked upon each other. In other embodiments, three, four or more calibration targets 100 can be stacked together. - Similar to target 100 and
body 102, body 602 includes a predetermined pattern of multiple fiducial markers 120 and one or more focus-assist patterns 122 arranged in a grid of cells 112 on both the interior and exterior surfaces of the target. Instead of a single window 110, target 600 includes two separate windows 610a and 610b. Window 610a is bordered by a band 114 on the top and a band 620 below, each of which completely surrounds an interior space of body 602. Similarly, window 610b is bordered by a band 116 below and by band 620 on top. In the embodiment shown in FIG. 6, bands 114 and 116 each have a height approximately equal to the height of each cell 112, while windows 610a and 610b and band 620 each have a height that is approximately equal to the height of two cells. As shown in FIG. 6, body 602 has a hexadecagon cross-sectional shape. In other embodiments, body 602 and target 600 can have any of the shapes and/or features described with respect to other embodiments of calibration targets according to the present invention. - Reference is now made to
FIG. 7A, which shows an embodiment of a single fiducial marker 120 according to an embodiment of the invention. As shown in FIG. 7A, fiducial marker 120 is square in shape and is configured to have a 10-by-10 grid of smaller squares, which can be differently colored. In one embodiment, an outer peripheral band 702 of squares is colored black or a dark color, and a second inner band of squares 704, interior to but touching the outermost band, has a light color that contrasts with the black or dark outer band. In one particular embodiment, band 702 is black and band 704 is white. Interior to the two bands 702 and 704 is a 6-by-6 grid 706 of smaller squares in which a unique pattern of dark (or black) and light (or white) smaller squares is created. In some embodiments, some individual fiducial markers 120 included on a particular calibration target 100 have an outer band 702 that is dark and an inner band 704 that is light, while other individual fiducial markers 120 have an outer band 702 that is light and an inner band 704 that is dark. This additional level of variation can help further distinguish individual markers from each other. - In some embodiments of the invention, the pattern formed in the 6-by-6 grid of each and every fiducial marker on
calibration tool 100 or 600 is unique. This enables software to distinguish each individual marker from the others and identify the exact location of a particular fiducial marker visible to a given camera. FIGS. 7B and 7C depict two examples of fiducial markers 720a and 720b, which can correspond to fiducial markers 120. Fiducial markers 720a and 720b differ from each other in the pattern of black and white squares within grid 706 of each marker, which allows 2^36 distinct patterns of black and white (binary value) subsquares within a fiducial marker with respect to a fixed orientation of the square. However, to allow for rotational equivalences of the square, dividing this number by 4 produces a lower bound on the number of rotationally distinct markers. - In the embodiment of
calibration target 100 shown in FIG. 1, there are sixteen planar surfaces 104, and the target has a height that allows for four fiducial markers from top to bottom, allowing for 8 fiducial markers on each planar surface (4 markers on the exterior surface and 4 markers on the interior surface) that does not include a portion of window 110. Since window 110 covers half the height of target 100 and half the width or length, eight of the planar surfaces 104 allow for 4 fiducial markers each. Thus, 8×8 + 8×4 = 96 fiducial markers would be needed, not accounting for the inclusion of focus-assist patterns. Double-stacked target 600 shown in FIG. 6 would require at most two times the number of fiducial markers as target 100 and would thus require 192 markers. Accordingly, the 6-by-6 pattern shown in FIG. 7A allows for significantly more than enough possible fiducial markers to be able to use all distinct fiducial markers on the apparatus. Further, various selection processes may be used to choose a set of fiducial markers for the apparatus such that the fiducial markers in the set are not only distinct, but also such that any two differ in a large number of the smaller subsquares. - In other embodiments, the fiducial markers may have rectangular, triangular, circular or other shapes, and/or may include a pattern made up of components other than a grid of similarly sized squares. For example, in some embodiments the fiducial markers may include a plurality of hexagons, pentagons, rectangles, triangles, dots or other shapes within an outer border of the fiducial.
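The marker layout and the selection process described above can be sketched as follows. This is one possible approach under stated assumptions, not an algorithm the patent prescribes: each 10-by-10 marker is assembled from a dark outer band 702, a light inner band 704, and a 6-by-6 payload (grid 706), and a candidate payload is kept only if it differs from every previously chosen payload in at least a minimum number of the 36 subsquares (the threshold of 10 is illustrative):

```python
import numpy as np

def build_marker(payload):
    """Assemble a 10x10 marker (0 = dark, 1 = light) from a 6x6 payload:
    dark outer band 702, light inner band 704, payload grid 706."""
    marker = np.zeros((10, 10), dtype=int)  # outer band 702: dark
    marker[1:9, 1:9] = 1                    # inner band 704: light
    marker[2:8, 2:8] = payload              # unique 6x6 pattern (grid 706)
    return marker

def select_payloads(candidates, count, min_hamming=10):
    """Greedily pick payloads that pairwise differ in at least
    `min_hamming` of the 36 subsquares (illustrative threshold)."""
    chosen = []
    for c in candidates:
        if all(np.count_nonzero(c != m) >= min_hamming for m in chosen):
            chosen.append(c)
            if len(chosen) == count:
                break
    return chosen

rng = np.random.default_rng(0)
pool = [rng.integers(0, 2, size=(6, 6)) for _ in range(5000)]
payloads = select_payloads(pool, count=96)  # 96 markers suffice for target 100
markers = [build_marker(p) for p in payloads]
print(len(markers), markers[0].shape)
```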
- Additional and/or alternative embodiments may include any of the following features. The fiducial markers may contain information within the pattern of dark and light subsquares. The particular sequence of fiducial markers around the border area of a quadrant of a planar surface may also contain information to assist in identification of the fiducial markers and/or in the calibration of the camera. Particular fiducials that are known to be easily recognized by a camera may be positioned at particular locations on the calibration target to aid in identifying which surface is being viewed.
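The identification these features assist can be illustrated with a simple decoder (a hedged sketch, not necessarily the patent's recognition algorithm): an observed 6-by-6 payload is matched against the library of known payloads by minimizing the Hamming distance over the four 90-degree rotations, which also recovers the marker's orientation:

```python
import numpy as np

def identify_marker(observed, library):
    """Match an observed 6x6 payload against a library of known payloads.

    Returns (library_index, rotation_k, hamming_distance) of the best match,
    where rotation_k counts 90-degree rotations applied to the observation.
    """
    best = (None, None, observed.size + 1)
    for idx, known in enumerate(library):
        for k in range(4):  # try all four rotations of the observed payload
            d = int(np.count_nonzero(np.rot90(observed, k) != known))
            if d < best[2]:
                best = (idx, k, d)
    return best

lib = [np.zeros((6, 6), dtype=int), np.eye(6, dtype=int)]
seen = np.rot90(np.eye(6, dtype=int))  # library marker 1, rotated 90 degrees
idx, k, dist = identify_marker(seen, lib)
print(idx, dist)  # → 1 0
```

Choosing markers with large pairwise Hamming distance, as discussed earlier, is what lets a decoder like this tolerate a few misread subsquares without confusing two markers.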
-
Calibration target 100 or 600 and their equivalents may be used to calibrate a camera and/or to determine imaging properties of a camera from images taken by that camera. Methods according to the invention use the information available via a captured image of calibration target 100 to increase automation of calibration and other operations. The size of the apparatus, including the boundary area surrounding a window, and the sizes of the fiducial markers and their locations on the apparatus may be recorded or known before the calibration target is used. -
FIG. 8 is a simplified schematic diagram showing a scene 800 in which a calibration target 810 according to the invention can be used for calibrating one or more cameras according to an embodiment of the invention. As shown in FIG. 8, three physical cameras 802a, 802b and 802c, each of which can be, for example, a digital camera, a movie camera, or a video camera, are in use to "shoot" scene 800, which may include one or more actors 806 and/or props 808. Each of the cameras 802a, 802b and 802c is directed to take images of scene 800 while calibration target 810 is positioned in the scene so that it is within the field of view of each camera. In embodiments where target 810 has more than five sides aligned in different planes as described above, each camera 802a, 802b and 802c will capture multiple sides and angles of target 810 whenever the camera is positioned to capture at least some of the exterior and/or interior surface of the target, regardless of the angle between the camera and the target. - Each
camera 802a, 802b, 802c is connected to a processor 804, such as the computer processing system described in FIG. 10. As each camera captures images of the scene, including images of calibration target 810, the captured images are sent to processor 804, which stores a computer graphics model of calibration target 810, including the positions and unique patterns of the fiducial markers on the target. Calibration target 810 includes unique fiducial markers such that, for any given image, there is only one position, rotation, distortion value, and focal length for the camera of interest. Thus, without receiving input from a user, processor 804 can compare information from the captured images of each of cameras 802a, 802b, 802c to the computer graphics model to identify fiducial markers on calibration target 810 and, based on the comparison, determine, among other parameters, the position, rotation, distortion and focal length of each camera with respect to calibration target 810 and the scene, as described below with respect to FIG. 9A. Processor 804 can also be used to place a virtual camera in a computer-generated 3D scene, including setting the virtual camera's position, rotation, distortion and focal length according to the intrinsic and imaging parameters of cameras 802a, 802b, 802c, as described with respect to FIG. 9B. -
FIG. 9A shows a flow chart for a method 900 for calibrating a camera using the calibration target 100 of FIG. 1, or an alternate embodiment thereof. The calibration target 100 may be placed in a physical set depicting a scene. Information regarding the location of the calibration target 100 in the scene, such as which surface of the apparatus is facing the camera, the distance from the camera to a particular point on the apparatus, or other data, may optionally be recorded to assist in the camera calibration. Once the calibration target 100 is positioned in the scene, the camera is directed so as to include the calibration target 100 in an image, and at least one image that includes the calibration target 100 is obtained using the camera. In some embodiments the images are digitally generated, such as by a camera using a charge-coupled device (CCD). In other embodiments the images may be captured on film and then scanned into digital form. In digital form, the pixel coordinates of an image point can be determined. The image is then received at a computational device or system capable of analyzing images and image data (block 905). - Each image captured for calibrating a camera may include a 2D array of pixels, and may be enumerated using pixel coordinates. Pixel coordinates may be normalized or converted to homogeneous coordinate form. An identification procedure is performed on an image in order to identify parts of the image that correspond to particular fiducial markers (block 910). That is, the parts or segments of the image are identified for which a fiducial marker on the calibration target was the source in the physical scene. Such a procedure may involve one or more pattern recognition algorithms. When the embodiment of
FIG. 1 is used, the fact that the fiducial markers are square and unique may be used as part of the pattern recognition algorithm. In the case that the fiducial markers are as disclosed in relation to FIG. 7A, the pattern recognition algorithm may use the known information about the bands to assist in identifying the images of fiducial markers within the captured image. - The pattern recognition algorithm may determine a plurality of parts of the image corresponding to fiducial markers, but with varying levels of certainty that a fiducial marker is the source. The identification procedure may choose a sufficient number of the image parts having sufficient certainty of corresponding to a fiducial marker. In one embodiment, only fiducial markers that are fully visible in an image are chosen for analysis by the pattern recognition algorithm. Thus, for example, a fiducial marker that is only partially visible through a
window 110 may be excluded from the pattern recognition algorithm in order to increase the effectiveness of the algorithm. Also, when target 100 includes one or more focus-assist patterns 122 at known locations, the pattern recognition algorithm may identify the locations of those focus-assist patterns and use that information in identifying fiducial markers and/or in calculating the position and orientation of target 100 in the set. - Once a part of the image has been identified as the image of a fiducial marker on the calibration target, in the case that each fiducial marker of the calibration target is unique, pattern matching algorithms or pattern recognition operations that compare the image to a computer model of the calibration target, including its fiducial markers, may be used to uniquely determine which of the fiducial markers was the source for that part of the image (block 915). Nonlimiting examples of such pattern matching algorithms include the discrete Fourier transform, the Hough transform and wavelet transforms. In some embodiments of the
calibration target 100, the fiducial markers may be chosen to have easily recognized and distinguished transforms. Known relative locations of the fiducial markers on the calibration target may be used as part of the pattern recognition operations. - Once a part of the image has been identified as a fiducial marker, and the identification of the particular fiducial marker has been determined, one or more specific features of the fiducial marker may be located in the image. In embodiments that use the calibration target of
FIG. 1 with fiducial markers as in FIG. 7A, the specific features may be one or more of the corners of the fiducial marker. - In some embodiments of the method, subimages of multiple fiducial markers are identified within the image, the corresponding unique fiducial markers are determined, and specific features of each fiducial marker are selected. As an illustrative example, for the
calibration target 100 shown in FIG. 1 with fiducial markers as shown in FIG. 7A and described above, multiple components of the image corresponding to fiducial markers may be identified, the respective source fiducial markers uniquely determined, and the corners of those fiducial markers selected as source points. A set of 2D image coordinates, such as either pixel coordinates or homogeneous coordinates, of the source points is then obtained. - Using the set of 2D image coordinates of the source points, reprojection of the set of 2D image coordinates of the source points is performed to determine a corresponding set of estimated 3D coordinates for the locations in the 3D scene of the source points of the selected features (block 920). The known sizes and orientations of the fiducial markers, both relative to each other and relative to the
calibration target 100 as a whole, may be used to determine an estimated configuration of the calibration target 100 in the 3D scene. Other information may also be used, such as the overall outline and dimensions of the calibration target 100, and information independently determined regarding the image, for example a known distance between the calibration target 100 and the camera when the image was generated. - Using the set of estimated 3D coordinates and the estimated configuration of the
calibration target 100, an error minimization operation is performed to determine an estimate for the intrinsic parameters of the camera (block 925). In one embodiment, a known relationship connecting the world coordinates of a point in the 3D scene and the corresponding point in the 2D pixel coordinate space is: -
z_c [u, v, 1]^T = A [R T] [x_w, y_w, z_w, 1]^T   [1] - In this equation, [u, v, 1]^T denotes the pixel coordinates on the image plane of an image point using homogeneous coordinates, [x_w, y_w, z_w, 1]^T are the 3D world coordinates of the original source point in homogeneous coordinates, and R and T represent extrinsic parameters which transform a point's 3D world coordinates to the camera's 3D coordinates, with R being a rotation matrix. The parameter z_c is a constant of proportionality. The matrix A comprises the intrinsic parameters of the camera. It is given by:
A = | a_x  γ    u_0 |
    | 0    a_y  v_0 |
    | 0    0    1   |
- Here, a_x and a_y are related to the camera's focal length and to scale factors that relate distance to pixels. The intrinsic parameter γ is a skew coefficient. The values u_0 and v_0 represent the principal point.
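Equation [1] can be checked numerically. The sketch below uses illustrative parameter values (not values from the patent): it builds the intrinsic matrix A, applies the extrinsic transform [R T] to a homogeneous world point, and recovers pixel coordinates by dividing out z_c:

```python
import numpy as np

# Illustrative intrinsic parameters (assumed values, not from the patent).
ax, ay, gamma, u0, v0 = 800.0, 800.0, 0.0, 320.0, 240.0
A = np.array([[ax, gamma, u0],
              [0.0, ay, v0],
              [0.0, 0.0, 1.0]])

R = np.eye(3)                    # camera axes aligned with world axes
T = np.array([0.0, 0.0, 5.0])    # world origin 5 units in front of the camera
RT = np.hstack([R, T[:, None]])  # 3x4 extrinsic matrix [R T]

Xw = np.array([0.5, -0.25, 0.0, 1.0])  # world point in homogeneous coordinates
zc_uv = A @ RT @ Xw                    # z_c * [u, v, 1]^T per equation [1]
u, v = zc_uv[:2] / zc_uv[2]            # divide out the proportionality z_c
print(u, v)  # → 400.0 200.0
```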
- Nonlimiting examples of error minimization operations include gradient descent, the Levenberg-Marquardt algorithm, and the Gauss-Newton algorithm. The error minimization operation may use the known relative positions of the uniquely determined fiducial markers as criteria for error minimization. In additional and/or alternative embodiments, the
method 900 may be iterated using an initial estimate of the intrinsic parameters to improve the reprojection estimates and the estimation of the intrinsic parameters. In additional and/or alternative embodiments, the method 900 may be applied to multiple images to obtain improved estimates. - The exemplary steps shown in
method 900 are capable of being performed within a computing system, without a user, once a digital image is obtained by the system. A user is not needed to view the image on a display and enter identification of particular 2D coordinates and corresponding 3D locations. In various embodiments the uniqueness of the fiducial markers and the pattern recognition algorithms, together with error minimization algorithms, allow a computing system to proceed without needing user input. However, it will be apparent to one of skill in the art that the method 900 may be implemented in conjunction with user input at any stage to improve overall performance. - The methods just described refer to only one image, but it is clear to a person of skill in the art that using a sequence of different images of the calibration target and proceeding as above to generate successive estimates for the parameters of the camera would allow better refinement of the values for the camera parameters. In some embodiments, different images may be used which show the calibration target from different orientations and/or from different distances. In one embodiment, the successive estimates for the parameters of the camera may be weighted and averaged to obtain a single estimate for the camera's parameters.
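As a toy illustration of the error minimization step, consider recovering a single intrinsic parameter. The sketch below makes simplifying assumptions (a pinhole camera with known principal point and no skew or distortion, which is far simpler than the patent's full parameterization): it searches for the focal length that minimizes the reprojection error over synthetic 3D-to-2D correspondences:

```python
import numpy as np

def reprojection_error(f, world_pts, pixel_pts, u0=320.0, v0=240.0):
    """Sum of squared pixel errors for a simple pinhole model with
    focal length f (camera at the origin, looking down +z; illustrative)."""
    err = 0.0
    for (x, y, z), (u, v) in zip(world_pts, pixel_pts):
        err += (f * x / z + u0 - u) ** 2 + (f * y / z + v0 - v) ** 2
    return err

# Synthetic correspondences generated with a "true" focal length of 850.
true_f = 850.0
world = [(0.5, 0.2, 4.0), (-0.3, 0.4, 5.0), (0.1, -0.6, 6.0)]
pixels = [(true_f * x / z + 320.0, true_f * y / z + 240.0) for x, y, z in world]

# Coarse grid search minimizing the reprojection error; a real system would
# use gradient descent, Gauss-Newton, or Levenberg-Marquardt as noted above.
best_f = min(np.arange(500.0, 1200.0, 0.5),
             key=lambda f: reprojection_error(f, world, pixels))
print(best_f)  # → 850.0
```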
- Once a camera's intrinsic parameters are known, such as by the calibration method just disclosed, a
calibration target 100 may also be used to determine camera imaging parameters used in the capture of subsequent images. The imaging parameters may then be used by a virtual camera within a computing system. One embodiment according to the invention for placing a virtual camera within a computer-generated graphics scene is set forth in FIG. 9B as method 950. As shown in FIG. 9B, at least one digital image of a 3D scene containing the calibration target 100 is obtained from a camera and the camera's intrinsic parameters are determined from the at least one image (block 955). The procedure for determining the camera's intrinsic parameters includes, as described above with respect to FIG. 9A, locating, by a computing system, components of the digital image corresponding to a fiducial marker; uniquely identifying the fiducial marker from among all the fiducial markers known to be on the calibration target 100; identifying 2D image components of parts or points of the identified fiducial markers; reprojecting the 2D parts into a representation of the 3D scene; and obtaining estimates for at least one of the position, the location, and the orientation of the calibration target 100 in the representation of the scene. - In an exemplary embodiment,
method 950 further includes determining an estimated distance from the camera to the calibration target 100 (block 960). The overall dimensions of the calibration target 100, in addition to the estimated 3D locations of the source points on the identified fiducial markers, can be used in the determination. Well-known operations such as triangulation based on known geometric values of the calibration target 100 and its fiducial markers may be used. - In an exemplary embodiment, the method further includes determining the field of view of the camera that produced the received 2D image (block 965). In one embodiment an estimated distance between the camera and the
calibration target 100 may be obtained, as described, and used in conjunction with an observed height or width of the calibration target 100 or its fiducial markers within the 2D image to determine a vertical, horizontal and/or diagonal viewing angle of the camera. - The focal length of the camera may also be calculated (block 970). As described previously, the intrinsic parameters of the camera contain information from which the focal length may be calculated. In additional and/or alternative embodiments, the focal length may be obtained using the field of view and the relationship between the field of view and the focal length.
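The field-of-view and focal-length relationships just described can be sketched for a simple pinhole model (illustrative values; a real implementation would also account for distortion and the full intrinsic matrix). The viewing angle follows from the target's known size, its apparent size in pixels, and the estimated distance; the focal length then follows from f = (w/2) / tan(FOV/2):

```python
import math

def field_of_view_deg(observed_px, image_px, distance, true_size):
    """Estimate a viewing angle from an object of known size.

    observed_px: apparent size of the calibration target in pixels.
    image_px:    full image extent along the same axis, in pixels.
    distance:    estimated camera-to-target distance (world units).
    true_size:   known physical size of the target (world units).
    """
    world_per_px = true_size / observed_px       # world units per pixel
    image_world = world_per_px * image_px        # image extent at target depth
    return math.degrees(2.0 * math.atan(image_world / (2.0 * distance)))

def focal_length_px(image_px, fov_deg):
    """Focal length in pixels from the field of view: f = (w/2) / tan(FOV/2)."""
    return (image_px / 2.0) / math.tan(math.radians(fov_deg) / 2.0)

fov = field_of_view_deg(observed_px=200, image_px=640, distance=5.0, true_size=1.0)
print(round(focal_length_px(640, fov), 1))  # → 1000.0
```

Note the consistency check built into this toy case: a 1-unit target at 5 units spanning 200 pixels implies f = 5 × 200 / 1 = 1000 pixels, which the field-of-view route recovers exactly.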
- The imaging parameters of the camera obtained from a digital image of a 3D scene may be used to implement, in a computer system, a virtual camera that replicates the performance of the physical camera that obtained the original image or images (block 975). The virtual camera may be used to create animation scenes based on the original physical 3D scene.
-
FIG. 10 is a schematic diagram of a generic computer system 1000. The system 1000 can be used for the operations described in association with any of the computer-implemented methods described previously, according to one implementation. The system 1000 includes a processor 1010, a memory 1020, a storage device 1030, and an input/output device 1040. Each of the components 1010, 1020, 1030, and 1040 is interconnected using a system bus 1050. The processor 1010 is capable of processing instructions for execution within the system 1000. In one implementation, the processor 1010 is a single-threaded processor. In another implementation, the processor 1010 is a multi-threaded processor. The processor 1010 is capable of processing instructions stored in the memory 1020 or on the storage device 1030 to display graphical information for a user interface on the input/output device 1040. - The
memory 1020 stores information within the system 1000. In one implementation, the memory 1020 is a computer-readable medium. In one implementation, the memory 1020 is a volatile memory unit. In another implementation, the memory 1020 is a non-volatile memory unit. - The
storage device 1030 is capable of providing mass storage for the system 1000. In one implementation, the storage device 1030 is a computer-readable medium. In various different implementations, the storage device 1030 may be a floppy disk device, a hard disk device, an optical disk device, or a tape device. - The input/
output device 1040 provides input/output operations for the system 1000. In one implementation, the input/output device 1040 includes a keyboard and/or pointing device. In another implementation, the input/output device 1040 includes a display unit for displaying graphical user interfaces. - The features described can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. The apparatus can be implemented in a computer program product tangibly embodied in an information carrier, e.g., in a machine-readable storage device or in a propagated signal, for execution by a programmable processor; and method steps can be performed by a programmable processor executing a program of instructions to perform functions of the described implementations by operating on input data and generating output. The described features can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device. A computer program is a set of instructions that can be used, directly or indirectly, in a computer to perform a certain activity or bring about a certain result. A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
- Suitable processors for the execution of a program of instructions include, by way of example, both general and special purpose microprocessors, and the sole processor or one of multiple processors of any kind of computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memories for storing instructions and data. Generally, a computer will also include, or be operatively coupled to communicate with, one or more mass storage devices for storing data files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks. Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
- To provide for interaction with a user, the features can be implemented on a computer having a display device such as a CRT (cathode ray tube) or LCD (liquid crystal display) monitor for displaying information to the user and a keyboard and a pointing device such as a mouse or a trackball by which the user can provide input to the computer.
- The features can be implemented in a computer system that includes a back-end component, such as a data server, or that includes a middleware component, such as an application server or an Internet server, or that includes a front-end component, such as a client computer having a graphical user interface or an Internet browser, or any combination of them. The components of the system can be connected by any form or medium of digital data communication such as a communication network. Examples of communication networks include, e.g., a LAN, a WAN, and the computers and networks forming the Internet.
- The computer system can include clients and servers. A client and server are generally remote from each other and typically interact through a network, such as the described one. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- A number of embodiments have been described. As will be understood by those skilled in the art, the present invention may be embodied in other specific forms without departing from the essential characteristics thereof. For example, while embodiments of the calibration target according to the present invention were discussed above with respect to
calibration target 100 having a particular shape, the invention is not limited to any particularly shaped calibration target and calibration targets having other regular or irregular polygonal cross-sectional shapes are possible. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific embodiments of the invention described herein. Such equivalents are intended to be encompassed by the following claims.
Claims (20)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/502,647 US9307232B1 (en) | 2014-04-08 | 2014-09-30 | Calibration target for video processing |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/248,124 US9641830B2 (en) | 2014-04-08 | 2014-04-08 | Automated camera calibration methods and systems |
| US14/502,647 US9307232B1 (en) | 2014-04-08 | 2014-09-30 | Calibration target for video processing |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20160094840A1 true US20160094840A1 (en) | 2016-03-31 |
| US9307232B1 US9307232B1 (en) | 2016-04-05 |
Family
ID=54210891
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/248,124 Expired - Fee Related US9641830B2 (en) | 2014-04-08 | 2014-04-08 | Automated camera calibration methods and systems |
| US14/502,647 Active US9307232B1 (en) | 2014-04-08 | 2014-09-30 | Calibration target for video processing |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/248,124 Expired - Fee Related US9641830B2 (en) | 2014-04-08 | 2014-04-08 | Automated camera calibration methods and systems |
Country Status (1)
| Country | Link |
|---|---|
| US (2) | US9641830B2 (en) |
| DE102017113917A1 (en) * | 2017-06-23 | 2018-12-27 | Krones Ag | Calibration body for calibrating image recording devices and in particular cameras |
| CN107945235B (en) * | 2017-10-17 | 2022-02-01 | 许昌学院 | Geometric positioning simulation method for high-orbit large-area array stationary satellite image |
| US11364004B2 (en) * | 2018-02-08 | 2022-06-21 | Covidien Lp | System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target |
| US10942045B1 (en) * | 2018-04-03 | 2021-03-09 | Waymo Llc | Portable sensor calibration target for autonomous vehicle |
| CN110784693A (en) * | 2018-07-31 | 2020-02-11 | 中强光电股份有限公司 | Projector calibration method and projection system using this method |
| CN110969662B (en) * | 2018-09-28 | 2023-09-26 | 杭州海康威视数字技术股份有限公司 | Fisheye camera internal parameter calibration method, device, calibration device controller and system |
| CN109544635B (en) * | 2018-10-10 | 2020-11-13 | 长安大学 | An automatic camera calibration method based on enumeration and heuristic |
| CN111580117A (en) * | 2019-02-19 | 2020-08-25 | 光宝电子(广州)有限公司 | Control method of a time-of-flight ranging sensing system |
| GB2581792B (en) * | 2019-02-25 | 2023-01-04 | Mo Sys Engineering Ltd | Lens calibration system |
| JP2020154208A (en) * | 2019-03-22 | 2020-09-24 | 本田技研工業株式会社 | Camera focus adjustment jig and camera focus adjustment method |
| USD910738S1 (en) * | 2019-04-02 | 2021-02-16 | Lucasfilm Entertainment Company Ltd. LLC | Cornergrid system |
| US11221631B2 (en) | 2019-04-24 | 2022-01-11 | Innovation First, Inc. | Performance arena for robots with position location system |
| CA3046609C (en) * | 2019-06-14 | 2025-12-09 | Hinge Health, Inc. | Method and system for extrinsic camera calibration |
| US11423573B2 (en) * | 2020-01-22 | 2022-08-23 | Uatc, Llc | System and methods for calibrating cameras with a fixed focal point |
| WO2021236107A1 (en) * | 2020-05-22 | 2021-11-25 | Purdue Research Foundation | Fiducial patterns |
| USD976992S1 (en) * | 2020-05-22 | 2023-01-31 | Lucasfilm Entertainment Company Ltd. | Camera calibration tool |
| DE102021201172A1 (en) * | 2021-02-09 | 2022-08-11 | Robert Bosch Gesellschaft mit beschränkter Haftung | Calibration body device, calibration body system and method for calibrating a camera system, a depth sensor and/or a radar system using the calibration body device or the calibration body system |
| WO2023244937A1 (en) * | 2022-06-14 | 2023-12-21 | Argo AI, LLC | Systems and methods for autonomous vehicle sensor calibration and validation |
| US12400365B2 (en) | 2022-06-14 | 2025-08-26 | Volkswagen Group of America Investments, LLC | Systems and methods for autonomous vehicle sensor calibration and validation |
| US12211161B2 (en) | 2022-06-24 | 2025-01-28 | Lowe's Companies, Inc. | Reset modeling based on reset and object properties |
| US12189915B2 (en) | 2022-06-24 | 2025-01-07 | Lowe's Companies, Inc. | Simulated environment for presenting virtual objects and virtual resets |
| US20240233317A9 (en) * | 2022-10-25 | 2024-07-11 | Htc Corporation | Luminary measurement system and method |
Family Cites Families (26)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4241349A (en) | 1979-03-09 | 1980-12-23 | Davis Instruments Corporation | Apparatus for disposing corner cube reflector for detection |
| USD314736S (en) | 1989-04-07 | 1991-02-19 | Ring Guy L | Bicycle pump and bottle bracket |
| US6377300B1 (en) | 1998-04-14 | 2002-04-23 | Mcdonnell Douglas Corporation | Compact flat-field calibration apparatus |
| GB0205484D0 (en) | 2002-03-08 | 2002-04-24 | Bae Systems Plc | Improvements in or relating to the calibration of infra red cameras |
| US20040080447A1 (en) * | 2002-10-17 | 2004-04-29 | Bas Christophe F. | Miniature omni-directional corner reflector |
| US20040179098A1 (en) | 2003-02-25 | 2004-09-16 | Haehn Craig S. | Image reversing for infrared camera |
| KR100517889B1 (en) | 2003-05-09 | 2005-09-30 | 주라형 | Phantom for accuracy evaluation of image registration |
| US7071966B2 (en) | 2003-06-13 | 2006-07-04 | Benq Corporation | Method of aligning lens and sensor of camera |
| US7152984B1 (en) | 2003-08-13 | 2006-12-26 | Microfab Technologies Inc. | Cat's eye retro-reflector array coding device and method of fabrication |
| JP2005303524A (en) | 2004-04-08 | 2005-10-27 | Olympus Corp | Camera for calibration and calibration system |
| JP4496354B2 (en) | 2004-06-18 | 2010-07-07 | 独立行政法人 宇宙航空研究開発機構 | Transmission type calibration equipment for camera calibration and its calibration method |
| DE102005047200B4 (en) | 2005-10-01 | 2021-05-06 | Carl Zeiss Microscopy Gmbh | Method for correcting a control of an optical scanner in a device for scanning imaging of a sample and device for generating an image of a sample by scanning the sample |
| GB0608841D0 (en) * | 2006-05-04 | 2006-06-14 | Isis Innovation | Scanner system and method for scanning |
| JP4757142B2 (en) * | 2006-08-10 | 2011-08-24 | キヤノン株式会社 | Imaging environment calibration method and information processing apparatus |
| US7907271B2 (en) | 2007-06-15 | 2011-03-15 | Historx, Inc. | Method and system for standardizing microscope instruments |
| US7679046B1 (en) | 2007-10-08 | 2010-03-16 | Flir Systems, Inc. | Infrared camera calibration systems and methods |
| CN101419705B (en) | 2007-10-24 | 2011-01-05 | 华为终端有限公司 | Video camera demarcating method and device |
| US8265376B2 (en) * | 2008-07-21 | 2012-09-11 | Cognitens Ltd. | Method and system for providing a digital model of an object |
| EP2161597B1 (en) * | 2008-09-03 | 2014-06-04 | Brainlab AG | Image-assisted operation system |
| US9185401B2 (en) | 2011-07-15 | 2015-11-10 | Electronics And Telecommunications Research Institute | Method and apparatus for camera network calibration with small calibration pattern |
| US20130063558A1 (en) | 2011-09-14 | 2013-03-14 | Motion Analysis Corporation | Systems and Methods for Incorporating Two Dimensional Images Captured by a Moving Studio Camera with Actively Controlled Optics into a Virtual Three Dimensional Coordinate System |
| US9498182B2 (en) * | 2012-05-22 | 2016-11-22 | Covidien Lp | Systems and methods for planning and navigation |
| US8907290B2 (en) | 2012-06-08 | 2014-12-09 | General Electric Company | Methods and systems for gain calibration of gamma ray detectors |
| US8937682B2 (en) | 2012-07-02 | 2015-01-20 | Axis Ab | Focusing device |
| USD704900S1 (en) | 2013-10-08 | 2014-05-13 | H3R Aviation, Inc. | Fire extinguisher bracket and mount assembly |
| US9307231B2 (en) * | 2014-04-08 | 2016-04-05 | Lucasfilm Entertainment Company Ltd. | Calibration target for video processing |
- 2014-04-08: US application US14/248,124, granted as patent US9641830B2; status: not active (Expired - Fee Related)
- 2014-09-30: US application US14/502,647, granted as patent US9307232B1; status: active
Cited By (27)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10084960B2 (en) | 2014-10-10 | 2018-09-25 | Iec Infrared Systems, Llc | Panoramic view imaging system with drone integration |
| US20160104285A1 (en) * | 2014-10-10 | 2016-04-14 | IEC Infrared Systems LLC | Calibrating Panoramic Imaging System In Multiple Dimensions |
| US9876954B2 (en) * | 2014-10-10 | 2018-01-23 | Iec Infrared Systems, Llc | Calibrating panoramic imaging system in multiple dimensions |
| US10033924B2 (en) | 2014-10-10 | 2018-07-24 | Iec Infrared Systems, Llc | Panoramic view imaging system |
| US10367996B2 (en) | 2014-10-10 | 2019-07-30 | Iec Infrared Systems, Llc | Calibrating panoramic imaging system in multiple dimensions |
| US20160189358A1 (en) * | 2014-12-29 | 2016-06-30 | Dassault Systemes | Method for calibrating a depth camera |
| US10070121B2 (en) * | 2014-12-29 | 2018-09-04 | Dassault Systemes | Method for calibrating a depth camera |
| US10574971B2 (en) * | 2015-09-25 | 2020-02-25 | Olympus Corporation | Image calibration inspection tool and endoscope system |
| US20180007346A1 (en) * | 2015-09-25 | 2018-01-04 | Olympus Corporation | Image calibration inspection tool and endoscope system |
| WO2018145025A1 (en) * | 2017-02-03 | 2018-08-09 | Abb Schweiz Ag | Calibration article for a 3d vision robotic system |
| US10661442B2 (en) | 2017-02-03 | 2020-05-26 | Abb Schweiz Ag | Calibration article for a 3D vision robotic system |
| US20200400959A1 (en) * | 2017-02-14 | 2020-12-24 | Securiport Llc | Augmented reality monitoring of border control systems |
| JP2020535511A (en) * | 2017-09-29 | 2020-12-03 | ウェイモ エルエルシー | Targets, methods, and systems for camera calibration |
| US10930014B2 (en) | 2017-09-29 | 2021-02-23 | Waymo Llc | Target, method, and system for camera calibration |
| US11657536B2 (en) | 2017-09-29 | 2023-05-23 | Waymo Llc | Target, method, and system for camera calibration |
| WO2019067283A1 (en) * | 2017-09-29 | 2019-04-04 | Waymo Llc | Target, method, and system for camera calibration |
| US10432912B2 (en) | 2017-09-29 | 2019-10-01 | Waymo Llc | Target, method, and system for camera calibration |
| US10477186B2 (en) * | 2018-01-17 | 2019-11-12 | Nextvr Inc. | Methods and apparatus for calibrating and/or adjusting the arrangement of cameras in a camera pair |
| US20210366124A1 (en) * | 2019-01-15 | 2021-11-25 | Nvidia Corporation | Graphical fiducial marker identification |
| US11113819B2 (en) * | 2019-01-15 | 2021-09-07 | Nvidia Corporation | Graphical fiducial marker identification suitable for augmented reality, virtual reality, and robotics |
| US20200226762A1 (en) * | 2019-01-15 | 2020-07-16 | Nvidia Corporation | Graphical fiducial marker identification suitable for augmented reality, virtual reality, and robotics |
| US12322114B2 (en) * | 2019-01-15 | 2025-06-03 | Nvidia Corporation | Graphical fiducial marker identification |
| US10965935B2 (en) | 2019-04-16 | 2021-03-30 | Waymo Llc | Calibration systems usable for distortion characterization in cameras |
| US10623727B1 (en) | 2019-04-16 | 2020-04-14 | Waymo Llc | Calibration systems usable for distortion characterization in cameras |
| US11080884B2 (en) | 2019-05-15 | 2021-08-03 | Matterport, Inc. | Point tracking using a trained network |
| US20240354990A1 (en) * | 2023-04-24 | 2024-10-24 | Google Llc | Camera calibration of a telepresence system |
| US20250139923A1 (en) * | 2023-10-31 | 2025-05-01 | Sony Interactive Entertainment Inc. | Normalizing individual depth perception for vr |
Also Published As
| Publication number | Publication date |
|---|---|
| US9307232B1 (en) | 2016-04-05 |
| US9641830B2 (en) | 2017-05-02 |
| US20150288951A1 (en) | 2015-10-08 |
Similar Documents
| Publication | Title |
|---|---|
| US9307232B1 (en) | Calibration target for video processing |
| US9307231B2 (en) | Calibration target for video processing |
| US12272020B2 (en) | Method and system for image generation |
| US12462432B2 (en) | Multi view camera registration |
| US11080911B2 (en) | Mosaic oblique images and systems and methods of making and using same |
| US8970709B2 (en) | Reduced homography for recovery of pose parameters of an optical apparatus producing image data with structural uncertainty |
| EP2111530B1 (en) | Automatic stereo measurement of a point of interest in a scene |
| CN109801374B (en) | Method, medium, and system for reconstructing three-dimensional model through multi-angle image set |
| CN108289208A (en) | Automatic projected-picture correction method and device |
| US20160063706A1 (en) | Reduced Homography based on Structural Redundancy of Conditioned Motion |
| US20110293142A1 (en) | Method for recognizing objects in a set of images recorded by one or more cameras |
| US20140267254A1 (en) | Accurate Image Alignment to a 3D Model |
| US9182220B2 (en) | Image photographing device and method for three-dimensional measurement |
| WO2013169332A1 (en) | Camera scene fitting of real world scenes for camera pose determination |
| US20060215935A1 (en) | System and architecture for automatic image registration |
| Ackermann et al. | Geometric Point Light Source Calibration |
| US20150276400A1 (en) | Reduced homography for ascertaining conditioned motion of an optical apparatus |
| CN113971724A (en) | Method for 3D scanning of real objects |
| CN111179347A (en) | Positioning method, positioning device and storage medium based on regional characteristics |
| Gillihan | Accuracy comparisons of iPhone 12 Pro LiDAR outputs |
| CN114155233B (en) | Apparatus and method for obtaining a registration error map representing a level of sharpness of an image |
| US8260007B1 (en) | Systems and methods for generating a depth tile |
| KR101189665B1 (en) | System for analyzing golf putting result using camera and method therefor |
| KR100991570B1 (en) | Method for remotely measuring the size of signboards having various shapes and signage size telemetry device using the method |
| Bonaccordo | Evaluation of an Automated Panoramic Imaging System for the Photographic Recording and Analysis of Blood Spatter in Crime Scenes |
Legal Events
| Code | Title | Description |
|---|---|---|
| AS | Assignment | Owner name: LUCASFILM ENTERTAINMENT COMPANY LTD., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: WARNER, PAIGE; REEL/FRAME: 033856/0114. Effective date: 20140930 |
| FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 8 |