CN117226832A - An error assessment method, device and electronic equipment - Google Patents
- Publication number
- CN117226832A CN202311184736.4A
- Authority
- CN
- China
- Prior art keywords
- error
- determining
- coordinate
- motion mechanism
- image acquisition
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J18/00—Arms
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0014—Image feed-back for automatic industrial control, e.g. robot with camera
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Mechanical Engineering (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The embodiments of the present application provide an error evaluation method, an error evaluation apparatus, and an electronic device, relating to the technical field of machine vision. The method comprises: acquiring a first group of images and a second group of images of a target object captured by an image acquisition device; determining a camera integrated error of the image acquisition device using, in the first group of images, the first maximum and first minimum coordinates of a feature point of the target object on a first coordinate axis of the image coordinate system and the second maximum and second minimum coordinates on a second coordinate axis; determining a system translation error of the motion mechanism using the camera integrated error together with the third maximum and third minimum coordinates of the feature point on the first coordinate axis and the fourth maximum and fourth minimum coordinates on the second coordinate axis in the second group of images; and determining, based on the camera integrated error and the system translation error, a system integrated error of the image acquisition device and the motion mechanism. By applying the scheme provided by the embodiments of the present application, the system integrated error can be determined.
Description
Technical Field
The present application relates to the field of machine vision, and in particular, to an error evaluation method, apparatus, and electronic device.
Background
With the continuous development of machine vision technology, machine vision systems are widely used in the production of various high-precision products; for example, in the production of electronic products, a machine vision system may be used to capture the components being handled.
Such a machine vision system generally includes an image acquisition device for capturing images and a motion mechanism for moving the article or the image acquisition device. For example, a machine vision system may include a lathe and an image acquisition device, or it may include a camera, a robotic arm base, a robotic arm, and the like.
Generally, the errors of a machine vision system affect its operating accuracy. How to evaluate the system integrated error of a machine vision system, so that error compensation can be performed with the determined error, is therefore a technical problem that currently needs to be solved.
Disclosure of Invention
The embodiments of the present application aim to provide an error evaluation method, an error evaluation apparatus, and an electronic device, so as to determine the system integrated error of a machine vision system comprising an image acquisition device and a motion mechanism. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present application provides an error evaluation method, including:
acquiring a first group of images and a second group of images of a target object captured by an image acquisition device, wherein the first group of images is captured by the image acquisition device at a first location while the target object is at a second location, and the second group of images is captured by the image acquisition device each time the motion mechanism rotates from an initial position to a target position so that the image acquisition device and the target object are at a specified relative position;
determining a camera integrated error of the image acquisition device using a first maximum coordinate and a first minimum coordinate of a feature point of the target object on a first coordinate axis of an image coordinate system corresponding to the image acquisition device, and a second maximum coordinate and a second minimum coordinate of the feature point on a second coordinate axis of the image coordinate system, in the first group of images;
determining a system translation error of the motion mechanism by using the camera integrated error, a third maximum coordinate and a third minimum coordinate of the feature point in the second group of images on the first coordinate axis, and a fourth maximum coordinate and a fourth minimum coordinate of the feature point in the second group of images on the second coordinate axis;
determining, based on the camera integrated error and the system translation error, a system integrated error of the image acquisition device and the motion mechanism.
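The coordinate-range bookkeeping that the first two determining steps rely on is straightforward to sketch. The following Python fragment is only an illustration of extracting XPixRange/YPixRange-style spreads from feature-point coordinates; the function name and sample values are invented here and do not come from the patent:

```python
def coord_ranges(points):
    """Return the max-min spread of feature-point coordinates along the
    first and second image coordinate axes."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return max(xs) - min(xs), max(ys) - min(ys)

# Illustrative feature-point pixel coordinates extracted from the first
# group of images (static camera, static target): the spread reflects
# only camera noise and feature-extraction jitter.
first_set = [(100.2, 50.1), (100.5, 49.8), (99.9, 50.3)]
x_range_s, y_range_s = coord_ranges(first_set)  # XPixRange_S, YPixRange_S
```

The same helper applies unchanged to the second group of images, where the spread additionally reflects the motion mechanism's positioning repeatability.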
Optionally, in a specific implementation manner, the motion mechanism includes a rotation shaft; before said determining a systematic integrated error with respect to said image acquisition device and said motion mechanism based on said camera integrated error and said systematic translational error, said method further comprises:
acquiring a third group of images of the target object captured by the image acquisition device, wherein the image acquisition device captures the target object each time the rotation shaft rotates by a specified angle, during a process in which the rotation shaft rotates a plurality of consecutive times in a first direction so that the target object rotates a plurality of consecutive times in a second direction relative to the image acquisition device;
determining a system rotation error of the motion mechanism using the rotation shaft length of the rotation shaft and the maximum of the differences between the specified angle and the angle variations of the feature point in the third group of images;
the determining a system integrated error with respect to the image acquisition device and the motion mechanism based on the camera integrated error and the system translation error, comprising:
A systematic integrated error is determined for the image acquisition device and the motion mechanism based on the camera integrated error, the systematic translational error, and the systematic rotational error.
Optionally, in a specific implementation manner, the determining the camera integrated error of the image acquisition device by using a first maximum coordinate and a first minimum coordinate of a feature point of the target object on a first coordinate axis of an image coordinate system corresponding to the image acquisition device and a second maximum coordinate and a second minimum coordinate of the feature point on a second coordinate axis of the image coordinate system, in the first group of images, includes:
calculating a difference value between a first maximum coordinate and a first minimum coordinate of a characteristic point of the target object in the first group of images on a first coordinate axis of an image coordinate system corresponding to the image acquisition equipment to obtain a first difference value;
calculating a difference value between a second maximum coordinate and a second minimum coordinate of the feature point on a second coordinate axis of the image coordinate system in the first group of images to obtain a second difference value;
determining a camera integrated error of the image acquisition equipment by using the first difference value, the second difference value and a preset relation; wherein, the preset relation includes: a hand-eye calibration matrix for the image acquisition device and the motion mechanism, or a single pixel precision for the image acquisition device.
Optionally, in a specific implementation manner, the determining the systematic translational error of the motion mechanism by using the camera integrated error, a third maximum coordinate and a third minimum coordinate of the feature point in the second set of images on the first coordinate axis, and a fourth maximum coordinate and a fourth minimum coordinate of the feature point in the second set of images on the second coordinate axis includes:
calculating a difference value between a third maximum coordinate and a third minimum coordinate of the feature point on the first coordinate axis in the second group of images to obtain a third difference value;
calculating a difference value between a fourth maximum coordinate and a fourth minimum coordinate of the feature point on the second coordinate axis in the second group of images to obtain a fourth difference value;
determining a repeated positioning error of the motion mechanism by utilizing the camera integrated error, the third difference value, the fourth difference value and a preset relation, and determining a system translation error of the motion mechanism based on the repeated positioning error; wherein, the preset relation includes: a hand-eye calibration matrix for the image acquisition device and the motion mechanism, or a single pixel precision for the image acquisition device.
Optionally, in a specific implementation manner, the preset relationship includes: the hand-eye calibration matrix; the determining a systematic translational error of the motion mechanism based on the repeated positioning error comprises:
determining an absolute positioning error of the motion mechanism by using the camera integrated error, the hand-eye calibration matrix and a plurality of groups of calibration coordinates used for determining the hand-eye calibration matrix;
and determining a systematic translational error of the motion mechanism based on the repeated positioning error and the absolute positioning error.
Optionally, in a specific implementation manner, the system rotation error of the motion mechanism is affected by an angle extraction accuracy error and a rotation accuracy error of the rotation shaft. The angle extraction accuracy error is determined as follows:
calculating the difference value of the maximum angle and the minimum angle of the feature points in the first group of images to obtain an angle difference;
determining an angle extraction accuracy error with respect to angle extraction of the feature points in the first group of images using the angle difference and the rotation axis length;
the rotation accuracy error is determined as follows:
determining a rotation accuracy error of the rotation shaft using the angle difference, the maximum value, and the rotation shaft length.
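Since the patent's formulas are reproduced only as images, the relations above can only be sketched under an assumed small-angle arc model, in which an angular deviation at the end of a shaft of length L displaces the tool tip by roughly L times the deviation in radians. Both functions below, including the way the rotation accuracy error nets out the extraction spread, are hedged reconstructions, not the source equations:

```python
import math

def angle_extraction_error(angle_diff_deg, axis_len):
    """Arc displacement caused by the spread of extracted feature-point
    angles in the first (static) image group; small-angle arc model."""
    return axis_len * math.radians(angle_diff_deg)

def rotation_accuracy_error(angle_diff_deg, max_dev_deg, axis_len):
    """Arc displacement of the worst observed angle deviation, net of
    the extraction spread (assumed decomposition)."""
    return axis_len * math.radians(max_dev_deg - angle_diff_deg)
```

For a 200 mm shaft, an extracted-angle spread of 0.1 degrees corresponds to roughly a 0.35 mm arc under this model.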
Optionally, in a specific implementation manner, before the determining the system rotation error of the motion mechanism using the rotation shaft length of the rotation shaft and the maximum of the differences between the specified angle and the angle variations of the feature point in the third group of images, the method further includes:
determining a rotation axis calibration error of the motion mechanism using a hand-eye calibration matrix for the image acquisition device and the motion mechanism;
the determining a systematic rotation error of the motion mechanism using a maximum value of a rotation axis length of the rotation axis and a difference value between the specified angle and an angle variation amount of the feature point in the third group of images, includes:
determining a rotation error of the rotation shaft of the movement mechanism by using a rotation shaft length of the rotation shaft and a maximum value of differences between the specified angle and the angle variation of the feature points in the third group of images;
and determining the system rotation error of the motion mechanism by using the rotation error of the rotating shaft and the calibration error of the rotating shaft.
Optionally, in a specific implementation, before the determining a system integrated error regarding the image capturing device and the motion mechanism based on the camera integrated error and the system translation error, the method further includes:
Determining a teaching error of the motion mechanism;
the determining a system integrated error with respect to the image acquisition device and the motion mechanism based on the camera integrated error and the system translation error, comprising:
based on the camera integrated error, the system translation error, and the teaching error, a system integrated error is determined with respect to the image acquisition device and the motion mechanism.
Optionally, in a specific implementation manner, the determining the camera integrated error of the image capturing device by using the first difference value, the second difference value, and a preset relationship includes:
determining a camera integrated error of the image acquisition device by using a first formula;
wherein, the first formula is:
wherein CIE is the camera integrated error of the image acquisition device; M is the hand-eye calibration matrix of the image acquisition device and the motion mechanism; XPixRange_S is the first difference; YPixRange_S is the second difference;
or,
determining a camera integrated error of the image acquisition device by using a second formula;
wherein the second formula is:
where PixAcc is the single pixel precision with respect to the image acquisition device.
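The first and second formulas themselves appear only as images in the source, so the following sketch shows one plausible shape for each variant, treating the exact combination of M (or PixAcc) with the two difference values as an assumption:

```python
import math

def camera_integrated_error_pixacc(pix_acc, x_range_s, y_range_s):
    """CIE via the single-pixel precision PixAcc (assumed form: scale
    the Euclidean pixel spread into physical units)."""
    return pix_acc * math.hypot(x_range_s, y_range_s)

def camera_integrated_error_matrix(M, x_range_s, y_range_s):
    """CIE via the hand-eye calibration matrix M (assumed form: push the
    pixel spread through the 2x2 scale/rotation part of M, then take
    the Euclidean norm)."""
    dx = M[0][0] * x_range_s + M[0][1] * y_range_s
    dy = M[1][0] * x_range_s + M[1][1] * y_range_s
    return math.hypot(dx, dy)

# At 0.01 mm/pixel, pixel spreads of 3 and 4 pixels give a 0.05 mm CIE.
cie = camera_integrated_error_pixacc(0.01, 3.0, 4.0)
```

The matrix variant with a pure 0.01 mm/pixel scaling, M = [[0.01, 0], [0, 0.01]], yields the same value, which is the sanity check one would expect between the two preset relations.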
Optionally, in a specific implementation manner, the determining the repeated positioning error of the motion mechanism by using the integrated error of the camera, the third difference value, the fourth difference value, and a preset relationship includes:
determining a repeated positioning error of the motion mechanism by using a third formula;
wherein the third formula is:
wherein MRTE is the repeated positioning error of the motion mechanism; XPixRange_D is the third difference; YPixRange_D is the fourth difference; M is the hand-eye calibration matrix of the image acquisition device and the motion mechanism; CIE is the camera integrated error;
or,
determining a repeated positioning error of the motion mechanism by using a fourth formula;
wherein the fourth formula is:
where PixAcc is the single pixel precision with respect to the image acquisition device.
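The third and fourth formulas are likewise given only as images; a plausible reading, assumed here, is that the dynamic pixel spread from the second group of images is converted to physical units and the camera's own contribution CIE is subtracted out:

```python
import math

def motion_repeat_error_pixacc(pix_acc, x_range_d, y_range_d, cie):
    """MRTE (assumed form): the spread of the feature point across the
    second (moving) image group, scaled to physical units via PixAcc,
    minus the part already attributed to the camera (CIE)."""
    return pix_acc * math.hypot(x_range_d, y_range_d) - cie
```

For example, a 6-by-8 pixel dynamic spread at 0.01 mm/pixel with a 0.05 mm CIE leaves 0.05 mm attributable to the mechanism's repeatability under this assumed decomposition.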
Optionally, in a specific implementation manner, the determining the absolute positioning error of the motion mechanism by using the camera integrated error, the hand-eye calibration matrix, and a plurality of sets of calibration coordinates for determining the hand-eye calibration matrix includes:
determining an absolute positioning error of the motion mechanism by using a fifth formula;
Wherein the fifth formula is:
wherein MATE is the absolute positioning error of the motion mechanism; X_Wld_i is the X coordinate of the calibration point in the i-th set of calibration coordinates in the world coordinate system; X_Pix_i is the X coordinate of the calibration point in the i-th set of calibration coordinates in the image coordinate system; Y_Wld_i is the Y coordinate of the calibration point in the i-th set of calibration coordinates in the world coordinate system; Y_Pix_i is the Y coordinate of the calibration point in the i-th set of calibration coordinates in the image coordinate system; and n is the number of sets of calibration coordinates of the calibration points;
the determining a systematic translational error of the motion mechanism based on the repeated positioning error and the absolute positioning error comprises:
determining a systematic translational error of the motion mechanism using a sixth formula:
wherein the sixth formula is:
wherein STLE is the systematic translational error of the motion mechanism; MRTE is the repeated positioning error of the motion mechanism; MATE is the absolute positioning error of the motion mechanism.
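The fifth and sixth formulas are not reproduced in the text, so the sketch below assumes MATE is an aggregate reprojection residual over the n calibration coordinate sets and STLE is a simple sum of the two components; both assumptions are flagged in the code:

```python
import math

def absolute_positioning_error(M, world_pts, pix_pts):
    """MATE (assumed form): residual distance between each calibration
    point's world coordinates (X_Wld_i, Y_Wld_i) and its pixel
    coordinates (X_Pix_i, Y_Pix_i) mapped through a 2x3 affine hand-eye
    matrix M, averaged over the n sets (aggregation rule assumed)."""
    total = 0.0
    for (xw, yw), (xp, yp) in zip(world_pts, pix_pts):
        xm = M[0][0] * xp + M[0][1] * yp + M[0][2]
        ym = M[1][0] * xp + M[1][1] * yp + M[1][2]
        total += math.hypot(xw - xm, yw - ym)
    return total / len(world_pts)

def system_translation_error(mrte, mate):
    """STLE (assumed combination: sum of the repeated positioning error
    and the absolute positioning error)."""
    return mrte + mate
```

A perfectly calibrated identity-like mapping gives MATE = 0, which is a useful sanity check on whichever exact residual the patent's fifth formula actually takes.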
Optionally, in a specific implementation manner, the determining the system rotation error of the motion mechanism by using the rotation error of the rotation shaft and the calibration error of the rotation shaft includes:
Determining a system rotation error of the motion mechanism by using a seventh formula;
wherein the seventh formula is:
wherein SRE is the systematic rotational error of the motion mechanism; MRAE is the rotation error of the rotation shaft; MRCE is the rotation shaft calibration error; deltaR is the maximum value of the difference between the specified angle and the angle variation of the feature point in the third set of images.
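With the seventh formula also missing from the text, the sketch below assumes the small-angle arc model for the shaft rotation error MRAE and a simple sum for SRE; both are hedged reconstructions rather than the source equation:

```python
import math

def shaft_rotation_error(axis_len, delta_r_deg):
    """MRAE (assumed small-angle arc model): displacement at the end of
    a shaft of length axis_len for the worst angle deviation deltaR."""
    return axis_len * math.radians(delta_r_deg)

def system_rotation_error(mrae, mrce):
    """SRE (assumed combination: shaft rotation error plus rotation
    shaft calibration error)."""
    return mrae + mrce
```

Under this model, a 100 mm shaft with a worst deviation deltaR of 0.2 degrees contributes about 0.35 mm, to which the calibration error MRCE is added.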
In a second aspect, an embodiment of the present application provides an error evaluation apparatus, including:
an image acquisition module, configured to acquire a first group of images and a second group of images of a target object captured by an image acquisition device, wherein the first group of images is captured by the image acquisition device at a first location while the target object is at a second location, and the second group of images is captured by the image acquisition device each time the motion mechanism rotates from an initial position to a target position so that the image acquisition device and the target object are at a specified relative position;
a first determining module, configured to determine a camera integrated error of the image acquisition device using a first maximum coordinate and a first minimum coordinate of the feature point of the target object on a first coordinate axis of an image coordinate system corresponding to the image acquisition device, and a second maximum coordinate and a second minimum coordinate of the feature point on a second coordinate axis of the image coordinate system, in the first group of images;
The second determining module is used for determining a system translation error of the motion mechanism by using the camera integrated error, a third maximum coordinate and a third minimum coordinate of the characteristic points in the second group of images on the first coordinate axis, and a fourth maximum coordinate and a fourth minimum coordinate of the characteristic points in the second group of images on the second coordinate axis;
and a comprehensive error determination module for determining a systematic comprehensive error with respect to the image acquisition device and the motion mechanism based on the camera comprehensive error and the systematic translational error.
Optionally, in a specific implementation manner, the motion mechanism includes a rotation shaft; the apparatus further comprises:
a third determining module, configured to acquire a third group of images of the target object captured by the image acquisition device before the system integrated error of the image acquisition device and the motion mechanism is determined based on the camera integrated error and the system translation error, wherein the image acquisition device captures the target object each time the rotation shaft rotates by a specified angle, during a process in which the rotation shaft rotates a plurality of consecutive times in a first direction so that the target object rotates a plurality of consecutive times in a second direction relative to the image acquisition device;
A system rotation error determining module, configured to determine a system rotation error of the motion mechanism using a rotation axis length of the rotation axis and a maximum value of differences between the specified angle and the angle variation of the feature points in the third group of images;
the comprehensive error determining module is specifically configured to:
a systematic integrated error is determined for the image acquisition device and the motion mechanism based on the camera integrated error, the systematic translational error, and the systematic rotational error.
Optionally, in a specific implementation manner, the first determining module includes:
the first computing sub-module is used for computing the difference value between the first maximum coordinate and the first minimum coordinate of the characteristic point of the target object on the first coordinate axis of the image coordinate system corresponding to the image acquisition equipment in the first group of images to obtain a first difference value;
the second calculation sub-module is used for calculating the difference value between the second maximum coordinate and the second minimum coordinate of the characteristic points on the second coordinate axis of the image coordinate system in the first group of images to obtain a second difference value;
the first determining submodule is used for determining the camera integrated error of the image acquisition equipment by utilizing the first difference value, the second difference value and a preset relation; wherein, the preset relation includes: a hand-eye calibration matrix for the image acquisition device and the motion mechanism, or a single pixel precision for the image acquisition device.
Optionally, in a specific implementation manner, the second determining module includes:
a third calculation sub-module, configured to calculate a difference value between a third maximum coordinate and a third minimum coordinate of the feature point on the first coordinate axis in the second set of images, to obtain a third difference value;
a fourth computing sub-module, configured to compute a difference value between a fourth maximum coordinate and a fourth minimum coordinate of the feature point on the second coordinate axis in the second set of images, to obtain a fourth difference value;
the second determining submodule is used for determining repeated positioning errors of the moving mechanism by utilizing the camera integrated errors, the third difference value, the fourth difference value and a preset relation and determining systematic translation errors of the moving mechanism based on the repeated positioning errors; wherein, the preset relation includes: a hand-eye calibration matrix for the image acquisition device and the motion mechanism, or a single pixel precision for the image acquisition device.
Optionally, in a specific implementation manner, the preset relationship includes: the hand-eye calibration matrix; the first determination submodule includes:
a third determining submodule, configured to determine the absolute positioning error of the motion mechanism using the camera integrated error, the hand-eye calibration matrix, and a plurality of sets of calibration coordinates used for determining the hand-eye calibration matrix;
And a fourth determination sub-module for determining a systematic translational error of the motion mechanism based on the repeated positioning error and the absolute positioning error.
Optionally, in a specific implementation manner, a system rotation error of the motion mechanism is affected by an angle extraction precision error and a rotation precision error of the rotation shaft; the device also comprises a fourth determining module and a fifth determining module;
the fourth determining module is specifically configured to calculate a difference value between a maximum angle and a minimum angle of the feature points in the first set of images, so as to obtain an angle difference; determining an angle extraction accuracy error with respect to angle extraction of the feature points in the first group of images using the angle difference and the rotation axis length;
the fifth determining module is specifically configured to determine a rotation accuracy error of the rotation shaft using the angle difference, the maximum value, and the rotation shaft length.
Optionally, in a specific implementation manner, the apparatus further includes:
a sixth determining module, configured to determine a rotation shaft calibration error of the motion mechanism using a hand-eye calibration matrix of the image acquisition device and the motion mechanism, before the system rotation error of the motion mechanism is determined using the rotation shaft length of the rotation shaft and the maximum of the differences between the specified angle and the angle variations of the feature points in the third group of images;
The system rotation error determination module includes:
a fifth determining sub-module for determining a rotation error of the rotation shaft of the movement mechanism using a rotation shaft length of the rotation shaft and a maximum value among differences of the specified angle and the angle variation amounts of the feature points in the third group of images;
and the sixth determining submodule is used for determining the system rotation error of the motion mechanism by utilizing the rotation error of the rotating shaft and the calibration error of the rotating shaft.
Optionally, in a specific implementation manner, the apparatus further includes:
a seventh determining module for determining a teaching error of the motion mechanism before determining a systematic integrated error with respect to the image capturing device and the motion mechanism based on the camera integrated error and the systematic translational error;
the comprehensive error determining module is specifically configured to:
based on the camera integrated error, the system translation error, the system rotation error, and the teaching error, a system integrated error with respect to the image acquisition device and the motion mechanism is determined.
Optionally, in a specific implementation manner, the first determining submodule is specifically configured to:
Determining a camera integrated error of the image acquisition device by using a first formula;
wherein, the first formula is:
wherein CIE is the camera integrated error of the image acquisition device; M is the hand-eye calibration matrix of the image acquisition device and the motion mechanism; XPixRange_S is the first difference; YPixRange_S is the second difference;
or,
determining a camera integrated error of the image acquisition device by using a second formula;
wherein the second formula is:
where PixAcc is the single pixel precision with respect to the image acquisition device.
Optionally, in a specific implementation manner, the second determining submodule is specifically configured to:
determining a repeated positioning error of the motion mechanism by using a third formula;
wherein the third formula is:
wherein MRTE is the repeated positioning error of the motion mechanism; XPixRange_D is the third difference; YPixRange_D is the fourth difference; M is the hand-eye calibration matrix of the image acquisition device and the motion mechanism; CIE is the camera integrated error;
or,
determining a repeated positioning error of the motion mechanism by using a fourth formula;
Wherein the fourth formula is:
where PixAcc is the single pixel precision with respect to the image acquisition device.
Optionally, in a specific implementation manner, the third determining submodule is specifically configured to:
determining an absolute positioning error of the motion mechanism by using a fifth formula;
wherein the fifth formula is:
wherein MATE is the absolute positioning error of the motion mechanism; X_Wld_i is the X coordinate of the calibration point in the i-th set of calibration coordinates in the world coordinate system; X_Pix_i is the X coordinate of the calibration point in the i-th set of calibration coordinates in the image coordinate system; Y_Wld_i is the Y coordinate of the calibration point in the i-th set of calibration coordinates in the world coordinate system; Y_Pix_i is the Y coordinate of the calibration point in the i-th set of calibration coordinates in the image coordinate system; and n is the number of sets of calibration coordinates of the calibration points;
the fourth determination submodule is specifically configured to:
determining a systematic translational error of the motion mechanism using a sixth formula:
wherein the sixth formula is:
wherein STLE is the systematic translational error of the motion mechanism; MRTE is the repeated positioning error of the motion mechanism; MATE is the absolute positioning error of the motion mechanism.
Optionally, in a specific implementation manner, the sixth determining submodule is specifically configured to:
determining a system rotation error of the motion mechanism by using a seventh formula;
wherein the seventh formula is:
wherein SRE is the system rotation error of the motion mechanism; MRAE is the rotation error of the rotation shaft; MRCE is the rotation shaft calibration error; deltaR is the maximum of the differences between the specified angle and the angle variations of the feature point in the third group of images.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a memory for storing a computer program;
and the processor is used for realizing the steps of any method embodiment when executing the program stored in the memory.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having a computer program stored therein, which when executed by a processor, implements the steps of any of the method embodiments described above.
In a fifth aspect, embodiments of the present application also provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform the steps of any of the method embodiments described above.
The embodiment of the application has the beneficial effects that:
the above can be seen that, by applying the scheme provided by the embodiment of the present application, the first set of images and the second set of images about the target object acquired by the image acquisition device can be acquired by using the positional relationship and the related motion among the image acquisition device, the target object and the motion mechanism, and then, the camera integrated error and the system translational error included in the system about the image acquisition device and the motion mechanism are respectively determined by using the two sets of images, so that the system integrated error is determined according to the camera integrated error and the system translational error. And because the errors of all the components in the system are comprehensively considered, the accuracy of the determined system comprehensive errors is improved.
Of course, it is not necessary for any one product or method of practicing the application to achieve all of the advantages set forth above at the same time.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the present application; other drawings may be obtained by those skilled in the art from these drawings without creative effort.
Fig. 1 is a schematic flowchart of an error evaluation method according to an embodiment of the present application;
Figs. 2(a)-2(b) are schematic diagrams of the positional relationship between the image acquisition device and the motion mechanism according to an embodiment of the present application;
Fig. 3 is a schematic flowchart of another error evaluation method according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a system rotation error according to an embodiment of the present application;
Fig. 5 is a schematic flowchart of a specific example of an error evaluation method according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of an error evaluation device according to an embodiment of the present application;
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The embodiments of the present application will be described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some of the embodiments of the present application, not all of them. Based on the embodiments of the present application, all other embodiments obtained by a person skilled in the art without creative effort fall within the scope of protection of the present application.
Generally, errors in a machine vision system affect its operation accuracy. Based on this, how to evaluate the system integrated error of a machine vision system, so that error compensation can be performed using the determined system integrated error, is a technical problem that currently needs to be solved.
In order to solve the technical problems, an embodiment of the present application provides an error evaluation method.
The method is applicable to various application scenarios in which the system integrated error of a machine vision system involving an image acquisition device and a motion mechanism needs to be evaluated, such as determining the system integrated error of a camera and a mechanical arm mounted on a fixed base, or determining the system integrated error of a system consisting of a punch press and a camera.
The method may be applied to various image acquisition devices having an image capture function, such as a video camera or a still camera; after capturing the various images of the target object, such a device may execute the method to determine the system integrated error of the image acquisition device and the motion mechanism. The method may also be applied to various electronic devices capable of communicating with the image acquisition device, such as a server, a mobile phone, or a computer; such an electronic device can obtain the various images of the target object captured by the image acquisition device and then execute the method to determine the system integrated error of the image acquisition device and the motion mechanism. In addition, when the execution subject of the method is an electronic device, the electronic device may be a single electronic device or a cluster of electronic devices; for clarity of the text, it is referred to hereinafter simply as an electronic device.
Based on this, the embodiment of the present application does not specifically limit the application scenario and execution subject of the method.
The error evaluation method provided by the embodiment of the application can comprise the following steps:
acquiring a first set of images and a second set of images of a target object captured by an image acquisition device; wherein the first set of images is captured by the image acquisition device at a first position for the target object at a second position, and the second set of images is captured each time the motion mechanism moves from an initial position to a target position so that the image acquisition device and the target object are at a specified relative position;
determining a camera comprehensive error of the image acquisition device by using a first maximum coordinate and a first minimum coordinate of a characteristic point of the target object on a first coordinate axis of an image coordinate system corresponding to the image acquisition device and a second maximum coordinate and a second minimum coordinate of the characteristic point on a second coordinate axis of the image coordinate system in the first group of images;
determining a system translation error of the motion mechanism by using the camera integrated error, a third maximum coordinate and a third minimum coordinate of the feature point in the second group of images on the first coordinate axis, and a fourth maximum coordinate and a fourth minimum coordinate of the feature point in the second group of images on the second coordinate axis;
Based on the camera integrated error and the system translation error, a system integrated error is determined with respect to the image acquisition device and the motion mechanism.
As can be seen from the above, by applying the scheme provided by the embodiments of the present application, the first set of images and the second set of images of the target object acquired by the image acquisition device can be obtained by using the positional relationship and the associated motion among the image acquisition device, the target object, and the motion mechanism. The two sets of images are then used to determine, respectively, the camera integrated error and the system translation error of the system comprising the image acquisition device and the motion mechanism, so that the system integrated error can be determined from the camera integrated error and the system translation error. Because the errors of all components in the system are comprehensively considered, the accuracy of the determined system integrated error is improved.
In general, in order to improve the work accuracy, a machine vision system concerning an image capturing apparatus and a movement mechanism may be applied to a production flow of various high-accuracy products, and a production task may be performed by the machine vision system. For example, in order to improve the gripping accuracy of electronic components, a component gripping task may be performed using a machine vision system with respect to the image capturing apparatus and the robot arm. When the production task is executed, the image acquisition equipment can acquire an image of the product, then determine the position of the product in the image, and determine the position of the product in the world coordinate system by utilizing the homography transformation relation of the image coordinate system and the world coordinate system. Thus, the movement mechanism can work on the product based on the position determined by the image acquisition equipment.
In the machine vision system, the positional relationship between the image acquisition device and the motion mechanism may be eye-in-hand or eye-to-hand; here, the "eye" refers to the image acquisition device and the "hand" refers to the motion mechanism.
Eye-in-hand means that the image acquisition device is mounted on the motion mechanism and moves together with it; for example, for a manipulator, eye-in-hand means that the image acquisition device is mounted on the manipulator and moves with the movement of the manipulator.
Eye-to-hand means that the image acquisition device is mounted outside the motion mechanism and does not move with it; for example, for a manipulator, eye-to-hand means that the image acquisition device is mounted outside the manipulator and does not move with the movement of the manipulator.
In order to facilitate determination of systematic integrated errors for a machine vision system, embodiments of the present application provide various errors involved in a machine vision system.
The motion mechanisms in a machine vision system can be divided into: a movement mechanism that does not include a rotation shaft and a movement mechanism that includes a rotation shaft.
The moving mechanism not including the rotation shaft may include a moving mechanism such as a milling machine, a sliding table module, and the like, and the moving mechanism not including the rotation shaft may perform translational movement, for example, the XY-axis sliding table module may perform translational movement along an X-axis direction or a Y-axis direction in a world coordinate system corresponding to the module.
The movement mechanism including the rotation shaft may include a movement mechanism such as a robot, a movement module with the rotation shaft, or the like, and the movement mechanism including the rotation shaft may perform translational movement and rotational movement, for example, the robot may perform rotational movement about a Z axis perpendicular to the XY plane, or the like.
Based on this, any motion mechanism in a machine vision system performs translational motion during operation. Affected by the mechanism manufacturing process and motion parameters, a motion mechanism is subject to a system translation error (System Translation Error, STLE) during translational motion; therefore, the error of any motion mechanism includes the system translation error.
In addition, the image acquisition device in the machine vision system has a camera integrated error (Camera Integrated Error, CIE) under the influence of various factors such as lens optical deviation, temperature drift, imaging edge sharpness or edge transition zone, image feature point coordinate extraction precision, device shake and the like.
Thus, the system integrated error (System Total Error, STE) of the machine vision system may include the camera integrated error and the system translation error described above.
In addition, since the motion mechanism in part of the machine vision system may include a rotating shaft capable of rotational movement, the motion mechanism may be affected by a mechanism manufacturing process and motion parameters, and may also be affected by a system rotation error (System Rotational Error, SRE) when performing rotational movement, and thus, for a motion mechanism including a rotating shaft, the motion mechanism error may include a system translation error and a system rotation error. Further, the systematic integrated errors of the machine vision system may include the camera integrated errors, systematic translational errors, and systematic rotational errors described above.
The camera integrated error, the system translational error, and the system rotational error, which are related to the system integrated error of the machine vision system, are described below, respectively.
The camera integrated error of the image acquisition device is the error generated by the image acquisition device under the influence of various factors such as optical error, camera temperature drift, imaging edge sharpness or edge transition zone, image characteristic point coordinate extraction precision, mechanism shake and the like.
The system translation error refers to the position error between the actual position reached by the motion mechanism and a designated position when the motion mechanism repeatedly carries the image acquisition device or the target object to that designated position.
And, the system translational errors may include a repeat positioning error (Machine Relative Translation Error, MRTE) and an absolute positioning error (Machine Absolute Translation Error, MATE).
Wherein, the repeated positioning error refers to the relative positioning error of the moving mechanism, namely, when the moving mechanism repeatedly moves for a plurality of times, fluctuation exists between the positioning positions reached each time due to the repeated positioning error;
the absolute positioning error means that the movement mechanism cannot move exactly to the target position every time it moves, and there is a fixed position error between the reached position and the target position.
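The two translation-error components can be illustrated numerically. A minimal sketch, assuming hypothetical 1-D position readings (mm) from five repeated moves to the same target; the values and the 1-D simplification are assumptions for illustration only:

```python
import statistics

# Hypothetical 1-D positions (mm) reached over five repeated moves to a
# 100.0 mm target, illustrating the two translation-error components.
target = 100.0
reached = [100.12, 100.15, 100.11, 100.14, 100.13]

# Repeat positioning error (MRTE): fluctuation between the positions
# reached on successive moves.
mrte = max(reached) - min(reached)

# Absolute positioning error (MATE): fixed offset between the positions
# reached and the target position.
mate = statistics.fmean(reached) - target
```

Here the mechanism scatters over a 0.04 mm band (repeat error) while sitting about 0.13 mm past the target on average (absolute error); the two components are independent and are measured separately.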
The system rotation error refers to the angle error between the actual rotation angle and a target angle: affected by the mechanism manufacturing process and motion parameters, the motion mechanism cannot rotate strictly and accurately to the target angle.
Wherein, when the movement mechanism includes a rotation shaft, the rotational movement of the movement mechanism is accomplished by the rotation shaft, and thus, the systematic rotational error of the movement mechanism may include the rotational error of the rotation shaft of the movement mechanism.
For a motion mechanism that includes a rotation shaft, when the rotation axis center of the rotation shaft is not coaxial with the center of mass of the end effector, the end effector rotates about the rotation shaft, and the distance from the rotation axis center to the end effector is the rotation axis length.
The end effector may include a suction nozzle, a clamping jaw, a dispensing head, and the like.
The rotation axis length can be calibrated when hand-eye calibration of the image acquisition device and the motion mechanism is performed. Because errors may exist in the calibration process, the system rotation error described above also includes a rotation axis calibration error (Machine Rotation Axis Calibration Error, MRCE). The rotation axis calibration error refers to the error present when the rotation axis is calibrated, i.e., the deviation between the calibrated rotation axis length obtained by calibration and the true rotation axis length.
Further, the above-described rotation shaft rotation error is affected by a rotation accuracy error (Machine Rotation Axis Dynamic Error, MRDE) and an angle extraction accuracy error (Machine Rotation Axis Static Error, MRSE).
The rotation accuracy error is a deviation between an actual angle at which the rotation shaft of the movement mechanism rotates after rotating by a fixed angle and the fixed angle.
The angle extraction accuracy error refers to the deviation between the actual angle of the feature point of the target object and the image extraction angle of the feature point, which is affected by the accuracy of the image feature point angle extraction algorithm, or the deviation between the actual coordinate reached by the rotation axis of the motion mechanism after rotating according to the image angle variation of the feature point and the ideal coordinate.
The rotation accuracy error and the angle extraction accuracy error are both related to the length of the rotation shaft of the movement mechanism.
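Because both errors act at the end of the rotation axis, their positional effect scales with the rotation axis length. A minimal sketch of this relationship, using the standard chord-length formula (not a formula taken from the patent; the axis lengths and angular error below are hypothetical):

```python
import math

# Positional deviation at the end effector produced by an angular error, for
# a rotation axis of length L (distance from axis center to end effector).
# This is the chord between the ideal and actual end-effector positions:
# standard geometry, not a formula from the patent.
def rotation_position_error(axis_len_mm: float, angle_err_deg: float) -> float:
    return 2.0 * axis_len_mm * math.sin(math.radians(angle_err_deg) / 2.0)

# The same 0.1-degree angular error produces a larger positional deviation
# on a longer rotation axis.
err_short = rotation_position_error(50.0, 0.1)   # ~0.087 mm
err_long = rotation_position_error(200.0, 0.1)   # ~0.349 mm
```

This is why both the rotation accuracy error and the angle extraction accuracy error must be evaluated together with the calibrated rotation axis length.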
For clarity of the text, the above-described calculation modes of the camera integrated error, the system translational error, and the system rotational error will be described below.
On the basis of the errors related to the machine vision system provided by the embodiment of the application, the embodiment of the application provides an error evaluation method for determining the system integrated errors of the machine vision system.
The following describes a specific error evaluation method according to an embodiment of the present application with reference to the accompanying drawings.
Fig. 1 is a flow chart of an error evaluation method according to an embodiment of the present application, as shown in fig. 1, the method may include the following steps S101 to S104:
s101: acquiring a first group of images and a second group of images which are acquired by an image acquisition device and are related to a target object;
wherein the first set of images is captured by the image acquisition device at the first position for the target object at the second position; the second set of images is captured each time the motion mechanism moves from the initial position to the target position so that the image acquisition device and the target object are at a specified relative position;
as described above, the system integrated errors with respect to the image capturing apparatus and the movement mechanism each include a camera integrated error and a system translational error, on the basis of which, in order to determine the system integrated error, the camera integrated error and the system translational error may be calculated, respectively.
Under the influence of the camera integrated error, when the image acquisition device still shoots the same object for a plurality of times, the pixel coordinates of the feature points on the object in the image coordinate system corresponding to the image acquisition device may be different. And the coordinate range of the feature point in each coordinate axis direction of the image coordinate system can reflect the camera integrated error of the image acquisition equipment.
The coordinate range of a feature point in one coordinate axis direction is the difference between the maximum coordinate and the minimum coordinate of the feature point in the coordinate axis.
Based on this, in order to calculate the camera integrated error of the image pickup apparatus, a first set of images obtained when the image pickup apparatus performs still shooting on the target object a plurality of times can be obtained. That is, the image pickup device may be placed at a first position and the target object is placed at a second position within the pickup range of the image pickup device, and thereafter, the image pickup device is controlled to pick up a plurality of images with respect to the target object, thereby obtaining a first group of images with respect to the target object.
In order to calculate the systematic translation error of the movement mechanism, the movement mechanism in the initial position can be controlled to translate from the initial position to the target position each time, so that when the movement mechanism translates to the target position, the image acquisition device and the target object can be in a designated relative position, and when the movement mechanism reaches the target position, the image acquisition device is controlled to acquire the target object, and therefore, a second group of images related to the target object are obtained.
If the position relationship between the motion mechanism and the image acquisition device is that the eye is on the hand, the target object can be placed at a fixed position, and then the motion mechanism can be controlled to carry the image acquisition device to repeatedly translate from the initial position to the target position capable of carrying out image acquisition on the target object, and when the motion mechanism reaches the target position each time, the image acquisition device is controlled to carry out image acquisition on the target object, so that a second group of images is obtained.
If the position relationship between the motion mechanism and the image acquisition device is that the eye is outside the hand, the image acquisition device can be arranged at a fixed position, and then the motion mechanism can be controlled to carry the target object repeatedly from the initial position to the target position in the acquisition area of the image acquisition device, and the image acquisition device is controlled to acquire the image of the target object when the target position is reached each time, so that a second group of images is obtained.
Thus, after the first and second sets of images about the target object acquired by the image acquisition device are acquired, the camera integrated error and the system translational error can be calculated using the first and second sets of images, respectively.
S102: determining a camera comprehensive error of the image acquisition equipment by using a first maximum coordinate and a first minimum coordinate of a characteristic point of a target object on a first coordinate axis of an image coordinate system corresponding to the image acquisition equipment and a second maximum coordinate and a second minimum coordinate of the characteristic point on a second coordinate axis of the image coordinate system in the first group of images;
a plurality of image coordinates of the feature point of the target object in the image coordinate system may be determined using each of the images in the first set of images. Then, using the plurality of image coordinates, a first maximum coordinate and a first minimum coordinate of a feature point of the target object on a first coordinate axis of the image coordinate system and a second maximum coordinate and a second minimum coordinate of the feature point on a second coordinate axis of the image coordinate system can be determined, and then, using the first maximum coordinate, the first minimum coordinate, the second maximum coordinate and the second minimum coordinate, determining a camera integrated error of the image acquisition device.
Optionally, in a specific implementation manner, the step S102 may include the following steps 11-13:
step 11: calculating a difference value between a first maximum coordinate and a first minimum coordinate of a characteristic point of a target object in a first group of images on a first coordinate axis of an image coordinate system corresponding to image acquisition equipment to obtain a first difference value;
step 12: calculating a difference value between a second maximum coordinate and a second minimum coordinate of the feature points on a second coordinate axis of the image coordinate system in the first group of images to obtain a second difference value;
step 13: determining a camera integrated error of the image acquisition equipment by using the first difference value, the second difference value and a preset relation;
the preset relation comprises the following steps: a hand-eye calibration matrix for the image acquisition device and the motion mechanism, or a single pixel precision for the image acquisition device.
In this specific implementation manner, after a first group of images about a target object acquired by an image acquisition device is acquired, a difference value between a first maximum coordinate and a first minimum coordinate of a feature point of the target object on a first coordinate axis of an image coordinate system corresponding to the image acquisition device in the first group of images is calculated, so as to obtain a first difference value; and calculating the difference value between the second maximum coordinate and the second minimum coordinate of the feature point on the second coordinate axis of the image coordinate system in the first group of images to obtain a second difference value.
Because the first difference and the second difference are coordinate differences in the image coordinate system, whereas the camera integrated error to be calculated is an error in the world coordinate system, a hand-eye calibration matrix of the image acquisition device and the motion mechanism, or the single-pixel precision of the image acquisition device, can be used when calculating the camera integrated error from the first and second differences, so that the camera integrated error of the image acquisition device in the world coordinate system can be calculated from image coordinates in the image coordinate system.
The hand-eye calibration matrix related to the image acquisition equipment and the motion mechanism is calibrated by a calibration object in advance and is used for representing the conversion relation between the image coordinates of the calibration object in an image coordinate system and the world coordinates of the calibration object in a world coordinate system.
The world coordinate system is a three-dimensional rectangular coordinate system, and can reflect the position of an object in the real world. The origin of the world coordinate system may be determined according to the actual situation.
The image coordinate system is a two-dimensional rectangular coordinate system and reflects the arrangement condition of pixels in the image acquisition equipment. The origin of the image coordinate system is positioned at the upper left corner of the image acquired by the image acquisition equipment, and the X axis and the Y axis are respectively parallel to the two sides of the image plane. The units of the coordinate axes in the image coordinate system are pixels (integers).
Moreover, as shown in Fig. 2(a), if the relationship between the image acquisition device and the motion mechanism is eye-to-hand, the hand-eye calibration matrix can be calibrated by using an eye-to-hand calibration method; as shown in Fig. 2(b), if the relationship is eye-in-hand, the hand-eye calibration matrix can be calibrated by using an eye-in-hand calibration method. Both calibration methods can be set according to actual needs, and may reasonably be Zhang Zhengyou's calibration method or another calibration method, which is not specifically limited in the embodiments of the present application.
The single-pixel precision of the image acquisition device represents the proportional relationship between coordinate sizes in the image coordinate system and coordinate sizes in the world coordinate system; it is the ratio of the single-direction field of view of the image acquisition device to its single-direction resolution, in mm/pixel.
In addition, the above-described hand-eye calibration matrix with respect to the image capturing apparatus and the motion mechanism, and the single-pixel precision with respect to the image capturing apparatus, may be referred to as homography relation of the image coordinate system and the world coordinate system.
In general, when the machine vision system has not yet been calibrated or has not entered the debugging stage, the single-pixel precision may be used to perform the system integrated error evaluation; once the machine vision system has been calibrated or has entered the debugging stage, the hand-eye calibration matrix may be used, so that a more comprehensive and accurate system integrated error evaluation result can be obtained.
After the first difference and the second difference are determined, the camera integrated error of the image acquisition device can be determined by using the first difference, the second difference and a preset relationship.
Wherein, the first difference value and the second difference value can be respectively expressed as:
XPixRange_S=XPixMax_S-XPixMin_S
YPixRange_S=YPixMax_S-YPixMin_S
wherein XPixRange_S is the first difference, XPixMax_S is the first maximum coordinate of the feature point of the target object on the first coordinate axis of the image coordinate system, and XPixMin_S is the first minimum coordinate of the feature point on the first coordinate axis; YPixRange_S is the second difference, YPixMax_S is the second maximum coordinate of the feature point on the second coordinate axis of the image coordinate system, and YPixMin_S is the second minimum coordinate of the feature point on the second coordinate axis.
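The two ranges can be computed directly from the pixel coordinates of the feature point extracted from the first set of images. A minimal sketch, with hypothetical coordinate values:

```python
# Pixel coordinates (x, y) of the feature point extracted from the first
# (static-shooting) set of images. The values here are hypothetical.
static_coords = [(412.3, 305.1), (412.6, 304.8), (412.1, 305.4), (412.5, 305.0)]

x_coords = [x for x, _ in static_coords]
y_coords = [y for _, y in static_coords]

# XPixRange_S = XPixMax_S - XPixMin_S; YPixRange_S = YPixMax_S - YPixMin_S
x_pix_range_s = max(x_coords) - min(x_coords)
y_pix_range_s = max(y_coords) - min(y_coords)
```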
Optionally, in a specific implementation manner, when the preset relationship is a hand-eye calibration matrix related to the image capturing device and the motion mechanism, step 13, determining the camera integrated error of the image capturing device by using the first difference value, the second difference value, and the preset relationship may include the following step 131:
step 131: determining a camera integrated error of the image acquisition device by using a first formula;
wherein, the first formula is:
wherein CIE is the camera integrated error of the image acquisition equipment; m is a hand-eye calibration matrix related to the image acquisition equipment and the motion mechanism; XPixRange_S is the first difference; ypixrange_s is the second difference.
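Since the first formula itself is rendered as an image in the original text, the sketch below only illustrates one plausible form: the pixel-range vector is mapped into the world coordinate system through the linear part of the hand-eye calibration matrix M, and the Euclidean length of the result is taken as CIE. Both the mapping rule and the matrix values are assumptions, not the patent's exact formula:

```python
import math

# Assumed form of the first formula: apply the upper-left 2x2 block of the
# hand-eye calibration matrix M (pixel -> mm) to the pixel-range vector and
# take the Euclidean length of the converted vector as the camera integrated
# error. This mapping rule is an assumption for illustration.
def cie_from_hand_eye(m, x_pix_range_s, y_pix_range_s):
    dx = m[0][0] * x_pix_range_s + m[0][1] * y_pix_range_s
    dy = m[1][0] * x_pix_range_s + m[1][1] * y_pix_range_s
    return math.hypot(dx, dy)

# Hypothetical matrix (a pure 0.02 mm/pixel scaling) and range values.
cie = cie_from_hand_eye([[0.02, 0.0], [0.0, 0.02]], 0.5, 0.6)
```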
Optionally, in a specific implementation manner, when the preset relationship is about single pixel precision of the image capturing device, step 13, determining the camera integrated error of the image capturing device by using the first difference value, the second difference value and the preset relationship may include the following step 132:
step 132: determining a camera integrated error of the image acquisition device by using a second formula;
wherein, the second formula is:
where PixAcc is the single pixel precision with respect to the image acquisition device.
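The second formula is likewise rendered as an image in the original text; the sketch below assumes CIE is the single-pixel precision times the Euclidean length of the pixel-range vector. Both the form of the formula and the numeric values are assumptions:

```python
import math

# Assumed form of the second formula: scale the Euclidean length of the
# pixel-range vector by the single-pixel precision (mm/pixel) to obtain the
# camera integrated error in world units. This form is an assumption.
def cie_from_pixel_precision(pix_acc, x_pix_range_s, y_pix_range_s):
    return pix_acc * math.hypot(x_pix_range_s, y_pix_range_s)

# Hypothetical single-pixel precision and range values.
cie = cie_from_pixel_precision(0.02, 0.5, 0.6)  # result in mm
```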
S103: determining a systematic translational error of the motion mechanism by using the camera integrated error, a third maximum coordinate and a third minimum coordinate of the feature points in the second group of images on the first coordinate axis, and a fourth maximum coordinate and a fourth minimum coordinate of the feature points in the second group of images on the second coordinate axis;
A plurality of image coordinates of the feature point of the target object in the image coordinate system may be determined using each of the images in the second set of images. And then, determining a third maximum coordinate and a third minimum coordinate of the characteristic point of the target object on a first coordinate axis of the image coordinate system and a fourth maximum coordinate and a fourth minimum coordinate of the characteristic point on a second coordinate axis of the image coordinate system, and determining a system translation error of the motion mechanism by using the third maximum coordinate, the third minimum coordinate, the fourth maximum coordinate and the fourth minimum coordinate.
When the second set of images is used to determine the system translation error of the motion mechanism, each image captured by the image acquisition device contains the camera integrated error of the device; therefore, the camera integrated error needs to be removed when determining the system translation error from the second set of images.
In the second set of images, the coordinate range of the feature point of the target object along each coordinate axis of the image coordinate system reflects the repeat positioning error; the absolute positioning error of the motion mechanism can be determined by using the plurality of groups of calibration coordinates used to calibrate the hand-eye calibration matrix.
Based on this, in an optional specific implementation manner, the step S103 may include the following steps 21 to 23:
step 21: calculating a difference value between a third maximum coordinate and a third minimum coordinate of the feature points on the first coordinate axis in the second group of images to obtain a third difference value;
step 22: calculating a difference value between a fourth maximum coordinate and a fourth minimum coordinate of the feature points on the second coordinate axis in the second group of images to obtain a fourth difference value;
step 23: determining a repeated positioning error of the motion mechanism by utilizing the camera integrated error, the third difference value, the fourth difference value and a preset relation, and determining a system translation error of the motion mechanism based on the repeated positioning error;
the preset relation comprises the following steps: a hand-eye calibration matrix for the image acquisition device and the motion mechanism, or a single pixel precision for the image acquisition device.
In this specific implementation manner, after the second group of images about the target object acquired by the image acquisition device is acquired, a difference value between a third maximum coordinate and a third minimum coordinate of the feature point of the target object on the first coordinate axis of the image coordinate system corresponding to the image acquisition device in the second group of images can be calculated, so as to obtain a third difference value; and calculating the difference value between the fourth maximum coordinate and the fourth minimum coordinate of the feature point on the second coordinate axis of the image coordinate system in the second group of images to obtain a fourth difference value.
Wherein, the third difference value and the fourth difference value can be respectively expressed as:
XPixRange_D=XPixMax_D-XPixMin_D
YPixRange_D=YPixMax_D-YPixMin_D
Wherein, XPixRange_D is the third difference value; XPixMax_D is the third maximum coordinate of the feature point of the target object on the first coordinate axis of the image coordinate system; XPixMin_D is the third minimum coordinate of the feature point of the target object on the first coordinate axis of the image coordinate system; YPixRange_D is the fourth difference value; YPixMax_D is the fourth maximum coordinate of the feature point of the target object on the second coordinate axis of the image coordinate system; and YPixMin_D is the fourth minimum coordinate of the feature point of the target object on the second coordinate axis of the image coordinate system.
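Steps 21 and 22 reduce to taking the coordinate range of the feature point over the second group of images. A minimal sketch (the coordinate values are illustrative, not from the patent):

```python
def pixel_ranges(coords):
    """Return (XPixRange_D, YPixRange_D): the ranges of the feature point's
    image coordinates over the second group of images.

    coords: list of (x, y) image coordinates, one per image.
    """
    xs = [x for x, _ in coords]
    ys = [y for _, y in coords]
    # XPixMax_D - XPixMin_D and YPixMax_D - YPixMin_D
    return max(xs) - min(xs), max(ys) - min(ys)

# Illustrative coordinates of one feature point across three images
x_range_d, y_range_d = pixel_ranges([(100.2, 200.1), (100.5, 199.8), (99.9, 200.4)])
```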
Because the third difference value and the fourth difference value are coordinate differences in the image coordinate system, while the system translation error to be calculated is an error in the world coordinate system, when the system translation error is calculated using the third difference value and the fourth difference value, a hand-eye calibration matrix relating the image acquisition device and the motion mechanism, or the single-pixel precision of the image acquisition device, can be used to calculate the system translation error of the motion mechanism from image coordinates in the image coordinate system.
When the images acquired by the image acquisition device are used to determine the system translation error of the motion mechanism, the calculation result contains the camera integrated error; therefore, the repeated positioning error of the motion mechanism can be determined using the camera integrated error, the third difference value, the fourth difference value, and the preset relationship.
Further, after the above-mentioned repeated positioning error is calculated, a systematic translational error of the movement mechanism may be determined based on the above-mentioned repeated positioning error.
The absolute positioning error of the motion mechanism can only be determined using the hand-eye calibration matrix, so it cannot be calculated when the machine vision system has not completed calibration or has not entered the debugging stage. In this case, when the preset relationship only includes the single-pixel precision of the image acquisition device, the repeated positioning error can be calculated using the single-pixel precision and taken as the system translation error of the motion mechanism.
Optionally, in a specific implementation manner, the determining the repeated positioning error of the movement mechanism by using the camera integrated error, the third difference value, the fourth difference value and the preset relationship in step 23 may include the following step 231:
step 231: determining a repeated positioning error of the motion mechanism by using a fourth formula;
wherein the fourth formula is:
wherein MRTE is the repeated positioning error of the motion mechanism.
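The body of the fourth formula is not reproduced in this excerpt. A hedged sketch of one plausible form, assuming the pixel ranges are converted to physical units with the single-pixel precision PixAcc, combined across the two axes, and reduced by the camera integrated error CIE contained in the measurement (this exact composition is an assumption, not the patent's formula):

```python
import math

def repeated_positioning_error(x_range_d, y_range_d, pix_acc, cie):
    """Hypothetical MRTE: pixel ranges scaled by the single-pixel precision,
    combined across axes, with the camera integrated error removed."""
    raw = math.hypot(x_range_d * pix_acc, y_range_d * pix_acc)
    return max(raw - cie, 0.0)  # clamp so the error is not negative
```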
When the machine vision system has completed calibration or has entered the debugging stage, that is, when the preset relationship includes a hand-eye calibration matrix for the image acquisition device and the motion mechanism, the hand-eye calibration matrix can be used to calculate both the repeated positioning error and the absolute positioning error, and the system translation error of the motion mechanism can then be calculated based on these two errors.
Based on this, the preset relationship may include: a hand-eye calibration matrix.
optionally, in a specific implementation manner, the determining the repeated positioning error of the movement mechanism by using the camera integrated error, the third difference value, the fourth difference value and the preset relationship in step 23 may include the following step 232:
determining a repeated positioning error of the motion mechanism by using a third formula;
wherein, the third formula is:
accordingly, in an optional specific implementation manner, the determining the systematic translational error of the motion mechanism based on the repeated positioning error in step 23 may include the following steps 233-234:
step 233: determining an absolute positioning error of the motion mechanism by utilizing the camera integrated error, the hand-eye calibration matrix and a plurality of groups of calibration coordinates used for determining the hand-eye calibration matrix;
step 234: and determining the systematic translational error of the motion mechanism based on the repeated positioning error and the absolute positioning error.
In this specific implementation manner, when the preset relationship includes a hand-eye calibration matrix, the absolute positioning error of the motion mechanism may be determined by using the camera integrated error, the hand-eye calibration matrix, and a plurality of sets of calibration coordinates for determining the hand-eye calibration matrix.
Wherein, each set of calibration coordinates used for determining the hand-eye calibration matrix can comprise world coordinates of the calibration object in a world coordinate system and image coordinates of the calibration point in an image coordinate system.
After the repeated positioning error and the absolute positioning error are obtained through calculation, the square sum of the repeated positioning error and the absolute positioning error can be calculated, and the system translation error of the motion mechanism is obtained.
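Reading "the square sum" above as the root of the sum of squares (which keeps the result in the same units as the two input errors; this interpretation is an assumption), the combination can be sketched as:

```python
import math

def system_translation_error(mrte, mate):
    """STLE combined from the repeated positioning error (MRTE) and the
    absolute positioning error (MATE), read as root-sum-square."""
    return math.sqrt(mrte ** 2 + mate ** 2)
```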
Optionally, in a specific implementation manner, the step 233 may include the following step 233A:
step 233A: determining an absolute positioning error of the motion mechanism by using a fifth formula;
wherein, the fifth formula is:
wherein MATE is the absolute positioning error of the motion mechanism; X_Wld_i is the X coordinate, in the world coordinate system, of the calibration point in the i-th set of calibration coordinates; X_Pix_i is the X coordinate, in the image coordinate system, of the calibration point in the i-th set of calibration coordinates; Y_Wld_i is the Y coordinate, in the world coordinate system, of the calibration point in the i-th set of calibration coordinates; Y_Pix_i is the Y coordinate, in the image coordinate system, of the calibration point in the i-th set of calibration coordinates; and n is the number of sets of calibration coordinates of the calibration point.
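The body of the fifth formula is likewise missing from this excerpt. A hedged sketch of step 233, assuming the hand-eye calibration matrix is a 2x3 affine mapping from image to world coordinates and that MATE is taken as the worst-case deviation between the calibrated world coordinates and the matrix-mapped image coordinates, reduced by the camera integrated error (both assumptions are illustrative, not the patent's formula):

```python
import math

def absolute_positioning_error(calib, affine, cie):
    """Hypothetical MATE.

    calib:  list of ((X_Wld_i, Y_Wld_i), (X_Pix_i, Y_Pix_i)) pairs.
    affine: 2x3 hand-eye matrix mapping image to world coordinates.
    cie:    camera integrated error to be removed from the measurement.
    """
    worst = 0.0
    for (xw, yw), (xp, yp) in calib:
        # Map the image coordinates into the world coordinate system
        mx = affine[0][0] * xp + affine[0][1] * yp + affine[0][2]
        my = affine[1][0] * xp + affine[1][1] * yp + affine[1][2]
        worst = max(worst, math.hypot(xw - mx, yw - my))
    return max(worst - cie, 0.0)
```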
Accordingly, the foregoing step 234 may include the following step 234A:
Step 234A: determining a systematic translational error of the motion mechanism using a sixth formula:
wherein, the sixth formula is:
wherein STLE is the systematic translational error of the motion mechanism.
S104: based on the camera integrated error and the system translation error, a system integrated error is determined with respect to the image capturing device and the motion mechanism.
After determining the camera integrated error and the system translation error, the system integrated error with respect to the image capturing device and the movement mechanism may be determined based on the camera integrated error and the system translation error.
Alternatively, after determining the camera integrated error and the system translational error, a sum of the camera integrated error and the system translational error may be calculated as the system integrated error with respect to the image capturing apparatus and the movement mechanism.
Wherein, the system integrated error can be expressed as:
STE=CIE+STLE
wherein STE is a systematic integrated error with respect to the image capturing device and the movement mechanism.
Optionally, after determining the camera integrated error and the system translation error, weights for the camera integrated error and the system translation error may be determined, and the system integrated error with respect to the image capturing device and the motion mechanism may then be calculated based on the camera integrated error, the system translation error, and their respective weights.
Wherein, the system integrated error can be expressed as:
STE=αCIE+βSTLE
wherein alpha is the weight of the camera integrated error; beta is the weight of the systematic translational error.
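The two forms of STE above differ only in the weights; with alpha = beta = 1 the weighted form reduces to the plain sum. A sketch:

```python
def system_integrated_error(cie, stle, alpha=1.0, beta=1.0):
    """STE = alpha*CIE + beta*STLE; the default weights give STE = CIE + STLE."""
    return alpha * cie + beta * stle
```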
As can be seen from the above, by applying the scheme provided by the embodiment of the present application, the first group of images and the second group of images of the target object acquired by the image acquisition device can be obtained by using the positional relationship and the relative motion among the image acquisition device, the target object, and the motion mechanism. The camera integrated error and the system translation error of the system comprising the image acquisition device and the motion mechanism are then determined using the two groups of images respectively, and the system integrated error is determined from the camera integrated error and the system translation error. Because the errors of all components in the system are comprehensively considered, the accuracy of the determined system integrated error is improved.
In addition, when the motion mechanism includes a rotation shaft, the system integrated error may further include a system rotation error of the motion mechanism.
Based on this, as shown in fig. 3, the error evaluation method provided in the embodiment of the present application may further include the following steps S105 to S106 before the step S104:
S105: acquiring a third group of images about the target object acquired by the image acquisition device;
wherein the third group of images is obtained by controlling the image acquisition device to acquire an image of the target object each time the rotation shaft rotates by a specified angle, during a process in which the rotation shaft performs a plurality of continuous rotations in a first direction so that the target object performs a plurality of continuous rotations in a second direction relative to the image acquisition device;
s106: determining a system rotation error of the motion mechanism by using the rotation axis length of the rotation axis and the maximum value of the difference value between the designated angle and the angle variation of the characteristic points in the third group of images;
accordingly, the step S104 may include the following step S1041:
s1041: based on the camera integrated error, the system translational error, and the system rotational error, a system integrated error is determined for the image acquisition device and the motion mechanism.
In this specific implementation manner, when the motion mechanism includes a rotation shaft, the motion mechanism can perform rotational motion. Therefore, in order to calculate the system rotation error of the motion mechanism, the motion mechanism may be controlled to perform a plurality of continuous rotations in the first direction, so that the target object performs a plurality of continuous rotations in the second direction relative to the image capturing device; during this rotation, each time the motion mechanism rotates by the specified angle, the image capturing device may be controlled to capture an image of the target object, thereby obtaining the third group of images of the target object.
The first direction and the second direction may be the same or different.
Specifically, if the positional relationship between the motion mechanism and the image acquisition device is eye-in-hand (the eye is on the hand), the motion mechanism can be controlled to carry the image acquisition device through a plurality of continuous rotations in the first direction, so that the target object rotates a plurality of times in the second direction relative to the image acquisition device; in this case, the first direction and the second direction are the same direction.
If the positional relationship between the motion mechanism and the image acquisition device is eye-to-hand (the eye is outside the hand), the motion mechanism can be controlled to carry the target object through a plurality of continuous rotations in the first direction, so that the target object rotates a plurality of times in the second direction relative to the image acquisition device; in this case, the first direction and the second direction are different directions.
Thus, while controlling the motion mechanism to rotate, the image capturing device may be controlled to capture one image of the third group each time the motion mechanism rotates by the specified angle; the motion mechanism may be controlled to stop rotating either when the number of images captured by the image capturing device reaches a preset number, or after the motion mechanism has rotated through a specified number of specified angles.
The specified angle, the preset number, and the specified number may be set according to actual needs; for example, the specified angle may be 10 degrees or 15 degrees, the preset number may be 5 or 10, and the specified number may be 5 or 10, all of which are reasonable and are not specifically limited in the embodiment of the present application.
After the third group of images is acquired, for each image in the third group of images, an angle change amount of the feature point of the target object in the image may be determined, and further, a maximum value of differences between the specified angle and the angle change amount of the feature point of the target object in the third group of images may be determined. The rotation axis length of the rotation axis of the movement mechanism may be determined, and then, the system rotation error of the movement mechanism may be determined using the maximum value and the rotation axis length of the movement mechanism.
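The maximum deviation used in step S106 can be sketched as follows, where measured_changes holds the per-image angle change of the feature point extracted from the third group of images (the values below are illustrative):

```python
def max_angle_deviation(specified_angle, measured_changes):
    """deltaR: the maximum difference between the specified rotation angle
    and the measured angle change of the feature point, over the third group."""
    return max(abs(specified_angle - m) for m in measured_changes)

# Illustrative: specified angle 10 degrees, three measured angle changes
delta_r = max_angle_deviation(10.0, [9.7, 10.2, 10.05])
```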
The system rotation error can comprise a rotation axis rotation error and a rotation axis calibration error of the motion mechanism. Because the rotation axis calibration error of the motion mechanism can only be determined by using the hand-eye calibration matrix, when the machine vision system does not complete calibration or does not enter a debugging stage, the rotation axis calibration error cannot be calculated, in this case, the rotation axis length of the motion mechanism and the maximum value can be used to determine the rotation axis rotation error, and the rotation axis rotation error can be used as the system rotation error.
The rotation error of the rotation shaft of the motion mechanism can be expressed as:
wherein L is the rotation axis length; deltaR is the maximum value of the difference between the specified angle and the angle variation of the feature point of the target object in the third group of images.
Alternatively, when the center of rotation of the rotation shaft of the movement mechanism is coaxial with the centroid of the end effector, the rotation error of the rotation shaft may be determined using the maximum value and the rotation axis length of the movement mechanism, and the rotation error of the rotation shaft may be used as the system rotation error.
When the machine vision system has completed calibration or has entered the commissioning phase, and when the rotational axis center of the rotational axis of the motion mechanism is not coaxial with the center of mass of the end effector, the rotational axis calibration error may be determined using the hand-eye calibration matrix, whereby the system rotation error of the motion mechanism may be determined based on the rotational axis rotation error and the rotational axis calibration error.
Based on this, in an optional specific implementation manner, before S105, the error evaluation method provided by the embodiment of the present application may further include the following step 31:
step 31: determining a rotation axis calibration error of the motion mechanism by using a hand-eye calibration matrix related to the image acquisition equipment and the motion mechanism;
Correspondingly, the step S106 may include the following steps 32-33:
step 32: determining a rotation error of the rotation shaft of the motion mechanism by using the rotation shaft length of the rotation shaft and the maximum value of the difference value between the designated angle and the angle variation of the characteristic points in the third group of images;
step 33: and determining the system rotation error of the motion mechanism by using the rotation error of the rotation shaft and the calibration error of the rotation shaft.
In this particular implementation, the rotation axis calibration error of the motion mechanism may be determined using a hand-eye calibration matrix for the image acquisition device and the motion mechanism.
Alternatively, hand-eye calibration of the image acquisition device and the motion mechanism may be performed using a 12-point calibration method, and the rotation axis length may be determined using the resulting hand-eye calibration matrix and the physical coordinates of the rotation center of the motion mechanism.
Specifically, hand-eye calibration may be performed using designated calibration points to obtain a fitted rotation circle; the fitting error of this fitted rotation circle of the motion mechanism is the rotation axis calibration error.
The specified calibration points may be a plurality of calibration points among 12 calibration points, for example, the last three points among 12 calibration points, the last two points among 12 calibration points, all 12 calibration points, etc., which are reasonable, and in this specific implementation, no specific limitation is made.
After determining the rotation axis calibration error of the movement mechanism, a systematic rotation error of the movement mechanism may be determined based on the rotation axis rotation error and the rotation axis calibration error.
Optionally, in a specific implementation manner, the step 33 may include the following step 331:
step 331: determining a system rotation error of the motion mechanism by using a seventh formula;
wherein, the seventh formula is:
in order to facilitate understanding of the systematic rotation error of the movement mechanism, the systematic rotation error of the movement mechanism can be described with reference to fig. 4.
As shown in fig. 4, the point P is the actual position of the feature point of the target object after the motion mechanism rotates by the specified angle, and the point P′ is the position determined from calibration; the length of the line connecting the two points is the system rotation error of the motion mechanism.
The rotation center is set as a point O, the real length of the rotation shaft is set as OP, the calibrated length of the rotation shaft obtained by calibration is set as a line segment OP ', the intersection point of the straight line OP' and the circle OP is set as a point C, and the maximum value in the difference value between the designated angle and the angle variation of the characteristic point in the third group of images is set as deltaR.
In the figure, the length of the line segment CP is the rotation axis rotation error MRAE of the motion mechanism, the length of the line segment CP 'is the rotation axis calibration error MRCE of the motion mechanism, and the length of the line segment P' P is the system rotation error SRE of the motion mechanism.
From trigonometric function knowledge, it can be seen that:
∠P′CP = 180 - (180 - deltaR)/2 = 90 + deltaR/2
thus, the SRE may be expressed as:
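The body of the seventh formula is not reproduced above, but the fig. 4 geometry determines it: SRE is the side P′P of triangle P′CP, whose other two sides are MRAE (segment CP) and MRCE (segment CP′) and whose included angle at C is 90 + deltaR/2 degrees, so the law of cosines applies. A sketch under that reading (the closed form is reconstructed from the described geometry, not copied from the patent):

```python
import math

def system_rotation_error(mrae, mrce, delta_r_deg):
    """SRE from the rotation axis rotation error (MRAE), the rotation axis
    calibration error (MRCE), and deltaR, via the law of cosines in
    triangle P'CP with included angle 90 + deltaR/2 degrees."""
    ang = math.radians(90.0 + delta_r_deg / 2.0)
    return math.sqrt(mrae ** 2 + mrce ** 2 - 2.0 * mrae * mrce * math.cos(ang))
```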
After determining the system rotation error, the system integrated error with respect to the image capturing device and the motion mechanism may be determined based on the camera integrated error, the system translation error, and the system rotation error.
Optionally, after determining the camera integrated error, the system translation error, and the system rotation error, a sum of the camera integrated error, the system translation error, and the system rotation error may be calculated as the system integrated error with respect to the image capturing device and the motion mechanism.
Wherein, the system integrated error can be expressed as:
STE=CIE+STLE+SRE
Optionally, after determining the camera integrated error, the system translation error, and the system rotation error, weights for the three errors may be determined, and the system integrated error with respect to the image capturing device and the motion mechanism may then be calculated based on the camera integrated error, the system translation error, the system rotation error, and their respective weights.
Wherein, the system integrated error can be expressed as:
STE=αCIE+βSTLE+γSRE
Wherein, gamma is the weight of the system rotation error.
Since the above-described system rotation error is caused by the insufficient accuracy of the image extraction algorithm and the insufficient accuracy of the mechanism rotation, the system rotation error is affected by the angle extraction accuracy error and the rotation accuracy error of the rotation shaft.
In some cases, a user may need to analyze the factors influencing the system rotation error. For example, when the system rotation error is large, the respective contributions of the angle extraction precision error and the rotation precision error of the rotation shaft may be analyzed; to this end, the angle extraction precision error and the rotation precision error of the rotation shaft may be calculated separately, so that error analysis and parameter adjustment can be performed based on the calculation results.
Optionally, in a specific implementation manner, the determining manner of the angle extraction precision error may include the following steps 41-42:
step 41: calculating the difference value between the maximum angle and the minimum angle of the feature points in the first group of images to obtain an angle difference;
step 42: determining an angle extraction accuracy error with respect to angle extraction of the feature points in the first group of images using the angle difference and the rotation axis length;
Accordingly, the determining manner of the rotation precision error may include the following step 43:
step 43: and determining the rotation precision error of the motion mechanism by utilizing the angle difference, the maximum value and the rotation axis length.
In this specific implementation manner, when an angle is extracted from an image, the extracted angle is affected by an angle extraction precision error and exhibits a certain deviation, under the influence of factors such as the accuracy of the image feature point extraction algorithm, image edge quality, and lens distortion.
Based on this, in order to determine the above-described angle extraction accuracy error, the maximum angle and the minimum angle of the feature point of the target object may be determined by using each image in the first group of images after the first group of images is acquired.
Wherein, for each point on the target object, in the first set of images, the maximum angle of each point is the same and the minimum angle of each point is the same.
After determining the maximum angle and the minimum angle of the feature points of the target object in the first set of images, the difference between the maximum angle and the minimum angle can be calculated to obtain an angle difference.
Wherein, the above angle difference can be expressed as:
RPixRange_S=RPixMax_S-RPixMin_S
Wherein RPixRange_S is the angle difference; RPixMax_S is the maximum angle of the feature point of the target object in the first group of images; RPixMin_S is the minimum angle of the feature point of the target object in the first group of images.
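Step 41 is again a range computation, this time over the extracted angles of the feature point in the first group of images (the angle values below are illustrative):

```python
def angle_range(angles):
    """RPixRange_S = RPixMax_S - RPixMin_S over the first group of images."""
    return max(angles) - min(angles)

# Illustrative extracted angles of the feature point in three static images
r_pix_range_s = angle_range([29.9, 30.1, 30.0])
```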
Then, an angle extraction accuracy error with respect to angle extraction of the feature points in the first group of images can be determined using the above-described angle difference and the rotation axis length of the movement mechanism.
The above error in the accuracy of angle extraction with respect to angle extraction of the feature points in the first group of images may be expressed as:
the rotation angle of the movement mechanism is affected by the rotation accuracy error of the movement mechanism, and there is a certain deviation.
Based on this, in order to determine the rotation precision error, the rotation precision error of the motion mechanism can be determined using the angle difference, the maximum value of the difference between the specified angle and the angle variation of the feature points in the third group of images, and the rotation axis length.
In calculating the rotation accuracy error using an image, the influence of the angle extraction accuracy error is also considered, and thus, the rotation accuracy error with respect to the movement mechanism can be expressed as:
after the angle extraction precision error and the rotation precision error are calculated, the angle extraction precision error and the rotation precision error can be output, so that a user can analyze the system rotation error based on the angle extraction precision error and the rotation precision error and adjust various parameters of a machine vision system.
The system integrated error of the machine vision system may include a teaching error (System Teaching Error, STHE) of the movement mechanism.
The teaching error refers to an error caused by factors such as resolution of human eyes and a manual operation error when setting a reference motion pose of a motion mechanism. The reference motion gesture may include setting a grabbing gesture, a placing gesture, or a fitting gesture.
Typically, the teaching error may be determined by metrology tool measurements or image sensor review. For example, after a worker controls the movement mechanism to place the material at the reference position in the tray, the material and the tray are kept still, images on the center of the material and the center of the tray are collected by using the camera, and then the coordinate deviation amount of the center of the material and the center of the tray is calculated by using the collected images, so that a teaching error is obtained.
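The image-review measurement described above can be sketched as a center-deviation computation; the centers would come from feature extraction on the acquired image, and converting the pixel deviation to physical units with the single-pixel precision is an assumption made for illustration:

```python
import math

def teaching_error(material_center, tray_center, pix_acc):
    """Teaching error as the physical distance between the material center
    and the tray center, both given in image (pixel) coordinates."""
    dx = material_center[0] - tray_center[0]
    dy = material_center[1] - tray_center[1]
    return math.hypot(dx, dy) * pix_acc
```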
Optionally, after the teaching error is obtained, the teaching error may be compensated.
In the embodiment of the application, in order to improve the accuracy of the determined system integrated error, the influence of the teaching error on the system integrated error can be considered when calculating the system integrated error.
Based on this, in an optional specific implementation manner, before determining the integrated error of the system about the image capturing device and the motion mechanism based on the integrated error of the camera and the translational error of the system in step S104, the error evaluation method provided in the embodiment of the present application may further include the following step 51:
step 51: determining a teaching error of the motion mechanism;
accordingly, the step S104 of determining the system integrated error regarding the image capturing device and the motion mechanism based on the camera integrated error and the system translation error may include the step 52 of:
step 52: based on the camera integrated error, the system translation error, and the teaching error, a system integrated error with respect to the image capturing device and the movement mechanism is determined.
In this particular implementation, in order to improve the accuracy of the determined systematic integrated error, a teaching error of the motion mechanism may also be determined.
Wherein, optionally, the equipment parameter used for indicating the teaching error can be read from the equipment parameters of the machine vision system;
alternatively, the teaching error of the motion mechanism can be determined by a teaching error calibration model.
Alternatively, after the target object is moved to the reference position by manually controlling the motion mechanism, the target object may be kept stationary, and the coordinate deviation may then be measured using an image of the target object acquired by the image acquisition device, thereby obtaining the teaching error.
After determining the teaching error described above, a systematic integrated error with respect to the image capturing apparatus and the movement mechanism may be determined based on the camera integrated error, the systematic translational error, and the teaching error.
Wherein, the system integrated error can be expressed as:
STE=CIE+STLE+STHE
alternatively, when the above-described movement mechanism includes the rotation axis, the systematic integrated error with respect to the image capturing apparatus and the movement mechanism may be determined based on the camera integrated error, the systematic translational error, the systematic rotational error, and the teaching error.
Wherein, the system integrated error can be expressed as:
STE=CIE+STLE+SRE+STHE
in order to facilitate understanding of the error evaluation method according to the embodiment of the present application, a specific implementation flow and effect of the error evaluation method will be described below with reference to fig. 5.
For a machine vision system with respect to a camera and a motion mechanism, the error assessment program may perform the steps shown in fig. 5 to calculate the system integrated error for the system.
As shown in fig. 5, the error evaluation flow may include the following steps S501 to S504:
s501: the camera static test is carried out, the coordinate polar difference is obtained, and the camera comprehensive error is calculated;
s502: the mechanism is dynamically tested, the coordinate polar difference is obtained, and the translational error of the system is calculated;
s503: the rotation test of the mechanism is carried out, the angular range and the length of the rotating shaft are obtained, and the rotation error of the system is calculated;
S504: and (3) mechanism teaching test, acquiring teaching errors, and calculating system teaching errors.
When the camera static test is carried out, the target object can be placed at a fixed position within the image acquisition range of the camera, and the camera position is kept unchanged. The camera may then be controlled to acquire images of the target object several times in succession (typically not fewer than 100 times), resulting in a first set of images relating to the target object. Then, using the first maximum coordinate and the first minimum coordinate of the feature point of the target object on the first coordinate axis of the image coordinate system corresponding to the image acquisition device, and the second maximum coordinate and the second minimum coordinate of the feature point on the second coordinate axis of that image coordinate system, in the first set of images, the coordinate range (maximum minus minimum) of the feature point on each coordinate axis is obtained, and the camera integrated error is calculated from these coordinate ranges.
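The coordinate-range computation used in the static test can be sketched as follows; the feature-point extraction itself is outside the scope of this sketch, and all names are illustrative.

```python
def coordinate_ranges(points):
    """Return the coordinate range (max minus min) of a feature point on
    each image axis over a set of repeated acquisitions.

    `points` holds the (x, y) pixel coordinates of the same feature
    point extracted from every image in the static-test set.
    """
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return max(xs) - min(xs), max(ys) - min(ys)
```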
When the mechanism dynamic test is carried out, if the camera is mounted on the motion mechanism, a target object can be placed at a fixed position within the image acquisition range of the camera. The motion mechanism is then controlled to move repeatedly from the initial position to the target position, so that the image acquisition device and the target object are at the specified relative position, and each time the camera reaches the target position, the camera is controlled to acquire an image of the target object, resulting in a second set of images relating to the target object.
If the camera is mounted outside the motion mechanism and its position relative to the base of the motion mechanism is fixed, the motion mechanism can likewise be controlled to move repeatedly from the initial position to the target position, so that the image acquisition device and the target object are at the specified relative position, and each time the mechanism reaches the target position, the camera is controlled to acquire an image of the target object, resulting in the second set of images relating to the target object.
After the second set of images is obtained, the third maximum coordinate and the third minimum coordinate of the feature point on the first coordinate axis, and the fourth maximum coordinate and the fourth minimum coordinate of the feature point on the second coordinate axis, in the second set of images, can be used to obtain the coordinate range of the feature point on each coordinate axis of the image coordinate system, and the system translational error is determined using these coordinate ranges together with the camera integrated error.
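One possible reading of this dynamic-test step is sketched below: the pixel coordinate ranges of the second image set are converted to physical units through a single-pixel-precision scale, and the camera integrated error is then removed so that only the mechanism's repeat-positioning contribution remains. The patent text does not reproduce the underlying formula, so both the Euclidean combination and the subtraction are assumptions, and all names are illustrative.

```python
import math

def repeat_positioning_error(x_range, y_range, scale, cie):
    """Hypothetical repeat-positioning error (MRTE) of the motion
    mechanism: scale the combined pixel ranges of the dynamic-test
    images to physical units, then subtract the camera integrated
    error measured in the static test.
    """
    return scale * math.hypot(x_range, y_range) - cie
```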
When the mechanism rotation test is performed, if the camera is mounted on the motion mechanism, a target object is placed at a fixed position within the image acquisition range of the camera. The motion mechanism is then controlled to rotate by a fixed angle, clockwise or counterclockwise from the initial position, several times in succession (generally not fewer than three), and each time the motion mechanism completes a rotation by the fixed angle, the camera is controlled to acquire an image of the target object, resulting in a third set of images relating to the target object.
If the camera is mounted outside the motion mechanism and its position relative to the base of the motion mechanism is fixed, the motion mechanism can likewise be controlled to rotate by a fixed angle, clockwise or counterclockwise from the initial position, several times in succession (generally not fewer than three), and each time the motion mechanism completes a rotation by the fixed angle, the camera is controlled to acquire an image of the target object, resulting in the third set of images relating to the target object.
After the third set of images is obtained, the rotation-axis length and the angle range of the motion mechanism can be obtained, and the system rotation error of the motion mechanism can be determined.
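The rotation-test step can be sketched as follows: the worst-case angular deviation (the maximum difference between the commanded angle and the measured angle change of the feature point) is converted into a positional error at the end of the rotation axis as an arc length. The patent does not reproduce its rotation-error formula, so this arc-length interpretation is an assumption, and the names are illustrative.

```python
import math

def rotation_error(axis_length, max_angle_deviation_deg):
    """Hypothetical rotation error of the rotation shaft: arc length
    swept at radius `axis_length` by the worst-case angular deviation
    observed over the rotation-test images (degrees converted to
    radians).
    """
    return axis_length * math.radians(max_angle_deviation_deg)
```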
Finally, the teaching error of the system can be obtained through the mechanism teaching test. The system integrated error of the machine vision system is then calculated using the camera integrated error, the system translational error, the system rotation error, and the teaching error.
Further, after the camera integrated error, the system translational error, the system rotation error, the system teaching error, and the system integrated error have been calculated using the first, second, and third sets of images, each of these errors can be output in a preset display interface, so that a user can learn the system integrated error of the system formed by the camera and the motion mechanism.
Based on the same inventive concept, the embodiment of the present application further provides an error evaluation device, corresponding to the error evaluation method shown in fig. 1 provided in the above embodiment of the present application.
Fig. 6 is a schematic structural diagram of an error assessment device according to an embodiment of the present application, as shown in fig. 6, the device may include the following modules:
an image acquisition module 610 for acquiring a first set of images and a second set of images about a target object acquired by an image acquisition device; wherein the first set of images is acquired by the image acquisition device at a first position for the target object at a second position; the second set of images is acquired by the image acquisition device each time the motion mechanism moves from an initial position to a target position so that the image acquisition device and the target object are at a specified relative position;
a first determining module 620, configured to determine a camera integrated error of the image capturing device using a first maximum coordinate and a first minimum coordinate of a feature point of the target object in a first coordinate axis of an image coordinate system corresponding to the image capturing device, and a second maximum coordinate and a second minimum coordinate of the feature point in a second coordinate axis of the image coordinate system in the first set of images;
A second determining module 630, configured to determine a systematic translational error of the motion mechanism using the camera integrated error, a third maximum coordinate and a third minimum coordinate of the feature point in the second set of images on the first coordinate axis, and a fourth maximum coordinate and a fourth minimum coordinate of the feature point in the second set of images on the second coordinate axis;
an integrated error determination module 640 for determining a systematic integrated error with respect to the image acquisition device and the motion mechanism based on the camera integrated error and the systematic translational error.
It can be seen from the above that, by applying the scheme provided by the embodiment of the present application, the first and second sets of images about the target object can be acquired by exploiting the positional relationship and the relative motion among the image acquisition device, the target object, and the motion mechanism. The two sets of images are then used to determine, respectively, the camera integrated error and the system translational error of the system comprising the image acquisition device and the motion mechanism, and the system integrated error is determined from the camera integrated error and the system translational error. Because the errors of all components of the system are comprehensively considered, the accuracy of the determined system integrated error is improved.
Optionally, in a specific implementation manner, the apparatus further includes:
a third determining module for acquiring a third set of images acquired by the image acquisition device with respect to the target object before determining a systematic integrated error with respect to the image acquisition device and the movement mechanism based on the camera integrated error and the systematic translation error; wherein the image capturing device captures the target object at each rotation of the rotation axis by a specified angle in a process in which the rotation axis is continuously rotated a plurality of times in a first direction so that the target object is continuously rotated a plurality of times in a second direction with respect to the image capturing device;
a system rotation error determining module, configured to determine a system rotation error of the motion mechanism using a rotation axis length of the rotation axis and a maximum value of differences between the specified angle and the angle variation of the feature points in the third group of images;
the integrated error determination module 640 is specifically configured to:
a systematic integrated error is determined for the image acquisition device and the motion mechanism based on the camera integrated error, the systematic translational error, and the systematic rotational error.
Optionally, in a specific implementation manner, the first determining module 620 includes:
the first computing sub-module is used for computing the difference value between the first maximum coordinate and the first minimum coordinate of the characteristic point of the target object on the first coordinate axis of the image coordinate system corresponding to the image acquisition equipment in the first group of images to obtain a first difference value;
the second calculation sub-module is used for calculating the difference value between the second maximum coordinate and the second minimum coordinate of the characteristic points on the second coordinate axis of the image coordinate system in the first group of images to obtain a second difference value;
the first determining submodule is used for determining the camera integrated error of the image acquisition equipment by utilizing the first difference value, the second difference value and a preset relation; wherein, the preset relation includes: a hand-eye calibration matrix for the image acquisition device and the motion mechanism, or a single pixel precision for the image acquisition device.
Optionally, in a specific implementation manner, the second determining module 630 includes:
a third calculation sub-module, configured to calculate a difference value between a third maximum coordinate and a third minimum coordinate of the feature point on the first coordinate axis in the second set of images, to obtain a third difference value;
A fourth computing sub-module, configured to compute a difference value between a fourth maximum coordinate and a fourth minimum coordinate of the feature point on the second coordinate axis in the second set of images, to obtain a fourth difference value;
the second determining submodule is used for determining repeated positioning errors of the moving mechanism by utilizing the camera integrated errors, the third difference value, the fourth difference value and a preset relation and determining systematic translation errors of the moving mechanism based on the repeated positioning errors; wherein, the preset relation includes: a hand-eye calibration matrix for the image acquisition device and the motion mechanism, or a single pixel precision for the image acquisition device.
Optionally, in a specific implementation manner, the preset relationship includes: the hand-eye calibration matrix; the first determination submodule includes:
the third determining submodule is used for determining the absolute positioning error of the motion mechanism by utilizing the camera integrated error, the hand-eye calibration matrix, and a plurality of sets of calibration coordinates used for determining the hand-eye calibration matrix;
and a fourth determination sub-module for determining a systematic translational error of the motion mechanism based on the repeated positioning error and the absolute positioning error.
Optionally, in a specific implementation manner, a system rotation error of the motion mechanism is affected by an angle extraction precision error and a rotation precision error of the rotation shaft; the device also comprises a fourth determining module and a fifth determining module;
the fourth determining module is specifically configured to calculate a difference value between a maximum angle and a minimum angle of the feature points in the first set of images, so as to obtain an angle difference; determining an angle extraction accuracy error with respect to angle extraction of the feature points in the first group of images using the angle difference and the rotation axis length;
the fifth determining module is specifically configured to determine a rotation accuracy error about the rotation mechanism using the angle difference, the maximum value, and the rotation axis length.
Optionally, in a specific implementation manner, the apparatus further includes:
a sixth determining module configured to determine a rotation axis calibration error of the moving mechanism using a hand-eye calibration matrix with respect to the image capturing apparatus and the moving mechanism before determining a systematic rotation error of the moving mechanism using a rotation axis length of the rotation axis and a maximum value of differences in angle variation of the specified angle and the feature points in the third group of images;
The system rotation error determination module includes:
a fifth determining sub-module for determining a rotation error of the rotation shaft of the movement mechanism using a rotation shaft length of the rotation shaft and a maximum value among differences of the specified angle and the angle variation amounts of the feature points in the third group of images;
and the sixth determining submodule is used for determining the system rotation error of the motion mechanism by utilizing the rotation error of the rotating shaft and the calibration error of the rotating shaft.
Optionally, in a specific implementation manner, the apparatus further includes:
a seventh determining module for determining a teaching error of the motion mechanism before determining a systematic integrated error with respect to the image capturing device and the motion mechanism based on the camera integrated error and the systematic translational error;
the integrated error determination module 640 is specifically configured to:
based on the camera integrated error, the system translation error, the system rotation error, and the teaching error, a system integrated error with respect to the image acquisition device and the motion mechanism is determined.
Optionally, in a specific implementation manner, the first determining submodule is specifically configured to:
Determining a camera integrated error of the image acquisition device by using a first formula;
wherein, the first formula is:
wherein CIE is the camera integrated error of the image acquisition device; M is a hand-eye calibration matrix relating to the image acquisition device and the motion mechanism; XPixRange_S is the first difference; YPixRange_S is the second difference;
or,
determining a camera integrated error of the image acquisition device by using a second formula;
wherein the second formula is:
where PixAcc is the single pixel precision with respect to the image acquisition device.
Optionally, in a specific implementation manner, the second determining submodule is specifically configured to:
determining a repeated positioning error of the motion mechanism by using a third formula;
wherein the third formula is:
wherein MRTE is the repeated positioning error of the motion mechanism; XPixRange_D is the third difference; YPixRange_D is the fourth difference; M is the hand-eye calibration matrix relating to the image acquisition device and the motion mechanism; CIE is the camera integrated error;
or determining a repeated positioning error of the motion mechanism by using a fourth formula;
wherein the fourth formula is:
Where PixAcc is the single pixel precision with respect to the image acquisition device.
Optionally, in a specific implementation manner, the third determining submodule is specifically configured to:
determining an absolute positioning error of the motion mechanism by using a fifth formula;
wherein the fifth formula is:
wherein MATE is the absolute positioning error of the motion mechanism; X_Wld_i is the X coordinate of the calibration point in the i-th set of calibration coordinates in the world coordinate system; X_Pix_i is the X coordinate of the calibration point in the i-th set of calibration coordinates in the image coordinate system; Y_Wld_i is the Y coordinate of the calibration point in the i-th set of calibration coordinates in the world coordinate system; Y_Pix_i is the Y coordinate of the calibration point in the i-th set of calibration coordinates in the image coordinate system; n is the number of sets of calibration coordinates of the calibration points;
the fourth determination submodule is specifically configured to:
determining a systematic translational error of the motion mechanism using a sixth formula:
wherein the sixth formula is:
wherein STLE is the systematic translational error of the motion mechanism; MRTE is the repeated positioning error of the motion mechanism; MATE is the absolute positioning error of the motion mechanism.
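Since the fifth and sixth formulas are not reproduced in the text, the sketch below shows one plausible implementation of these two steps: the absolute positioning error is taken as the RMS residual between the calibration points' world coordinates and the world coordinates predicted from their pixel coordinates via the hand-eye mapping, and the system translational error combines MRTE and MATE additively, matching the additive style of the STE expression. Both aggregations, and all names, are assumptions.

```python
import math

def absolute_positioning_error(world_pts, pixel_pts, pix_to_world):
    """MATE sketch: RMS distance between each calibration point's world
    coordinates and the world coordinates predicted from its pixel
    coordinates by the hand-eye calibration mapping `pix_to_world`.
    The RMS aggregation is an assumption.
    """
    sq = []
    for (xw, yw), (xp, yp) in zip(world_pts, pixel_pts):
        px, py = pix_to_world(xp, yp)
        sq.append((px - xw) ** 2 + (py - yw) ** 2)
    return math.sqrt(sum(sq) / len(sq))

def systematic_translational_error(mrte, mate):
    """STLE sketch: additive combination of the repeated positioning
    error and the absolute positioning error (an assumption).
    """
    return mrte + mate
```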
Optionally, in a specific implementation manner, the sixth determining submodule is specifically configured to:
Determining a system rotation error of the motion mechanism by using a seventh formula;
wherein the seventh formula is:
wherein SRE is the system rotation error of the motion mechanism; MRAE is the rotation error of the rotation shaft; MRCE is the preset rotation-axis calibration error of the motion mechanism.
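The seventh formula is not reproduced in the text, so, following the additive style of the other composition expressions, the system rotation error is sketched below as the sum of MRAE and MRCE. This combination, like the function name, is an assumption.

```python
def system_rotation_error(mrae, mrce):
    """SRE sketch: combine the rotation-shaft rotation error (MRAE)
    and the preset rotation-axis calibration error (MRCE) additively
    (an assumption; the patent's seventh formula is not reproduced).
    """
    return mrae + mrce
```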
The embodiment of the application also provides an electronic device, as shown in fig. 7, including:
a memory 701 for storing a computer program;
the processor 702 is configured to implement any one of the steps of the error assessment method provided in the embodiment of the present application when executing the program stored in the memory 701.
The electronic device may further include a communication bus and/or a communication interface; the processor 702, the communication interface, and the memory 701 communicate with each other via the communication bus.
The communication bus mentioned for the above electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one bold line is shown in the figures, but this does not mean there is only one bus or only one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The Memory may include random access Memory (Random Access Memory, RAM) or may include Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), etc.; but also digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components.
In yet another embodiment of the present application, there is also provided a computer readable storage medium having stored therein a computer program which, when executed by a processor, implements the steps of any of the error assessment methods described above.
In yet another embodiment of the present application, a computer program product containing instructions that, when run on a computer, cause the computer to perform any of the error assessment methods of the above embodiments is also provided.
In the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a Solid State Disk (SSD), among others.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In this specification, each embodiment is described in a related manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for the apparatus embodiments, the electronic device embodiments, and the storable medium embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to a partial description of the method embodiments for matters.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application are included in the protection scope of the present application.
Claims (16)
1. A method of error assessment, the method comprising:
acquiring a first set of images and a second set of images about a target object acquired by an image acquisition device; wherein the first set of images is acquired by the image acquisition device at a first position for the target object at a second position; the second set of images is acquired by the image acquisition device each time the motion mechanism moves from an initial position to a target position so that the image acquisition device and the target object are at a specified relative position;
determining a camera integrated error of the image acquisition device by using a first maximum coordinate and a first minimum coordinate of a feature point of the target object on a first coordinate axis of an image coordinate system corresponding to the image acquisition device, and a second maximum coordinate and a second minimum coordinate of the feature point on a second coordinate axis of the image coordinate system, in the first set of images;
Determining a system translation error of the motion mechanism by using the camera integrated error, a third maximum coordinate and a third minimum coordinate of the feature point in the second group of images on the first coordinate axis, and a fourth maximum coordinate and a fourth minimum coordinate of the feature point in the second group of images on the second coordinate axis;
based on the camera integrated error and the system translation error, a system integrated error is determined with respect to the image acquisition device and the motion mechanism.
2. The method of claim 1, wherein the movement mechanism comprises a rotational axis; before said determining a systematic integrated error with respect to said image acquisition device and said motion mechanism based on said camera integrated error and said systematic translational error, said method further comprises:
acquiring a third group of images about the target object acquired by the image acquisition device; wherein the image capturing device captures the target object at each rotation of the rotation axis by a specified angle in a process in which the rotation axis is continuously rotated a plurality of times in a first direction so that the target object is continuously rotated a plurality of times in a second direction with respect to the image capturing device;
Determining a systematic rotation error of the motion mechanism by using a rotation axis length of the rotation axis and a maximum value of differences between the specified angle and the angle variation of the feature points in the third group of images;
the determining a system integrated error with respect to the image acquisition device and the motion mechanism based on the camera integrated error and the system translation error, comprising:
a systematic integrated error is determined for the image acquisition device and the motion mechanism based on the camera integrated error, the systematic translational error, and the systematic rotational error.
3. The method according to claim 1, wherein determining the camera integrated error of the image capturing device using a first maximum coordinate and a first minimum coordinate of the feature point of the target object on a first coordinate axis of an image coordinate system corresponding to the image capturing device and a second maximum coordinate and a second minimum coordinate of the feature point on a second coordinate axis of the image coordinate system in the first set of images includes:
calculating a difference value between a first maximum coordinate and a first minimum coordinate of a characteristic point of the target object in the first group of images on a first coordinate axis of an image coordinate system corresponding to the image acquisition equipment to obtain a first difference value;
Calculating a difference value between a second maximum coordinate and a second minimum coordinate of the feature point on a second coordinate axis of the image coordinate system in the first group of images to obtain a second difference value;
determining a camera integrated error of the image acquisition equipment by using the first difference value, the second difference value and a preset relation; wherein, the preset relation includes: a hand-eye calibration matrix for the image acquisition device and the motion mechanism, or a single pixel precision for the image acquisition device.
4. The method of claim 1, wherein determining the systematic translational error of the motion mechanism using the camera integrated error, a third maximum coordinate and a third minimum coordinate of the feature point in the second set of images on the first coordinate axis, and a fourth maximum coordinate and a fourth minimum coordinate of the feature point in the second set of images on the second coordinate axis, comprises:
calculating a difference value between a third maximum coordinate and a third minimum coordinate of the feature point on the first coordinate axis in the second group of images to obtain a third difference value;
calculating a difference value between a fourth maximum coordinate and a fourth minimum coordinate of the feature point on the second coordinate axis in the second group of images to obtain a fourth difference value;
Determining a repeated positioning error of the motion mechanism by utilizing the camera integrated error, the third difference value, the fourth difference value and a preset relation, and determining a system translation error of the motion mechanism based on the repeated positioning error; wherein, the preset relation includes: a hand-eye calibration matrix for the image acquisition device and the motion mechanism, or a single pixel precision for the image acquisition device.
5. The method of claim 4, wherein the predetermined relationship comprises: the hand-eye calibration matrix; the determining a systematic translational error of the motion mechanism based on the repeated positioning error comprises:
determining an absolute positioning error of the motion mechanism by using the camera integrated error, the hand-eye calibration matrix and a plurality of groups of calibration coordinates used for determining the hand-eye calibration matrix;
and determining a systematic translational error of the motion mechanism based on the repeated positioning error and the absolute positioning error.
6. The method of claim 2, wherein a systematic rotational error of the motion mechanism is affected by an angle extraction accuracy error and a rotational accuracy error of the rotational shaft; the determining mode of the angle extraction precision error comprises the following steps:
Calculating the difference value of the maximum angle and the minimum angle of the feature points in the first group of images to obtain an angle difference;
determining an angle extraction accuracy error with respect to angle extraction of the feature points in the first group of images using the angle difference and the rotation axis length;
the method for determining the rotation precision error comprises the following steps:
and determining a rotation accuracy error with respect to the rotation mechanism using the angle difference, the maximum value, and the rotation axis length.
7. The method according to claim 6, wherein before determining the systematic rotation error of the moving mechanism using the maximum value of the rotation axis length of the rotation axis and the difference between the specified angle and the angle variation amount of the feature point in the third group of images, the method further comprises:
determining a rotation axis calibration error of the motion mechanism using a hand-eye calibration matrix for the image acquisition device and the motion mechanism;
the determining a systematic rotation error of the motion mechanism using the rotation axis length of the rotating shaft and the maximum value of the differences between the specified angle and the angle variations of the feature points in the third group of images comprises:
determining a rotation error of the rotating shaft of the motion mechanism using the rotation axis length of the rotating shaft and the maximum value of the differences between the specified angle and the angle variations of the feature points in the third group of images;
and determining the system rotation error of the motion mechanism using the rotation error of the rotating shaft and the rotation axis calibration error.
8. The method of any of claims 1-7, wherein prior to the determining a systematic integrated error with respect to the image acquisition device and the motion mechanism based on the camera integrated error and the systematic translational error, the method further comprises:
determining a teaching error of the motion mechanism;
the determining a system integrated error with respect to the image acquisition device and the motion mechanism based on the camera integrated error and the system translation error, comprising:
determining a system integrated error with respect to the image acquisition device and the motion mechanism based on the camera integrated error, the system translation error, and the teaching error.
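The aggregation step in claim 8 can be sketched as follows. The patent does not reproduce a combination formula here, so a plain sum of independent worst-case terms is an assumption, as are the function and parameter names; claim 2's system rotation error is included as an optional term.

```python
def system_integrated_error(camera_err, translation_err, teaching_err, rotation_err=0.0):
    # Hedged sketch: claim 8 combines the camera integrated error, the
    # system translation error and the teaching error (claim 2 adds the
    # system rotation error). The combination rule is not given in the
    # source; a simple sum of the error terms is assumed here.
    return camera_err + translation_err + teaching_err + rotation_err
```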
9. A method according to claim 3, wherein said determining a camera integrated error of the image capturing device using the first difference, the second difference, and a predetermined relationship comprises:
determining a camera integrated error of the image acquisition device by using a first formula;
wherein, the first formula is:
wherein CIE is the camera integrated error of the image acquisition device; M is a hand-eye calibration matrix related to the image acquisition device and the motion mechanism; XPixRange_S is the first difference; YPixRange_S is the second difference;
or,
determining a camera integrated error of the image acquisition device by using a second formula;
wherein the second formula is:
where PixAcc is the single pixel precision with respect to the image acquisition device.
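The first and second formulas of claim 9 are present only as images in the source. The sketch below is therefore one plausible form, with every name an assumption: the static pixel scatter of the feature point is converted to world units through a scalar millimetre-per-pixel scale standing in for the hand-eye calibration matrix M (in general a full transform). The alternative second-formula branch would substitute the single-pixel precision PixAcc for that scale.

```python
import math

def camera_integrated_error(x_pix_range_s, y_pix_range_s, scale_mm_per_pix):
    # Static scatter (XPixRange_S, YPixRange_S) of the feature point across
    # the first group of images, converted to world units. The scalar
    # scale is an assumed stand-in for the hand-eye calibration matrix M.
    return scale_mm_per_pix * math.hypot(x_pix_range_s, y_pix_range_s)
```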
10. The method of claim 4, wherein determining the repeated positioning error of the motion mechanism using the camera integrated error, the third difference, the fourth difference, and a predetermined relationship comprises:
determining a repeated positioning error of the motion mechanism by using a third formula;
wherein the third formula is:
wherein MRTE is the repeated positioning error of the motion mechanism; XPixRange_D is the third difference value and YPixRange_D is the fourth difference value; M is a hand-eye calibration matrix related to the image acquisition device and the motion mechanism; CIE is the camera integrated error;
or,
determining a repeated positioning error of the motion mechanism by using a fourth formula;
wherein the fourth formula is:
where PixAcc is the single pixel precision with respect to the image acquisition device.
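Claim 10's third formula is likewise not reproduced in the source. A hedged sketch of one plausible decomposition: the dynamic pixel scatter (XPixRange_D, YPixRange_D) is converted to world units and the camera's own contribution (CIE) is subtracted, leaving the mechanism's repeated positioning error. The scalar scale and all names are assumptions.

```python
import math

def motion_repeat_error(x_pix_range_d, y_pix_range_d, scale_mm_per_pix, cie):
    # Dynamic scatter across the second group of images minus the camera
    # integrated error (assumed decomposition; the patent's third formula
    # is not shown in this text).
    return scale_mm_per_pix * math.hypot(x_pix_range_d, y_pix_range_d) - cie
```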
11. The method of claim 5, wherein determining the absolute positioning error of the motion mechanism using the camera integrated error, the hand-eye calibration matrix, and a plurality of sets of calibration coordinates for determining the hand-eye calibration matrix comprises:
determining an absolute positioning error of the motion mechanism by using a fifth formula;
wherein the fifth formula is:
wherein MATE is the absolute positioning error of the motion mechanism; X_Wld_i is the X coordinate of the calibration point in the i-th set of calibration coordinates in the world coordinate system; X_Pix_i is the X coordinate of the calibration point in the i-th set of calibration coordinates in the image coordinate system; Y_Wld_i is the Y coordinate of the calibration point in the i-th set of calibration coordinates in the world coordinate system; Y_Pix_i is the Y coordinate of the calibration point in the i-th set of calibration coordinates in the image coordinate system; n is the number of sets of calibration coordinates of the calibration point;
the determining a systematic translational error of the motion mechanism based on the repeated positioning error and the absolute positioning error comprises:
determining a systematic translational error of the motion mechanism using a sixth formula:
wherein the sixth formula is:
wherein STLE is the systematic translational error of the motion mechanism; MRTE is the repeated positioning error of the motion mechanism; MATE is the absolute positioning error of the motion mechanism.
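Claim 11's fifth and sixth formulas can be sketched as below. Since the source shows neither formula, both forms are assumptions: the absolute positioning error is taken as the worst residual between the measured world coordinates and the pixel coordinates mapped through a scalar stand-in for the hand-eye transform, and the translational error as a simple sum of MRTE and MATE.

```python
import math

def absolute_positioning_error(world_pts, pix_pts, scale_mm_per_pix):
    # Worst-case residual over the n sets of calibration coordinates
    # (X_Wld_i, Y_Wld_i) vs (X_Pix_i, Y_Pix_i) mapped through an assumed
    # scalar hand-eye scale; a hedged stand-in for the fifth formula.
    residuals = [
        math.hypot(xw - scale_mm_per_pix * xp, yw - scale_mm_per_pix * yp)
        for (xw, yw), (xp, yp) in zip(world_pts, pix_pts)
    ]
    return max(residuals)

def systematic_translation_error(mrte, mate):
    # Assumed simple-sum combination for the sixth formula.
    return mrte + mate
```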
12. The method of claim 7, wherein the determining the system rotation error of the motion mechanism using the rotation error of the rotating shaft and the rotation axis calibration error comprises:
determining a system rotation error of the motion mechanism by using a seventh formula;
wherein the seventh formula is:
wherein SRE is the systematic rotational error of the motion mechanism; MRAE is the rotation error of the rotating shaft; MRCE is the rotation axis calibration error; deltaR is the maximum value of the differences between the specified angle and the angle variations of the feature points in the third group of images.
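Claim 12's seventh formula is also shown only as an image. A hedged sketch under the same arc-length model used above: the worst angular deviation deltaR scaled by the rotation axis length gives an assumed form of MRAE, and SRE is taken as the sum of the shaft rotation error and the calibration error. All names and forms are illustrative.

```python
import math

def rotation_axis_error(delta_r_deg, axis_len):
    # Arc length at the shaft tip for the worst angular deviation deltaR
    # (assumed form of MRAE; the patent's expression is not shown).
    return axis_len * math.radians(delta_r_deg)

def systematic_rotation_error(mrae, mrce):
    # Assumed simple-sum combination for the seventh formula.
    return mrae + mrce
```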
13. An error assessment apparatus, the apparatus comprising:
the image acquisition module is used for acquiring a first group of images and a second group of images of the target object captured by the image acquisition device; wherein the first group of images is acquired by the image acquisition device at a first location for the target object at a second location; the second group of images is acquired by the image acquisition device each time the motion mechanism rotates from an initial position to a target position so that the image acquisition device and the target object are at a specified relative position;
the first determining module is used for determining a camera integrated error of the image acquisition device by using a first maximum coordinate and a first minimum coordinate of the feature point of the target object in the first group of images on a first coordinate axis of an image coordinate system corresponding to the image acquisition device, and a second maximum coordinate and a second minimum coordinate of the feature point on a second coordinate axis of the image coordinate system;
the second determining module is used for determining a system translation error of the motion mechanism by using the camera integrated error, a third maximum coordinate and a third minimum coordinate of the feature points in the second group of images on the first coordinate axis, and a fourth maximum coordinate and a fourth minimum coordinate of the feature points in the second group of images on the second coordinate axis;
and a comprehensive error determining module, used for determining a system integrated error with respect to the image acquisition device and the motion mechanism based on the camera integrated error and the system translation error.
14. The apparatus of claim 13, wherein the movement mechanism comprises a rotational axis; the apparatus further comprises:
a third determining module, configured to acquire a third group of images of the target object captured by the image acquisition device before the systematic integrated error with respect to the image acquisition device and the motion mechanism is determined based on the camera integrated error and the systematic translation error; wherein the image acquisition device captures the target object each time the rotating shaft rotates by a specified angle in a process in which the rotating shaft rotates continuously a plurality of times in a first direction so that the target object rotates continuously a plurality of times in a second direction relative to the image acquisition device;
the system rotation error determining module is used for determining a system rotation error of the motion mechanism using the rotation axis length of the rotating shaft and the maximum value of the differences between the specified angle and the angle variations of the feature points in the third group of images;
the comprehensive error determining module is specifically configured to:
determining a systematic integrated error with respect to the image acquisition device and the motion mechanism based on the camera integrated error, the systematic translational error, and the systematic rotational error;
and/or,
the first determining module includes:
the first computing sub-module is used for computing the difference value between the first maximum coordinate and the first minimum coordinate of the characteristic point of the target object on the first coordinate axis of the image coordinate system corresponding to the image acquisition equipment in the first group of images to obtain a first difference value;
the second calculation sub-module is used for calculating the difference value between the second maximum coordinate and the second minimum coordinate of the characteristic points on the second coordinate axis of the image coordinate system in the first group of images to obtain a second difference value;
the first determining submodule is used for determining the camera integrated error of the image acquisition equipment by utilizing the first difference value, the second difference value and a preset relation; wherein, the preset relation includes: a hand-eye calibration matrix for the image acquisition device and the motion mechanism, or a single-pixel precision for the image acquisition device;
and/or,
the second determining module includes:
a third calculation sub-module, configured to calculate a difference value between a third maximum coordinate and a third minimum coordinate of the feature points on the first coordinate axis in the second group of images to obtain a third difference value;
a fourth calculation sub-module, configured to calculate a difference value between a fourth maximum coordinate and a fourth minimum coordinate of the feature points on the second coordinate axis in the second group of images to obtain a fourth difference value;
the second determining sub-module is used for determining a repeated positioning error of the motion mechanism by using the camera integrated error, the third difference value, the fourth difference value and a preset relation, and determining a systematic translation error of the motion mechanism based on the repeated positioning error; wherein the preset relation includes: a hand-eye calibration matrix for the image acquisition device and the motion mechanism, or a single-pixel precision for the image acquisition device;
and/or,
the preset relation comprises the hand-eye calibration matrix; the first determining sub-module includes:
the third determining sub-module, used for determining an absolute positioning error of the motion mechanism by using the camera integrated error, the hand-eye calibration matrix, and a plurality of groups of calibration coordinates used for determining the hand-eye calibration matrix;
and the fourth determining sub-module, used for determining a systematic translational error of the motion mechanism based on the repeated positioning error and the absolute positioning error;
and/or,
the system rotation error of the motion mechanism is affected by an angle extraction accuracy error and a rotation accuracy error of the rotating shaft; the apparatus further comprises a fourth determining module and a fifth determining module;
the fourth determining module is specifically configured to calculate the difference between the maximum angle and the minimum angle of the feature points in the first group of images to obtain an angle difference, and to determine an angle extraction accuracy error for the feature points in the first group of images using the angle difference and the rotation axis length;
the fifth determining module is specifically configured to determine a rotation accuracy error of the rotating shaft using the angle difference, the maximum value, and the rotation axis length;
and/or,
the apparatus further comprises:
a sixth determining module, configured to determine a rotation axis calibration error of the motion mechanism using a hand-eye calibration matrix for the image acquisition device and the motion mechanism, before a systematic rotation error of the motion mechanism is determined using the rotation axis length of the rotating shaft and the maximum value of the differences between the specified angle and the angle variations of the feature points in the third group of images;
The system rotation error determination module includes:
a fifth determining sub-module, used for determining a rotation error of the rotating shaft of the motion mechanism using the rotation axis length of the rotating shaft and the maximum value of the differences between the specified angle and the angle variations of the feature points in the third group of images;
and a sixth determining sub-module, used for determining a system rotation error of the motion mechanism using the rotation error of the rotating shaft and the rotation axis calibration error;
and/or,
the apparatus further comprises:
a seventh determining module, configured to determine a teaching error of the motion mechanism before the systematic integrated error with respect to the image acquisition device and the motion mechanism is determined based on the camera integrated error and the systematic translational error;
the comprehensive error determining module is specifically configured to:
determining a systematic integrated error with respect to the image acquisition device and the motion mechanism based on the camera integrated error, the systematic translational error, the systematic rotational error, and the teaching error;
and/or,
the first determining submodule is specifically configured to:
determining a camera integrated error of the image acquisition device by using a first formula;
wherein the first formula is:
wherein CIE is the camera integrated error of the image acquisition device; M is a hand-eye calibration matrix related to the image acquisition device and the motion mechanism; XPixRange_S is the first difference; YPixRange_S is the second difference;
or,
determining a camera integrated error of the image acquisition device by using a second formula;
wherein the second formula is:
wherein PixAcc is the single pixel precision with respect to the image acquisition device;
and/or,
the second determining sub-module is specifically configured to:
determining a repeated positioning error of the motion mechanism by using a third formula;
wherein the third formula is:
wherein MRTE is the repeated positioning error of the motion mechanism; XPixRange_D is the third difference value and YPixRange_D is the fourth difference value; M is a hand-eye calibration matrix related to the image acquisition device and the motion mechanism; CIE is the camera integrated error;
or,
determining a repeated positioning error of the motion mechanism by using a fourth formula;
wherein the fourth formula is:
wherein PixAcc is the single pixel precision with respect to the image acquisition device;
and/or,
The third determining sub-module is specifically configured to:
determining an absolute positioning error of the motion mechanism by using a fifth formula;
wherein the fifth formula is:
wherein MATE is the absolute positioning error of the motion mechanism; X_Wld_i is the X coordinate of the calibration point in the i-th set of calibration coordinates in the world coordinate system; X_Pix_i is the X coordinate of the calibration point in the i-th set of calibration coordinates in the image coordinate system; Y_Wld_i is the Y coordinate of the calibration point in the i-th set of calibration coordinates in the world coordinate system; Y_Pix_i is the Y coordinate of the calibration point in the i-th set of calibration coordinates in the image coordinate system; n is the number of sets of calibration coordinates of the calibration point;
the fourth determination submodule is specifically configured to:
determining a systematic translational error of the motion mechanism using a sixth formula:
wherein the sixth formula is:
wherein STLE is the systematic translational error of the motion mechanism; MRTE is the repeated positioning error of the motion mechanism; MATE is the absolute positioning error of the motion mechanism;
and/or,
the sixth determination submodule is specifically configured to:
determining a system rotation error of the motion mechanism by using a seventh formula;
wherein the seventh formula is:
wherein SRE is the systematic rotational error of the motion mechanism; MRAE is the rotation error of the rotating shaft; MRCE is the rotation axis calibration error; deltaR is the maximum value of the differences between the specified angle and the angle variations of the feature points in the third group of images.
15. An electronic device, comprising:
a memory for storing a computer program;
a processor for implementing the method of any of claims 1-12 when executing a program stored on a memory.
16. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when executed by a processor, implements the method of any of claims 1-12.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202311184736.4A CN117226832A (en) | 2023-09-08 | 2023-09-08 | An error assessment method, device and electronic equipment |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202311184736.4A CN117226832A (en) | 2023-09-08 | 2023-09-08 | An error assessment method, device and electronic equipment |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN117226832A true CN117226832A (en) | 2023-12-15 |
Family
ID=89092403
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202311184736.4A Pending CN117226832A (en) | 2023-09-08 | 2023-09-08 | An error assessment method, device and electronic equipment |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN117226832A (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN118876056A (en) * | 2024-07-19 | 2024-11-01 | 杭州海康机器人股份有限公司 | A motion mechanism control method, device, electronic device and storage medium |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2020173240A1 (en) * | 2019-02-27 | 2020-09-03 | 广东拓斯达科技股份有限公司 | Image acquisition apparatus calibration method and apparatus, computer device, and storage medium |
| CN114519746A (en) * | 2022-01-27 | 2022-05-20 | 北京芯海视界三维科技有限公司 | Error acquisition method and device for binocular camera |
| WO2022152194A1 (en) * | 2021-01-14 | 2022-07-21 | 杭州海康威视数字技术股份有限公司 | Calibration method of monitoring camera |
| CN115713563A (en) * | 2022-11-21 | 2023-02-24 | 杭州海康机器人股份有限公司 | Camera calibration method and device, electronic equipment and storage medium |
2023-09-08: Application CN202311184736.4A (CN117226832A) filed (CN); status: Pending
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2020173240A1 (en) * | 2019-02-27 | 2020-09-03 | 广东拓斯达科技股份有限公司 | Image acquisition apparatus calibration method and apparatus, computer device, and storage medium |
| WO2022152194A1 (en) * | 2021-01-14 | 2022-07-21 | 杭州海康威视数字技术股份有限公司 | Calibration method of monitoring camera |
| CN114519746A (en) * | 2022-01-27 | 2022-05-20 | 北京芯海视界三维科技有限公司 | Error acquisition method and device for binocular camera |
| CN115713563A (en) * | 2022-11-21 | 2023-02-24 | 杭州海康机器人股份有限公司 | Camera calibration method and device, electronic equipment and storage medium |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN118876056A (en) * | 2024-07-19 | 2024-11-01 | 杭州海康机器人股份有限公司 | A motion mechanism control method, device, electronic device and storage medium |
| CN118876056B (en) * | 2024-07-19 | 2025-10-31 | 杭州海康机器人股份有限公司 | Motion mechanism control method and device, electronic equipment and storage medium |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN107871328B (en) | Machine vision system and calibration method implemented by machine vision system | |
| US11049280B2 (en) | System and method for tying together machine vision coordinate spaces in a guided assembly environment | |
| CN110640747B (en) | Hand-eye calibration method and system for robot, electronic equipment and storage medium | |
| JP2020116734A (en) | System and method for automatic hand-eye calibration of a vision system for robot motion | |
| US11403780B2 (en) | Camera calibration device and camera calibration method | |
| CN105066884B (en) | A kind of robot end's deviations bearing calibration and system | |
| US9199379B2 (en) | Robot system display device | |
| CN114310901B (en) | Coordinate system calibration method, device, system and medium for robot | |
| JP2005201824A (en) | Measuring device | |
| JP2016185572A (en) | Robot, robot controller and robot system | |
| JP7185860B2 (en) | Calibration method for a multi-axis movable vision system | |
| JP2019115974A (en) | Calibration and operation of vision-based manipulation systems | |
| CN116061196B (en) | A method and system for calibrating kinematic parameters of a multi-axis motion platform | |
| CN112229323A (en) | Six degrees of freedom measurement method of checkerboard cooperation target based on monocular vision of mobile phone and its application | |
| CN117173254A (en) | A camera calibration method, system, device and electronic equipment | |
| CN117226832A (en) | An error assessment method, device and electronic equipment | |
| CN118305790A (en) | Robot vision positioning method, calibration method, device, equipment and medium | |
| CN113658270A (en) | Multi-view visual calibration method, device, medium and system based on workpiece hole center | |
| CN106959704B (en) | Control method and system of three-dimensional topography measuring instrument | |
| JP2021079527A (en) | Measurement system and method for accuracy of positioning of robot arm | |
| Scaria et al. | Cost Effective Real Time Vision Interface for Off Line Simulation of Fanuc Robots | |
| JPH0882505A (en) | Calibration method of camera parameter and measuring method of object position | |
| CN115265598A (en) | Method, device and system for calibrating an inertial measurement unit | |
| CN118876055B (en) | A method, device, electronic device, and storage medium for controlling a motion mechanism. | |
| CN113459084A (en) | Robot parameter calibration method, device, equipment and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |