Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless it is specifically stated otherwise.
The following description of at least one exemplary embodiment is merely exemplary in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of exemplary embodiments may have different values.
It should be noted that like reference numerals and letters refer to like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
< Hardware configuration >
Fig. 1 is a block diagram of the hardware configuration of a decentration detection system 100 for a lens according to an embodiment of the present invention.
As shown in fig. 1, the decentration detection system 100 for a lens includes a light source 1000, an image sensor 2000, and a decentration detection device 3000 for a lens.
The light source 1000 may be used to emit a plurality of preset light beams.
The light source 1000 may emit an area array of light beams or a single light beam. In the case where the light source 1000 emits a single light beam, the plurality of preset light beams may be realized by emitting the light beam at different preset positions.
The plurality of preset beams may be parallel or non-parallel, which is not limited herein.
The image sensor 2000 is used for collecting, at its focal plane, the actual projection images formed by the plurality of preset light beams passing through the lens to be detected, and for sending the actual projection images to the decentration detection device 3000 of the lens.
The image sensor 2000 may be a Charge-Coupled Device (CCD) sensor, a Complementary Metal-Oxide-Semiconductor (CMOS) sensor, or the like, and is not limited herein.
The lens to be detected may be a spectacle lens, for example, a myopia lens, a hyperopia lens, a presbyopia lens, a progressive addition lens, etc., or may be a camera lens, a telescope lens, a microscope lens, a projector lens, etc.
Those skilled in the art will appreciate that the specific type of the lens to be detected is not limited herein. That is, the method for detecting the decentration of a lens according to the embodiments of the present application may be applied to a prescription lens or to other lenses, which is not limited herein.
The decentration detection device 3000 of the lens may be an electronic device, such as a computer, a mobile phone, or another device, and is not limited herein.
The decentration detection device 3000 of the lens is used for determining the optical center position coordinates of the lens to be detected according to the actual projection image and a reference projection image, and for determining the decentration value of the lens to be detected based on the optical center position coordinates and the geometric center position coordinates.
In the present embodiment, as shown with reference to fig. 1, the decentration detection device 3000 of the lens may include a processor 3100, a memory 3200, an interface device 3300, a communication device 3400, a display device 3500, an input device 3600, a speaker 3700, a microphone 3800, and the like.
The processor 3100 may be, for example, a mobile processor. The memory 3200 includes, for example, ROM (read-only memory), RAM (random access memory), nonvolatile memory such as a hard disk, and the like. The interface device 3300 includes, for example, a USB interface, a headphone interface, and the like. The communication device 3400 may be, for example, a wired or wireless communication device. The communication device 3400 may include a short-range communication device, for example, any device that performs short-range wireless communication based on a short-range wireless communication protocol such as the HiLink protocol, Wi-Fi (IEEE 802.11), Mesh, Bluetooth, ZigBee, Thread, Z-Wave, NFC, UWB, LiFi, or the like; the communication device 3400 may also include a remote communication device, for example, any device that performs WLAN, GPRS, or 2G/3G/4G/5G remote communication. The display device 3500 is, for example, a liquid crystal display, a touch display, or the like, and is used for displaying the actual projection image or the decentration value of the lens to be detected. The input device 3600 may include, for example, a touch screen, a keyboard, and the like. A user may input/output voice information through the speaker 3700 and the microphone 3800.
In this embodiment, the memory 3200 of the decentration detection device 3000 of the lens is used to store instructions for controlling the processor 3100 to operate to perform at least the decentration detection method of the lens according to any embodiment of the present invention. The skilled person can design instructions according to the disclosed solution. How the instructions control the processor to operate is well known in the art and will not be described in detail here.
Although a plurality of devices of the decentration detection device 3000 of the lens are shown in fig. 1, the present invention may involve only some of them; for example, the decentration detection device 3000 of the lens may involve only the memory 3200, the processor 3100, and the display device 3500.
In this embodiment, the decentration detection device of the lens performs the method according to any embodiment of the present invention based on the actual projection image, and determines the decentration value of the lens to be detected.
< Method example >
Fig. 2 is a flow chart of a method for detecting the decentration of a lens according to an embodiment of the present invention, which may be implemented by the decentration detection device 3000 of the lens.
As shown in fig. 2, the method for detecting the decentration of the lens may include the following steps S2100 to S2400:
In step S2100, the actual projection images formed by a plurality of preset light beams reaching the focal plane of the image sensor through the lens to be detected are acquired.
In this embodiment, the plurality of preset light beams may be emitted by an area array light beam, or may be emitted by an optical fiber bundle, a grating, a beam splitter, a microlens array, an LED matrix, a laser diode array, or the like, or may be formed by emitting light beams from different emission positions by one unit light beam.
It will be appreciated by those skilled in the art that the manner in which the plurality of predetermined light beams are generated is not limited herein.
The plurality of preset beams may be parallel or non-parallel, which is not limited herein.
The image sensor may be a Charge Coupled Device (CCD) sensor, a Complementary Metal Oxide Semiconductor (CMOS) sensor, or the like, without limitation.
The lens to be detected may be a spectacle lens, for example, a prescription lens such as a myopia lens or a hyperopia lens, or another lens such as a camera lens, which is not limited herein.
When decentration detection is performed on the lens to be detected, the lens to be detected is located between the light source emitting the plurality of preset light beams and the image sensor, and the plurality of preset light beams form the actual projection image on the focal plane of the image sensor after passing through the lens to be detected.
Illustratively, FIG. 4 shows a schematic diagram of a lens decentration detection system, which includes a light source 1 and an image sensor 2. The light source 1 emits an area array of light beams, and the plurality of preset light beams emitted by the light source are parallel light beams. When decentration detection is performed on the lens 3 to be detected, the lens 3 to be detected is located between the light source 1 and the image sensor 2, so that the plurality of preset light beams form the actual projection image on the image sensor 2 after passing through the lens 3 to be detected.
If the lens to be detected is not parallel to the focal plane of the image sensor, then, on the one hand, the preset light beams may be focused at different positions on the focal plane, which may blur or distort the actual projection image; on the other hand, the preset light beams may undergo unnecessary refraction when passing through the lens to be detected, causing distortion on the focal plane of the image sensor. Such distortion of the actual projection image in turn causes a deviation of the optical center determined based on the actual projection image, and thus an inaccurate decentration detection result. Therefore, in order to avoid the influence of the non-parallelism between the lens to be detected and the focal plane of the image sensor on the measurement accuracy of the optical center, the lens to be detected and the focal plane of the image sensor are adjusted to be parallel before the actual projection image is acquired.
Based on this, in some embodiments, the acquiring, in step S2100, of the actual projection images formed by the plurality of preset light beams reaching the focal plane of the image sensor through the lens to be detected includes:
acquiring, under the condition that the lens to be detected is parallel to the focal plane of the image sensor, the actual projection images formed by the plurality of preset light beams reaching the focal plane of the image sensor through the lens to be detected.
In this embodiment, the actual projection image is acquired with the lens to be detected parallel to the focal plane of the image sensor.
Illustratively, as shown in fig. 4, the focal plane of the image sensor is disposed horizontally, and in order to make the lens to be detected parallel to the image sensor, only the lens to be detected needs to be disposed horizontally.
In one example, the lens to be detected may be a spherical lens. In this case, a spherical profiling tool may be built with the bearing surface of the spherical lens as a reference; after the spherical lens is placed in the spherical profiling tool, it is pressed by elastic pieces on the top, bottom, left, and right sides, so that the spherical lens lies horizontally.
In this example, the bearing surface may be the edge portion of the spherical lens that contacts the frame. Taking the bearing surface as a reference, a spherical profiling tool matching the shape and curvature of the spherical lens is built, so that when the spherical lens is placed into the spherical profiling tool, the curvature of the spherical lens matches the curvature of the tool. After the spherical lens is placed into the spherical profiling tool, the elastic pieces on its upper, lower, left, and right sides press the spherical lens, which ensures that the spherical lens contacts the spherical profiling tool uniformly and thus lies horizontally.
In another example, if the lens to be detected has no obvious or stable positioning reference, the lens to be detected can be placed on the 5-axis platform, and then the reflection eccentricity of the lens to be detected is adjusted to 0 by adjusting the 5-axis platform, so that the lens to be detected is placed horizontally.
In this example, the 5-axis platform may be an adjustment platform with five degrees of freedom, which can accurately adjust the position and posture in space of the lens to be detected placed on it. The five degrees of freedom include three translations (up-down, left-right, front-back) and two rotations (rotation and tilt). By placing a light source or reflector on the convex (outward) side of the lens to be detected, the position of the reflected light can be observed, and the deviation between the reflected light and its expected path can be calculated, i.e., the reflection eccentricity can be obtained. If the lens to be detected lies horizontally, the reflected light is reflected directly back toward the light source; if not, the reflected light deviates from the original direction. The lens to be detected is adjusted on the 5-axis platform until the reflected light returns exactly to the position of the light source, i.e., until the reflection eccentricity is 0, at which point the lens to be detected lies horizontally.
According to this embodiment of the application, the actual projection images formed by the plurality of preset light beams reaching the focal plane of the image sensor through the lens to be detected are acquired under the condition that the lens to be detected is parallel to the focal plane of the image sensor, so that clear actual projection images can be obtained and the accuracy of determining the optical center based on the actual projection images is improved.
Step S2200, determining the optical center position coordinates of the lens to be detected according to the actual projection image and the reference projection image.
In this embodiment, since the light beam passing through the optical center of the lens to be detected is not deflected, the position coordinates of the optical center of the lens to be detected can be determined according to the actual projection image and the reference projection image. The reference projection image is a projection image formed by the plurality of preset light beams reaching the focal plane of the image sensor without light beam deflection.
The manner of determining the optical center of the lens to be detected based on the reference projection image and the actual projection image may be feature point matching, image registration, machine learning, beam analysis, etc., and is not limited herein.
Determining the optical center by means of feature point matching may be, for example, identifying corresponding feature points, such as spot centers or edge features, in the reference projection image and the actual projection image, and then determining the offset of the optical center by matching these feature points.
Determining the optical center by image registration may be, for example, using an image registration technique to align the two images, the reference projection image and the actual projection image, and determining the position of the optical center by comparing the transformation parameters.
Determining the optical center by beam analysis may be, for example, analyzing the propagation paths of the light beams in the reference projection image and the actual projection image, and determining the optical center by comparing the deflection angle and the positional change of the light beams.
The optical center position coordinates of the lens to be detected may be the position coordinates of the optical center of the lens to be detected in a first pixel coordinate system, or may be the position coordinates of the optical center in the coordinate system of the lens to be detected, which is not limited herein. The first pixel coordinate system may be a coordinate system constructed according to the actual projection image. The coordinate system of the lens to be detected is a coordinate system constructed according to the geometric parameters of the lens to be detected.
In some embodiments, the reference projection image includes a reference projection point corresponding to each of the plurality of preset beams, and the actual projection image includes an actual projection point corresponding to each of the plurality of preset beams.
Illustratively, as shown in fig. 3 (a), the reference projection image includes a reference projection point corresponding to each of a plurality of preset light beams, and as shown in fig. 3 (b), the actual projection image includes an actual projection point corresponding to each of the plurality of preset light beams.
In these embodiments, the step S2200 of determining the optical center position coordinates of the lens to be detected according to the actual projection image and the reference projection image includes steps S2200.1-S2200.2.
Step S2200.1, determining a position offset value corresponding to each of the plurality of preset light beams according to the position coordinates of the actual projection point of each light beam in the actual projection image and the position coordinates of the reference projection point of each light beam in the reference projection image.
In this embodiment, the position coordinates of the actual projection point in the actual projection image and the position coordinates of the reference projection point in the reference projection image are identified. The position coordinates of the projection point may be the center position coordinates of the projection point or the edge position coordinates of the projection point, which is not limited herein.
It should be noted that the position coordinates of the reference projection point and the position coordinates of the actual projection point are to be correlated, that is, when the position coordinates of the reference projection point are the center position coordinates of the reference projection point, the position coordinates of the actual projection point are also the center position coordinates of the actual projection point, and when the position coordinates of the reference projection point are the edge position coordinates of the reference projection point, the position coordinates of the actual projection point are also the edge position coordinates of the actual projection point.
A position offset value corresponding to each of the plurality of preset light beams is then determined according to the position coordinates of the actual projection point and of the reference projection point corresponding to that beam.
The position offset value corresponding to any one light beam may be the distance between the actual projection point corresponding to the light beam and the reference projection point in the first pixel coordinate system. The positional shift value may be constituted by a shift in the horizontal direction (X axis) and a shift in the vertical direction (Y axis) in the first pixel coordinate system.
Illustratively, fig. 3 (c) shows a positional offset map, in the horizontal (X-axis) direction of the first pixel coordinate system, between the reference projection points of the reference projection image in fig. 3 (a) and the actual projection points of the actual projection image in fig. 3 (b).
In one embodiment, step S2200.1 may be performed by a lensometer to obtain a position offset value corresponding to each of the plurality of preset light beams.
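The computations of step S2200.1 can be sketched as follows. This is a minimal illustrative implementation, assuming the projection points appear as bright spots on a dark background and that each actual projection point can be matched to its reference projection point by a nearest-neighbor search; the thresholding fraction and the matching rule are illustrative choices, not prescribed by this embodiment.

```python
import numpy as np

def spot_centers(image, threshold=0.5):
    """Estimate projection-point centers as intensity-weighted centroids.

    Pixels above a fraction of the maximum intensity are grouped into
    4-connected components; each component is treated as one projection point.
    """
    mask = image > threshold * image.max()
    labels = np.zeros(image.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        current += 1
        labels[seed] = current
        stack = [seed]
        while stack:
            y, x = stack.pop()
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < image.shape[0] and 0 <= nx < image.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = current
                    stack.append((ny, nx))
    centers = []
    for k in range(1, current + 1):
        ys, xs = np.nonzero(labels == k)
        w = image[ys, xs]
        # (x, y) centroid in the first pixel coordinate system
        centers.append((np.average(xs, weights=w), np.average(ys, weights=w)))
    return centers

def offsets(actual_centers, reference_centers):
    """Match each actual projection point to its nearest reference projection
    point and return the (dx, dy) position offset value for each beam."""
    result = []
    for ax, ay in actual_centers:
        rx, ry = min(reference_centers,
                     key=lambda c: (c[0] - ax) ** 2 + (c[1] - ay) ** 2)
        result.append((ax - rx, ay - ry))
    return result
```

The same centroid routine serves both the reference projection image and the actual projection image, which keeps the two sets of position coordinates correlated (center to center) as required above.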
Step S2200.2, determining the optical center position coordinate of the lens to be detected according to the position offset value corresponding to each of the plurality of preset light beams.
For example, the incidence point, on the lens to be detected, of the light beam whose position offset value is the minimum among the position offset values corresponding to the plurality of preset light beams may be used as the optical center position of the lens to be detected.
For another example, the prism degree corresponding to each of the plurality of preset light beams may be determined according to the position offset value, the effective optical thickness, and the beam deflection angle corresponding to each light beam, and the position of the projection point (the reference projection point or the actual projection point) corresponding to the minimum prism degree may be used as the optical center position coordinates of the lens to be detected.
In some embodiments, in step S2200.2, the optical center position coordinates of the lens to be detected are determined according to the position offset value corresponding to each of the plurality of preset light beams, including steps SA1 to SA3.
Step SA1, obtaining the effective optical thickness and the beam deflection angle corresponding to each beam in the plurality of preset beams.
In this embodiment, the effective optical thickness is the optical path length of the light beam in the lens to be detected. The beam deflection angle is the deflection angle of the beam after passing through the lens to be detected. Each beam has its own effective optical thickness and beam deflection angle.
In one embodiment, the effective optical thickness and the beam deflection angle corresponding to each of the plurality of preset light beams may be measured by a lensometer.
In this embodiment, the lensometer indirectly calculates the effective optical thickness by measuring the change in focal length of the light beam through the lens to be detected, and uses a specific optical system (such as a beam splitter or a mirror) to measure the deflection angle of the light beam after it has passed through the lens to be detected.
Those skilled in the art will appreciate that the manner in which the effective optical thickness and the beam deflection angle are measured by a lensometer is well known in the art and will not be described herein.
Step SA2, determining the prism degree corresponding to each of the plurality of preset light beams according to the position offset value, the effective optical thickness, and the beam deflection angle corresponding to each light beam.
In this embodiment, the prism degree may refer to the ability of the lens to be detected to deflect a light beam. For each beam of the plurality of preset light beams, the prism degree corresponding to the beam is calculated according to the calculation formula of the prism degree.
The calculation formula of the prism degree is as follows:

Δ = 100 · tan θ, where tan θ = d / L

where Δ is the prism degree, in prism diopters; 1Δ means that the beam is deflected by 1 cm at a distance of 1 m (approximately 0.01 rad). d is the position offset value of the light beam, in meters (m). θ is the beam deflection angle, in degrees or radians (rad). L is the effective optical thickness, in meters (m).
Step SA3, determining the optical center position coordinates of the lens to be detected according to the prism degree corresponding to each of the plurality of preset light beams.
In one example, the position coordinates, in the first pixel coordinate system, of the projection point (the reference projection point or the actual projection point) corresponding to the light beam whose prism degree is 0 may be used as the optical center position coordinates of the lens to be detected.
In another example, the position coordinates, in the first pixel coordinate system, of the projection point (the reference projection point or the actual projection point) of the light beam having the smallest prism degree among the plurality of prism degrees corresponding to the plurality of preset light beams may be used as the optical center position coordinates of the lens to be detected.
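Steps SA1 to SA3 can be sketched as below. Since the exact prism-degree formula depends on the lensometer's measurement model, the relation Δ = 100·d/L (the deflection in prism diopters implied by the position offset d over the effective optical thickness L, both in meters) is used here as an illustrative stand-in rather than as the embodiment's definitive formula.

```python
def prism_degree(d, L):
    """Illustrative prism degree in prism diopters: delta = 100 * d / L,
    i.e., the deflection implied by the position offset value d over the
    effective optical thickness L (both in meters). The embodiment's own
    formula may differ."""
    return 100.0 * d / L

def optical_center_from_prism(points, offsets_m, thicknesses_m):
    """Step SA3: among the projection points, the one whose beam has the
    smallest absolute prism degree is taken as the optical center position."""
    deltas = [prism_degree(d, L) for d, L in zip(offsets_m, thicknesses_m)]
    i = min(range(len(deltas)), key=lambda k: abs(deltas[k]))
    return points[i], deltas[i]
```

The selection step mirrors the two examples above: a beam with prism degree exactly 0 is simply the special case in which the minimum absolute prism degree is zero.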
According to this embodiment of the application, determining the optical center position coordinates of the lens to be detected according to the prism degree corresponding to each of the plurality of preset light beams improves the accuracy of determining the optical center, and thus the accuracy of decentration detection, compared with determining the optical center position coordinates directly from the position offset values corresponding to the plurality of preset light beams.
Step S2300, obtaining the geometric center position coordinates of the lens to be detected.
In this embodiment, the geometric center position coordinate may be determined based on a vision system, a contact measurement system, or other manners, which is not limited herein.
After the system determines the geometric center position coordinates of the lens to be detected, the determined geometric center position coordinates may be transmitted to the decentration detection device of the lens so that the decentration detection device of the lens performs decentration detection.
In some embodiments, the decentration detection device of the lens may be the processor of the vision system; that is, the decentration detection device of the lens performs the step of determining the geometric center position coordinates of the lens to be detected using the image processing algorithms of the vision system.
In these embodiments, the step S2300 of obtaining the geometric center position coordinates of the lens to be inspected includes steps S2300.1-S2300.3.
Step S2300.1, obtaining a visual image shot by a visual camera.
Illustratively, as shown in fig. 4, the lens of the vision camera 4 faces the focal plane of the image sensor 2 and is located above the light source 1, so that a visual image of the lens to be detected can be captured.
Step S2300.2, identifying the contour of the visual image and constructing a contour model of the lens to be detected.
In this embodiment, the contour edge of the lens to be detected in the visual image is identified by an image processing algorithm. After the edge of the lens to be detected is identified, a series of discrete points are taken on the edge. These points are then fitted using a mathematical model (such as polynomial fitting, circle fitting, or a more complex curve fitting algorithm) to construct the contour model of the lens to be detected. The best-fit contour may also be found during fitting by least squares or other optimization techniques.
Step S2300.3, determining the geometric center position coordinates of the lens to be detected according to the contour model of the lens to be detected.
In this embodiment, the contour center, i.e., the geometric center, of the lens to be detected can be calculated from the fitted contour model.
For a circular or elliptical lens to be detected, the geometric center is the center point of the geometric shape. For a lens to be detected with a more complex shape, the contour center may be the calculated centroid or center of mass of the shape.
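Steps S2300.2 and S2300.3 can be sketched as follows, assuming the contour edge points have already been extracted from the visual image. The Kåsa least-squares circle fit shown here is one concrete instance of the circle-fitting choice mentioned above, and the shoelace-formula centroid covers lenses with more complex polygonal contours.

```python
import math

import numpy as np

def fit_circle(xs, ys):
    """Least-squares (Kasa) circle fit: solves a linear system
    x^2 + y^2 = 2*cx*x + 2*cy*y + c for the center (cx, cy) and
    radius r of the best-fit circle through the contour points."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    b = xs ** 2 + ys ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = math.sqrt(c + cx ** 2 + cy ** 2)
    return (cx, cy), r

def polygon_centroid(xs, ys):
    """Centroid of a closed polygonal contour (shoelace formula), for a
    lens to be detected whose shape is not circular or elliptical."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    x1, y1 = np.roll(xs, -1), np.roll(ys, -1)
    cross = xs * y1 - x1 * ys
    area = cross.sum() / 2.0
    cx = ((xs + x1) * cross).sum() / (6.0 * area)
    cy = ((ys + y1) * cross).sum() / (6.0 * area)
    return cx, cy
```

For a circular lens the fitted circle center is the geometric center directly; for other shapes the polygon centroid of the contour points plays that role.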
Step S2400, determining the decentration value of the lens to be detected according to the optical center position coordinates and the geometric center position coordinates.
In this embodiment, when calculating the decentration value, the optical center position coordinate and the geometric center position coordinate need to be unified under the same coordinate system, and then the decentration value of the lens to be detected is calculated.
The decentration value may be the decentration distance between the optical center and the geometric center of the lens to be detected, or may be the decentration abscissa and the decentration ordinate of the optical center and the geometric center of the lens to be detected, which is not limited herein.
In some embodiments, the optical center position coordinates are position coordinates of the optical center in a first pixel coordinate system, and the geometric center position coordinates are position coordinates of the geometric center in a second pixel coordinate system. The first pixel coordinate system is a coordinate system constructed according to an actual projection image of a focal plane of the image sensor, and the second pixel coordinate system is a coordinate system constructed according to a visual image shot by the visual camera.
In some embodiments, the first pixel coordinate system and the second pixel coordinate system are the same; that is, the position coordinates of the optical center in the first pixel coordinate system are also its position coordinates in the second pixel coordinate system, and the position coordinates of the geometric center in the second pixel coordinate system are also its position coordinates in the first pixel coordinate system. In this case, the decentration value is calculated directly without coordinate system conversion.
In other embodiments, the first pixel coordinate system and the second pixel coordinate system are different. In these embodiments, determining the decentration value of the lens to be detected in step S2400 according to the optical center position coordinates and the geometric center position coordinates includes steps S3100 to S3400.
Step S3100, obtaining a visual image of the lens to be detected, which is acquired by a visual camera.
Step S3200, determining a coordinate conversion relationship between the first pixel coordinate system and the second pixel coordinate system according to the visual image and the actual projection image.
For example, the coordinate conversion relationship between the first pixel coordinate system and the second pixel coordinate system may be obtained according to the position coordinates of the center point of the visual image in the second pixel coordinate system and the position coordinates of the center point of the actual projection image in the first pixel coordinate system.
Step S3300, converting, according to the coordinate conversion relationship, the position coordinates of the optical center in the first pixel coordinate system and the position coordinates of the geometric center in the second pixel coordinate system into position coordinates in the same pixel coordinate system, to obtain the converted optical center position coordinates and the converted geometric center position coordinates.
For example, the position coordinates of the optical center in the first pixel coordinate system may be converted, through the coordinate conversion relationship, into position coordinates in the second pixel coordinate system, which serve as the converted optical center position coordinates; or the position coordinates of the geometric center in the second pixel coordinate system may be converted into position coordinates in the first pixel coordinate system, which serve as the converted geometric center position coordinates.
Step S3400, determining the decentration value of the lens to be detected according to the converted optical center position coordinates and the converted geometric center position coordinates.
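For the case where the two pixel coordinate systems differ only by a translation, steps S3200 to S3400 can be sketched as below; a rotation or scale difference between the vision camera and the image sensor would instead require a full affine transform, so the translation-only case is an assumption of this sketch.

```python
def translation_between(center_visual, center_projection):
    """Step S3200, translation-only case: the offset that maps a point in
    the second (visual-image) pixel coordinate system into the first
    (projection-image) pixel coordinate system, derived from the two
    image center points."""
    return (center_projection[0] - center_visual[0],
            center_projection[1] - center_visual[1])

def to_first_system(point, offset):
    """Step S3300: express a second-system coordinate in the first system."""
    return (point[0] + offset[0], point[1] + offset[1])

def decentration(optical_center, geometric_center):
    """Step S3400: per-axis decentration and decentration distance, with
    both centers expressed in the same pixel coordinate system."""
    dx = optical_center[0] - geometric_center[0]
    dy = optical_center[1] - geometric_center[1]
    return dx, dy, (dx ** 2 + dy ** 2) ** 0.5
```

Converting the geometric center into the first pixel coordinate system (rather than the optical center into the second) is equally valid, as noted in the example above.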
In some embodiments, the determining the decentration value of the lens to be detected according to the optical center position coordinates and the geometric center position coordinates in step S2400 includes step S5100 and step S5200.
Step S5100, determining an actually measured decentration value of the lens to be detected according to the optical center position coordinates and the geometric center position coordinates.
For example, if the optical center position coordinates are the point O1 (x1, y1) and the geometric center position coordinates are the point O2 (x2, y2), the measured decentration values are Δx and Δy, where Δx = x1 − x2 and Δy = y1 − y2.
Step S5200, taking the measured decentration value as the decentration value of the lens to be detected under the condition that the measured decentration value is smaller than or equal to a decentration threshold.
In this embodiment, the decentration threshold may be a single value or a range of values, which is not limited herein.
The decentration threshold may be a preset decentration reference value between the optical center and the geometric center of the lens. It characterizes the acceptable error range of decentration detection and may be set based on a plurality of decentration values obtained by performing decentration detection on a plurality of standard lenses.
According to the embodiment of the application, by setting the decentration threshold and taking the measured decentration value as the decentration value of the lens to be detected only when the measured value is less than or equal to that threshold, a result is output only when it falls within the acceptable error range, which improves the accuracy of decentration detection.
In some embodiments, the step of determining the eccentricity threshold includes step S6100 and step S6200.
Step S6100, obtaining a decentration value corresponding to each of the plurality of standard lenses.
In this embodiment, the decentration detection of any of the above method embodiments may be performed on the plurality of standard lenses, for example, the decentration detection of step S2100 to step S2400 may be performed, to obtain the decentration value of each standard lens. The decentration value of the standard lens is determined according to the optical center position coordinate of the standard lens and the geometric center position coordinate of the standard lens.
Step S6200, determining the decentration threshold according to a plurality of decentration values corresponding to the plurality of standard lenses.
In some examples, the decentration threshold is a single maximum value: the largest value among the plurality of decentration values corresponding to the plurality of standard lenses is set as the decentration threshold.
In other examples, the decentration threshold includes an x-axis decentration threshold and a y-axis decentration threshold. The x-axis decentration threshold may be determined from the plurality of x-axis decentration values corresponding to the plurality of standard lenses, and the y-axis decentration threshold from the plurality of y-axis decentration values corresponding to the plurality of standard lenses.
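The per-axis variant of step S6200 can be sketched as taking the axis-wise maximum over the standard-lens measurements. The use of absolute values and the maximum as the aggregation rule are assumptions for illustration; the text only states that the threshold is determined from the plurality of decentration values.

```python
def per_axis_thresholds(standard_values):
    """Derive (dx_max, dy_max) thresholds from standard-lens decentration values.

    standard_values: list of (dx, dy) decentration values measured on
    standard lenses. Taking the maximum absolute value per axis is one
    plausible choice, assumed here for illustration.
    """
    dx_max = max(abs(dx) for dx, _ in standard_values)
    dy_max = max(abs(dy) for _, dy in standard_values)
    return (dx_max, dy_max)

# Hypothetical measurements from three standard lenses.
thresholds = per_axis_thresholds([(0.10, -0.05), (0.08, 0.12), (-0.15, 0.02)])
```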
In some embodiments, after the measured decentration value is determined in step S5100, the method further includes outputting a detection-error prompt when the measured decentration value is greater than the decentration threshold.
In this embodiment, the detection-error prompt may be output by text, voice, or other means, which is not limited herein.
In some cases, the set decentration threshold is improper, and a detection-error prompt is then always produced when detections are judged against it. To avoid the problem that no decentration detection result can be output due to an improperly set threshold, the embodiment of the application provides a way to dynamically update the decentration threshold.
Based on this, in some embodiments, after the measured decentration value is determined in step S5100, the method further includes steps S7100 to S7300.
Step S7100, when the measured decentration value is larger than the decentration threshold, determining a decentration error value according to the measured decentration value and the decentration threshold.
For example, if the measured decentration value is (Δx, Δy) and the decentration threshold is (Δx_max, Δy_max), the decentration error value is (Δx - Δx_max, Δy - Δy_max).
Step S7200, when the decentration error value is within a preset tolerance range and the number of occurrences of the decentration error value in historical decentration tests is greater than or equal to a frequency threshold, correcting the decentration threshold according to the decentration error value to obtain a corrected decentration threshold.
In this embodiment, if the decentration error value is not within the preset tolerance range, a hardware fault is indicated, for example, the focal plane of the image sensor is tilted or the vision camera is tilted, and maintenance personnel are required to perform repairs manually.
If the decentration error value is within the preset tolerance range, the error is caused by an incorrect setting of the decentration threshold. In this case, historical decentration detection data is obtained, where the historical data includes the decentration value and the decentration error value corresponding to each previous decentration detection. It is then judged whether the current decentration error value appears in the historical data and, if so, whether its number of occurrences is greater than or equal to a preset frequency threshold. If the number of occurrences is greater than or equal to the preset frequency threshold, the decentration threshold is corrected.
The preset frequency threshold may be, for example, 100 or 50, which is not limited herein.
Continuing with the above example, suppose the preset tolerance range is 0.5, the frequency threshold is 100, and the decentration error value (Δx - Δx_max, Δy - Δy_max) is (0.10, 0.30). If this error value has occurred 102 times in the historical decentration detection data, the decentration threshold is corrected to (Δx_max + 0.10, Δy_max + 0.30), which is the corrected decentration threshold.
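Steps S7100 to S7300 can be sketched as follows. The componentwise tolerance check, the rounding of error values, and the history `Counter` are illustrative assumptions; the text specifies only a tolerance range, a frequency threshold, and correction by the recurring error value.

```python
from collections import Counter

TOLERANCE = 0.5        # preset tolerance range (example value from the text)
COUNT_THRESHOLD = 100  # preset frequency threshold (example value from the text)

def maybe_correct_threshold(measured, threshold, history):
    """Sketch of dynamically updating the decentration threshold.

    measured:  (dx, dy) measured decentration value
    threshold: (dx_max, dy_max) current decentration threshold
    history:   Counter mapping past error values to occurrence counts
    Returns the (possibly corrected) threshold, or None when the error
    exceeds the tolerance, indicating a hardware fault to repair manually.
    """
    # Step S7100: decentration error value (rounded so equal errors
    # compare equal when counted in the history).
    err = (round(measured[0] - threshold[0], 2),
           round(measured[1] - threshold[1], 2))
    if any(abs(e) > TOLERANCE for e in err):
        return None  # outside tolerance: hardware fault, manual repair
    # Step S7200: correct the threshold once the same error recurs often.
    history[err] += 1
    if history[err] >= COUNT_THRESHOLD:
        return (threshold[0] + err[0], threshold[1] + err[1])
    return threshold
```

A caller would then re-run the step-S7300 comparison of the measured value against the returned threshold to decide whether to accept the measurement.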
Step S7300, when the measured decentration value is less than or equal to the corrected decentration threshold, using the measured decentration value as the decentration value of the lens to be detected.
In this embodiment, because the decentration threshold is corrected and the measured decentration value is compared with the corrected threshold, a correct detection result can be obtained. The problem that decentration detection cannot be performed due to an improperly set threshold no longer arises, and the accuracy of decentration detection is improved.
According to the embodiment of the application, the accuracy of determining the optical center can be improved by acquiring the actual projection images formed on the focal plane of the image sensor by the plurality of preset light beams passing through the lens to be detected, and by determining the optical center position coordinates of the lens to be detected according to the actual projection images and the reference projection images. By further acquiring the geometric center position coordinates of the lens to be detected and determining the decentration value according to the optical center position coordinates and the geometric center position coordinates, the accuracy of lens decentration detection can be improved.
< Device example >
Fig. 5 is a schematic structural diagram of a decentration detection device 5000 of a lens according to an embodiment of the present invention.
As shown in fig. 5, the decentration detecting device 5000 of the lens of the present embodiment may include a memory 5200 and a processor 5100.
The memory 5200 is used for storing instructions for controlling the processor 5100 to operate to perform the decentration detection method of the lens according to any of the embodiments of the present invention. The skilled person can design instructions according to the disclosed solution. How the instructions control the processor to operate is well known in the art and will not be described in detail here.
< System example >
Fig. 6 is a schematic structural diagram of a decentration detection system 6000 of a lens according to an embodiment of the present invention.
As shown in fig. 6, the decentration detection system 6000 of the lens of the present embodiment may include a light source 610, an image sensor 620, and a decentration detection device 630 of the lens. The light source 610 is used for emitting a plurality of preset light beams. The image sensor 620 is used for collecting, at its focal plane, actual projection images formed by the plurality of preset light beams passing through the lens to be detected, and for sending the actual projection images to the decentration detection device 630 of the lens.
In one embodiment, the light source 610 is an area array light beam as shown in FIG. 4.
In one embodiment, the system 6000 further comprises a vision camera. The vision camera is opposite to the image sensor 620 and is outside the light source 610.
Illustratively, the vision camera includes a CCD image sensor and a lens.
The present invention may be a system, method, and/or computer program product. The computer program product may include a computer readable storage medium having computer readable program instructions embodied thereon for causing a processor to implement aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical encoding device such as punch cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. Computer-readable storage media, as used herein, are not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (e.g., optical pulses through fiber optic cables), or electrical signals transmitted through wires.
The computer readable program instructions described herein may be downloaded from a computer readable storage medium to a respective computing/processing device, or to an external computer or external storage device over a network such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers. A network interface card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in the respective computing/processing device.
Computer program instructions for carrying out operations of the present invention may be assembly instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing electronic circuitry, such as programmable logic circuitry, Field-Programmable Gate Arrays (FPGAs), or Programmable Logic Arrays (PLAs), with state information of the computer readable program instructions, which electronic circuitry can execute the computer readable program instructions.
Various aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
The foregoing description of embodiments of the invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the technical improvements in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.