WO2021229943A1 - Information processing device, information processing method, and program - Google Patents
Information processing device, information processing method, and program
- Publication number
- WO2021229943A1 (PCT/JP2021/013540)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- polarization
- parameter
- image
- polarized
- parameters
- Prior art date
- Legal status
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Definitions
- This technology relates to an information processing device, an information processing method, and a program, and makes it possible to detect abnormalities that do not appear in color information or brightness information.
- In Patent Document 1, a defect of a target object is detected based on the luminance information of each image included in captured images obtained by irradiating the target object with light from a plurality of directions. In Patent Document 2, a rendered image corresponding to the field of view of the imaging unit is generated, and the image features of the image region corresponding to an object model in the rendered image are compared with the image features of a change region extracted from the captured image, whereby an abnormality of the object corresponding to the object model is detected.
- the purpose of this technology is to provide an information processing device, an information processing method, and a program that can detect an abnormality that is difficult to detect based on brightness information and color information.
- The first aspect of this technology is an information processing device including: a polarization rendering setting unit that sets a plurality of parameters used to generate a polarization rendering image of an abnormality detection target; a polarization rendering image generation unit that generates the polarization rendering image of the abnormality detection target based on the parameters set in the polarization rendering setting unit; and an abnormality detection unit that detects an abnormal region of the abnormality detection target based on the difference between a polarized captured image acquired by imaging the abnormality detection target and the polarization rendering image generated by the polarization rendering image generation unit.
- The polarization rendering setting unit sets a plurality of parameters used to generate a polarization rendering image of the abnormality detection target.
- the plurality of parameters include a light source parameter relating to a light source, a geometry parameter relating to an abnormality detection target, a material parameter relating to a polarization characteristic of the abnormality detection target, and a camera parameter of a polarized light image acquisition unit for acquiring a polarized image.
- the polarization rendering setting unit sets the plurality of measured parameters as parameters used for generating a polarized rendering image.
- When some or all of the plurality of parameters have not been measured, the polarization rendering setting unit performs optimization processing for the unmeasured parameters, setting each unmeasured parameter to a value that minimizes the difference between the polarized captured image and the polarization rendering image generated by the polarization rendering image generation unit.
- The polarization rendering setting unit uses, as the optimized parameter value, the value to which the parameter converges by repeatedly updating it using the difference, and makes it possible to adjust the convergence characteristic of the parameter.
- The polarization rendering setting unit sets, as the parameters used for generating the polarization rendering image, the measured parameters together with the parameters calculated by the optimization processing, or, if there are no measured parameters, the parameters calculated by the optimization processing. At least the material parameter may be measured in advance, and the polarization rendering setting unit may use the parameter measured in advance as a fixed value.
- the polarized rendering image generation unit generates a polarized rendering image for abnormality detection based on the parameters set in the polarized rendering setting unit.
- The abnormality detection unit detects the abnormal region of the abnormality detection target on a pixel-by-pixel basis based on the difference between the polarized captured image acquired by imaging the abnormality detection target and the polarization rendering image generated by the polarization rendering image generation unit. The abnormality detection unit may also estimate the cause of the abnormality in the detected abnormal region based on information indicating whether each parameter is a measured parameter or a parameter calculated by the optimization processing. Further, the abnormality detection unit may detect the abnormal region based on the difference between polarization information calculated from the polarized captured image and polarization information calculated from the polarization rendering image.
- The second aspect of this technology is an information processing method including: setting, in a polarization rendering setting unit, a plurality of parameters used to generate a polarization rendering image of an abnormality detection target; generating, in a polarization rendering image generation unit, the polarization rendering image of the abnormality detection target based on the parameters set in the polarization rendering setting unit; and detecting, in an abnormality detection unit, the abnormal region of the abnormality detection target based on the difference between a polarized captured image acquired by imaging the abnormality detection target and the polarization rendering image generated by the polarization rendering image generation unit.
- The third aspect of this technology is a program that causes a computer to execute abnormality detection for an abnormality detection target.
- The program of the present technology is, for example, a program that can be provided to a general-purpose computer capable of executing various program codes, in a computer-readable format, by a storage medium such as an optical disc, a magnetic disk, or a semiconductor memory, or by a communication medium such as a network. By providing the program in a computer-readable format, processing according to the program is realized on the computer.
- In the present technology, a rendered image that takes polarization characteristics and the like into account (hereinafter referred to as a "polarized rendered image") is generated.
- Specifically, a polarized rendered image I_f^φ is generated based on the function f shown in equation (1), using a light source parameter L, a geometry parameter G of the non-light source (object), a material parameter M indicating the polarization characteristics of the non-light source, and a camera parameter C.
- the light source parameter L is a vector of parameters indicating the position and direction of the light source, the Stokes vector of the irradiation light emitted from the light source, and the like.
- the non-light source geometry parameter G is a vector of parameters indicating the shape, position, and posture of the subject.
- the non-light source material parameter M is a parameter indicating the Mueller matrix of the subject.
- the camera parameter C is a vector of internal parameters and external parameters of the camera.
- The parameter φ indicates the angle (also referred to as the polarization direction) of the linear polarizing element (also simply referred to as a "polarizing plate") used when acquiring a polarized image (hereinafter referred to as a "polarized captured image") with a camera.
- A polarized captured image I_r^φ is obtained by imaging with a camera provided with a polarizing plate at the angle φ.
- The difference between the polarized rendered image I_f^φ and the polarized captured image I_r^φ is calculated for each region of a predetermined number of pixels, and a region whose difference is larger than a preset determination threshold is set as the abnormal region.
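- As a rough illustration of this difference-based detection, the following sketch (assuming the captured and rendered polarization images are available as same-sized NumPy arrays; block size and threshold are illustrative values, not those of the patent) flags regions whose squared difference exceeds a determination threshold.

```python
import numpy as np

def detect_abnormal_regions(i_r_phi, i_f_phi, block=8, threshold=0.01):
    """Compare a polarized captured image I_r^phi with a polarized rendered
    image I_f^phi region by region and flag regions whose mean squared
    difference exceeds a preset determination threshold."""
    i_r = i_r_phi.astype(np.float64)
    i_f = i_f_phi.astype(np.float64)
    h, w = i_r.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(0, h, block):
        for x in range(0, w, block):
            diff = i_r[y:y + block, x:x + block] - i_f[y:y + block, x:x + block]
            if np.mean(diff ** 2) >= threshold:       # difference larger than the threshold
                mask[y:y + block, x:x + block] = True  # mark the region as abnormal
    return mask
```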
- FIG. 1 illustrates the configuration of an information processing device.
- the information processing apparatus 10 includes a polarized light image acquisition unit 20, a polarized light rendering setting unit 30, a polarized light rendering image generation unit 40, and an abnormality detection unit 50.
- the polarized light image acquisition unit 20 captures an abnormality detection target and acquires a polarized image.
- FIG. 2 illustrates the configuration of the polarized image acquisition unit. As shown in FIG. 2(a), for example, the polarized image acquisition unit 20 generates a polarized captured image by arranging, on the image sensor 201, a polarizing plate 202 whose pixels have a plurality of different polarization directions, and performing imaging. Note that FIG. 2(a) illustrates a case where the polarizing plate 202, in which each pixel has one of four different polarization directions (the polarization directions are indicated by arrows), is arranged on the front surface of the image sensor 201.
- Further, as shown in FIG. 2(b), the polarized image acquisition unit 20 may generate a plurality of polarized captured images having different polarization directions by using a multi-lens array configuration.
- a plurality of lenses 203 (four in the figure) are provided on the front surface of the image sensor 201, and each lens 203 forms an optical image of the subject on the image pickup surface of the image sensor 201.
- A polarizing plate 204 is provided on the front surface of each lens 203, and a plurality of polarized captured images having different polarization directions are generated by setting the polarization directions of the polarizing plates 204 (indicated by arrows) to mutually different directions.
- When the polarized image acquisition unit 20 is configured in this way, a plurality of polarized captured images can be acquired by one imaging, so that abnormality detection can be performed quickly. Further, as shown in FIG. 2(c), polarizing plates 212-1 to 212-4 having different polarization directions (indicated by arrows) may be provided in front of imaging units 210-1 to 210-4 having different viewpoints, to generate a plurality of polarized captured images having different polarization directions.
- Alternatively, a polarizing plate 211 may be provided in front of the imaging unit 210 as shown in FIG. 2(d). In this case, the polarizing plate 211 is rotated to capture images in a plurality of different polarization directions, and a plurality of polarized captured images having different polarization directions are acquired.
- The plurality of different polarization directions may be any combination of angles as long as the angles are all different: for example, 0 degrees, 60 degrees, and 120 degrees when using three polarization directions, or 0 degrees, 45 degrees, 90 degrees, and 135 degrees when using four polarization directions.
- the polarized light image acquisition unit 20 can acquire a luminance polarized light image.
- An image equivalent to an unpolarized normal luminance image can be obtained by averaging the luminances of four adjacent pixels having different polarization directions.
- If the interval between the lenses 203, or between the imaging units 210-1 to 210-4, is short enough to be negligible with respect to the distance to the abnormality detection target, parallax can be ignored among the plurality of polarized captured images having different polarization directions.
- the polarized image acquisition unit 20 may simultaneously generate not only the luminance polarized image but also the three primary color images by providing the image sensor 201 with a color filter, or may simultaneously generate an infrared image or the like. Further, the polarized image acquisition unit 20 may generate a luminance image by calculating the luminance from the three primary color images.
- FIG. 3 exemplifies a pixel configuration in a plurality of polarization directions, and the configuration shown in FIG. 3 is repeated in the horizontal direction and the vertical direction.
- (A) and (b) of FIG. 3 exemplify the pixel configuration in the case of acquiring a black-and-white image.
- FIG. 3(a) illustrates a case where a polarized pixel block of 2 × 2 pixels is composed of polarized pixels having polarization directions (polarization angles) of 0 degrees, 45 degrees, 90 degrees, and 135 degrees.
- Further, FIG. 3(b) illustrates a case where a polarized pixel block of 4 × 4 pixels is composed of polarized pixels having polarization directions of 0 degrees, 45 degrees, 90 degrees, and 135 degrees, with 2 × 2 pixels as the unit of each polarization direction.
- When the polarization component unit of the polarizing plate is 2 × 2 pixels as shown in FIG. 3(b), the proportion of polarization components leaking into each unit from adjacent regions of different polarization component units is smaller than in the 1 × 1 pixel case shown in FIG. 3(a).
- When the polarizing plate uses a wire grid, polarized light whose electric field component is perpendicular to the grid direction (wire direction) is transmitted, and the longer the wire, the higher the transmittance. Therefore, when the unit of the polarization component is 2 × 2 pixels, the transmittance is higher than in the 1 × 1 pixel case, and the extinction ratio can be improved.
- FIG. 3(c) shows a case where the 2 × 2 pixel polarized pixel block shown in FIG. 3(a) is used as one color unit and the three primary color pixels (red pixels, green pixels, and blue pixels) are provided in a Bayer array.
- FIG. 3D exemplifies a case where the three primary color pixels are provided in a Bayer array for each pixel block of 2 ⁇ 2 pixels shown in FIG. 3B in the same polarization direction.
- FIG. 3(e) illustrates a case where the three primary color pixels are provided in a Bayer array for each 2 × 2 pixel block in the same polarization direction, and 2 × 2 pixel blocks having different polarization directions are pixels of the same color.
- In one arrangement, the 2 × 2 pixel blocks of the Bayer array in the same polarization direction have a polarization-direction phase difference of 90 degrees from the pixel blocks adjacent in the horizontal direction, and a phase difference of ±45 degrees from the pixel blocks adjacent in the vertical direction.
- In another arrangement, the 2 × 2 pixel blocks of the Bayer array in the same polarization direction have a polarization-direction phase difference of 90 degrees from the pixel blocks adjacent in the vertical direction, and a phase difference of ±45 degrees from the pixel blocks adjacent in the horizontal direction.
- a 2 ⁇ 2 pixel block may be composed of three primary color pixels and white pixels, or may be composed of three primary color pixel blocks and white pixel blocks.
- the block of 2 ⁇ 2 pixels may be composed of polarized pixels and unpolarized pixels having different polarization directions, or may be composed of polarized pixel blocks and unpolarized pixel blocks having different polarization directions.
- The polarized image acquisition unit 20 generates a polarized image for each polarization direction in units of pixel blocks having a predetermined number of pixels.
- Alternatively, a polarized image having a resolution of one pixel per polarization direction may be generated by using an existing demosaic process.
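- The following sketch illustrates this kind of processing for a sensor with a repeating 2 × 2 polarizer mosaic; the assignment of the 0°/45°/90°/135° directions to the mosaic positions is an assumption for illustration, and the averaging follows the unpolarized-luminance idea mentioned earlier.

```python
import numpy as np

def split_polarization_mosaic(raw):
    """Split a sensor image with a repeating 2x2 polarizer mosaic (assumed
    layout: 0/45 on the first row, 135/90 on the second) into four
    half-resolution images, one per polarization direction, plus an
    unpolarized image obtained by averaging the four directions."""
    i000 = raw[0::2, 0::2].astype(np.float64)
    i045 = raw[0::2, 1::2].astype(np.float64)
    i135 = raw[1::2, 0::2].astype(np.float64)
    i090 = raw[1::2, 1::2].astype(np.float64)
    unpolarized = (i000 + i045 + i090 + i135) / 4.0
    return {0: i000, 45: i045, 90: i090, 135: i135, "unpolarized": unpolarized}
```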
- When the plurality of parameters have been measured, the polarization rendering setting unit 30 sets the measured parameters as the parameters used for generating the polarized rendered image. When some or all of the plurality of parameters have not been measured, the polarization rendering setting unit 30 performs optimization processing for the unmeasured parameters and sets, as the parameters used for generating the polarized rendered image, the measured parameters together with the parameters calculated by the optimization processing, or, if there are no measured parameters, the parameters calculated by the optimization processing.
- In the parameter optimization processing, the polarization rendering setting unit 30 calculates parameter values that minimize the difference between the polarized captured image and the polarized rendered image. For example, the polarization rendering setting unit repeatedly updates the parameters using the difference and adopts the converged parameter values as the optimized parameter values. Further, the polarization rendering setting unit 30 can adjust the convergence characteristic of each parameter by using an update rate described later.
- The plurality of parameters used to generate the polarized rendered image are a light source parameter relating to the light source, a geometry parameter relating to the abnormality detection target, a material parameter relating to the polarization characteristics of the abnormality detection target, and a camera parameter of the polarized image acquisition unit that acquires the polarized image.
- The polarization rendering setting unit 30 has a light source parameter setting unit 31, a geometry parameter setting unit 32, a material parameter setting unit 33, and a camera parameter setting unit 34.
- the light source parameter setting unit 31 sets the position and direction of the light source, the Stokes vector of the irradiation light emitted from the light source, and the like as light source parameters in order to generate a polarized rendered image.
- The light source parameter setting unit 31 may use light source parameters measured in advance as fixed values, or may use, as fixed values, the results of light source parameter measurement performed before the rendering process. If the light source parameters have not been measured, the rendering process is started using preset initial values, and the light source parameters are calculated and used by optimization processing so as to minimize the difference between the polarized captured image and the polarized rendered image.
- the geometry parameter setting unit 32 sets the shape, position, and posture of the abnormality detection target as geometry parameters.
- The geometry parameter setting unit 32 may use geometry parameters measured in advance as fixed values, or may use, as fixed values, the results of geometry parameter measurement performed before the rendering process. If the geometry parameters have not been measured, the rendering process is started using preset initial values, and the geometry parameters are calculated and used by optimization processing so as to minimize the difference between the polarized captured image and the polarized rendered image.
- the material parameter setting unit 33 sets a Mueller matrix indicating the polarization characteristics of the abnormality detection target as the material parameter.
- The material parameter setting unit 33 may use material parameters measured in advance as fixed values, or may use, as fixed values, the results of material parameter measurement performed before the rendering process. If the material parameters have not been measured, the rendering process is started using preset initial values, and the material parameters are calculated and used by optimization processing so as to minimize the difference between the polarized captured image and the polarized rendered image.
- the camera parameter setting unit 34 sets the internal parameters and external parameters of the camera as camera parameters.
- The camera parameter setting unit 34 may use camera parameters measured in advance as fixed values, or may use, as fixed values, the results of camera parameter measurement performed before the rendering process. If the camera parameters have not been measured, the rendering process is started using preset initial values, and the camera parameters are calculated and used by optimization processing so as to minimize the difference between the polarized captured image and the polarized rendered image.
- The polarization rendering setting unit 30 outputs the parameters set by the light source parameter setting unit 31, the geometry parameter setting unit 32, the material parameter setting unit 33, and the camera parameter setting unit 34 to the polarization rendering image generation unit 40. Further, the polarization rendering setting unit 30 generates, for each parameter, information indicating whether the parameter is a measured parameter or a parameter calculated by the optimization processing, and whether the optimization processing is completed (hereinafter referred to as "parameter attribute information"), and outputs it to the abnormality detection unit 50.
- the polarization rendering image generation unit 40 generates a polarization rendering image based on the parameters set by the polarization rendering setting unit 30, and outputs the generated polarization rendering image to the abnormality detection unit 50.
- The abnormality detection unit 50 calculates the difference between the polarized captured image acquired by the polarized image acquisition unit 20 and the polarized rendered image generated by the polarized rendered image generation unit 40. When the parameter attribute information generated by the polarization rendering setting unit 30 indicates that parameter optimization processing is being performed, the abnormality detection unit 50 outputs the calculated difference to the polarization rendering setting unit 30. When the parameter attribute information indicates that all optimization processing is completed, the abnormality detection unit 50 compares the difference between the polarized captured image and the polarized rendered image with a determination threshold, and detects a pixel region whose difference is larger than the determination threshold as an abnormal region.
- FIG. 4 is a flowchart illustrating the operation of the information processing apparatus.
- In step ST1, the information processing apparatus acquires a polarized image.
- the polarized light image acquisition unit 20 of the information processing apparatus 10 captures an abnormality detection target, acquires a polarized image, and proceeds to step ST2.
- In step ST2, the information processing apparatus sets the parameter update rates.
- The polarization rendering setting unit 30 of the information processing apparatus 10 sets the update rate λL for the light source parameter L, and sets the update rates λG, λM, and λC for the geometry parameter G, the material parameter M, and the camera parameter C, respectively, and proceeds to step ST3.
- The update rate is information for setting the parameter correction amount based on the difference between the polarized captured image and the polarized rendered image when the parameter optimization processing is performed.
- When the update rate is "0", the polarization rendering setting unit 30 does not perform the optimization processing for that parameter because the parameter correction amount is "0".
- When the update rate is larger than "0", the polarization rendering setting unit 30 calculates the parameter correction amount based on the difference calculated by the abnormality detection unit 50 and performs the optimization processing. The details of the operation using the update rate will be described later.
- In step ST3, the information processing apparatus sets the parameters whose update rate is 0.
- In step ST4, the information processing apparatus sets the initial values of the parameters whose update rate is larger than 0. Since the polarization rendering setting unit 30 performs optimization processing for a parameter whose update rate is larger than 0 and automatically calculates its optimum value, the initial value of such a parameter may be an arbitrary value or a value specified in advance. The process then proceeds to step ST5.
- In step ST5, the information processing apparatus generates a polarized rendered image.
- The polarized rendering image generation unit 40 of the information processing apparatus 10 performs rendering processing using the parameter values set in steps ST3 and ST4, or using the parameter values set in step ST3 and the parameter values updated in step ST8 described later, to generate a polarized rendered image, and proceeds to step ST6.
- In step ST6, the information processing apparatus calculates the difference.
- The abnormality detection unit 50 of the information processing apparatus 10 calculates the difference between the polarized captured image acquired in step ST1 and the polarized rendered image generated in step ST5, and proceeds to step ST7.
- In step ST7, the information processing apparatus determines whether the optimization processing has been completed.
- For each parameter whose update rate is larger than 0, the polarization rendering setting unit 30 performs optimization processing that updates the parameter by using the partial derivative of the difference with respect to that parameter, at a ratio corresponding to the update rate.
- The polarization rendering setting unit 30 determines that the optimization processing is completed when every parameter whose update rate is larger than 0 has converged, and proceeds to step ST9. If there is a parameter that has not converged, the process proceeds to step ST8.
- In step ST8, the information processing apparatus updates the parameters.
- The polarization rendering setting unit 30 calculates the correction amount of each parameter whose update rate is larger than 0 by using the partial derivative of the difference with respect to that parameter and the update rate, corrects the parameters using the calculated correction amounts, and returns to step ST5.
- In step ST9, the information processing apparatus performs the discrimination processing.
- The abnormality detection unit 50 of the information processing apparatus 10 compares the difference with the determination threshold in pixel units, and determines a pixel region whose difference is smaller than the determination threshold to be a normal region and a pixel region whose difference is equal to or larger than the determination threshold to be an abnormal region.
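- A compact sketch of steps ST2 to ST9 is given below, assuming a placeholder rendering function and numerical gradients in place of automatic differentiation; the parameter names, convergence test, and update-rate values are illustrative only, not the patent's exact procedure.

```python
import numpy as np

def optimize_and_detect(render, i_r, params, rates, threshold,
                        iters=200, eps=1e-3, tol=1e-6):
    """Sketch of steps ST2-ST9: parameters whose update rate is 0 stay fixed,
    parameters with rate > 0 are updated by gradient descent on the image
    difference.  'render' is a placeholder for the polarization rendering
    function f(params) returning an image the same shape as i_r."""
    params = {k: np.array(v, dtype=np.float64) for k, v in params.items()}
    prev = np.inf
    for _ in range(iters):
        diff = np.mean((i_r - render(params)) ** 2)            # step ST6: difference
        if abs(prev - diff) < tol:                              # step ST7: converged?
            break
        prev = diff
        for name, rate in rates.items():                        # step ST8: update
            if rate == 0:                                       # rate 0 -> parameter fixed
                continue
            grad = np.zeros_like(params[name])
            for i in np.ndindex(params[name].shape):            # numerical partial derivative
                p = {k: v.copy() for k, v in params.items()}
                p[name][i] += eps
                grad[i] = (np.mean((i_r - render(p)) ** 2) - diff) / eps
            params[name] -= rate * grad                          # correction scaled by update rate
    residual = (i_r - render(params)) ** 2                       # step ST9: discrimination
    return params, residual >= threshold                         # True = abnormal pixel
```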
- the parameter acquisition unit may be provided separately from the polarization rendering setting unit 30. Further, when the parameter is measured at the site where the abnormality is detected, the function of the parameter acquisition unit may be provided in the parameter setting unit of the polarization rendering setting unit 30.
- The light source parameter acquisition unit for acquiring the light source parameters is configured using, for example, an environment imaging unit in which a polarizing plate whose polarization direction can be changed is provided in front of an imaging unit.
- The light source parameter acquisition unit calculates a Stokes vector based on the polarized images obtained by imaging the light source in each of a plurality of predetermined polarization directions, and uses it as a light source parameter.
- FIG. 5 illustrates the configuration of the environment imaging unit.
- the environment imaging unit 311 is configured by providing, for example, a plurality of imaging units 3111 having different imaging directions and a polarizing plate 3112 in front of each imaging unit so that the polarization direction can be changed.
- the polarizing plates 3112 are oriented in the same polarization direction as each other.
- the environment imaging unit 311 images the environment when the abnormality detection target is imaged, and generates, for example, a polarized image of the whole celestial sphere in each of a plurality of polarization directions.
- the environment imaging unit 311 may acquire a polarized image of the whole celestial sphere in each of a plurality of polarization directions by one imaging unit 3111 and a polarizing plate 3112 using a fisheye lens or the like. Further, the environment imaging unit 311 is not limited to the case of generating a polarized image of the whole celestial sphere. For example, when the light source is provided only in a limited range, a polarized image captured in a limited range may be generated.
- FIG. 6 illustrates a polarized image captured by the environment imaging unit. Note that FIG. 6A exemplifies a fisheye image showing a spherical image, and FIG. 6B exemplifies a developed image obtained by developing a fisheye image on a cylindrical surface.
- The light source parameter calculation unit divides the polarized captured image generated by the environment imaging unit 311 into regions in the zenith direction and the azimuth direction, and uses the average incident Stokes vector of each divided region as a light source parameter. Further, the light source parameter calculation unit may calculate the average incident direction of the light rays and include it in the light source parameters.
- FIG. 7 illustrates the division of a polarized image. Note that FIG. 7(a) shows a division example of the fisheye image shown in FIG. 6(a), and FIG. 7(b) shows a division example of the developed image shown in FIG. 6(b).
- When the light source position of the polarized incident light in the incident direction ωi is included in the region ALi, the light source parameter calculation unit calculates the average incident direction in the region ALi and the average incident Stokes vector SLi as shown in equation (2). The average incident direction and the average incident Stokes vector are calculated in the same way for the other regions.
- Here, the observed value IL(0°) is the observed value obtained by imaging the light source with a polarization direction of 0°, IL(45°) is the observed value with a polarization direction of 45°, IL(90°) is the observed value with a polarization direction of 90°, and IL(135°) is the observed value with a polarization direction of 135°.
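- Assuming the standard linear Stokes reconstruction (consistent with the component definitions given for equation (3) below), the average incident Stokes vector for one region could be computed as in the following sketch; the function name and inputs are illustrative, not the patent's exact equation (2).

```python
import numpy as np

def incident_stokes(il_000, il_045, il_090, il_135):
    """Average incident Stokes vector for one region A_Li, from observations
    of the light source taken at polarization directions 0, 45, 90 and 135
    degrees (the observed values may be means over the region)."""
    s0 = (il_000 + il_045 + il_090 + il_135) / 2.0  # unpolarized / average intensity
    s1 = il_000 - il_090                             # 0 deg vs 90 deg intensity difference
    s2 = il_045 - il_135                             # 45 deg vs 135 deg intensity difference
    return np.array([s0, s1, s2])
```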
- FIG. 8 illustrates the light source parameter L.
- FIG. 9 is a diagram for explaining the material polarization characteristics.
- The light emitted from the light source LT irradiates the measurement target OB via, for example, a polarizing plate PL1, and the imaging unit (hereinafter referred to as the "measurement target imaging unit") CM that images the measurement target of the material parameter images the measurement target OB via, for example, a polarizing plate PL2.
- The Z direction indicates the zenith direction, and the angle θ is the zenith angle.
- The polarization directions of the polarizing plates are, for example, 0°, 45°, 90°, and 135°, and the pixel value obtained by imaging the measurement target with the measurement target imaging unit CM for a given polarization direction is the observed value I.
- the relationship between the Stokes vector and the observed value is given by Eq. (3).
- Component s0 indicates the unpolarized or average luminance, component s1 indicates the difference in intensity between the 0° and 90° polarization directions, and component s2 indicates the difference in intensity between the 45° and 135° polarization directions. That is, the Stokes vector for 0° is [1, 1, 0]^T, for 45° is [1, 0, 1]^T, for 90° is [1, -1, 0]^T, and for 135° is [1, 0, -1]^T.
- When the Stokes vector of the light in the incident direction ωi irradiating the measurement target OB is Si, the Stokes vector of the light in the emission direction ωo observed by the measurement target imaging unit CM is So, and the Mueller matrix for the incident direction ωi and the emission direction ωo is M(ωo, ωi), equation (4) holds: So = M(ωo, ωi) Si.
- the equation (4) is a determinant of the equation (3).
- Equation (4) becomes equation (5) when the polarization direction of the incident light applied to the measurement target OB is 0°, equation (6) when it is 45°, equation (7) when it is 90°, and equation (8) when it is 135°.
- Equation (10) shows the normalized Mueller matrix M(ωo, ωi).
- The Mueller matrix calculated in this way represents the polarization reflection characteristics peculiar to the material to be measured, and the calculated Mueller matrix is used as the material parameter. Since the polarization reflection characteristics do not depend on the external environment, they can be used anywhere once measured, and it is not necessary to acquire them repeatedly. Therefore, if the material parameters are measured in advance, it becomes easy to set the parameters required for the polarized rendered image.
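- As one way to realize the relation So = M(ωo, ωi) Si from the four incident polarization states, the following sketch estimates a 3 × 3 Mueller matrix by least squares; it is an illustrative formulation under those assumptions, not the exact closed form of equations (9) and (10).

```python
import numpy as np

def estimate_mueller(so_000, so_045, so_090, so_135):
    """Least-squares estimate of the 3x3 Mueller matrix M(omega_o, omega_i)
    from the emitted Stokes vectors observed when the incident light is
    polarized at 0, 45, 90 and 135 degrees (incident Stokes vectors
    [1,1,0], [1,0,1], [1,-1,0], [1,0,-1])."""
    s_in = np.array([[1, 1, 0],
                     [1, 0, 1],
                     [1, -1, 0],
                     [1, 0, -1]], dtype=np.float64)        # rows: incident Stokes vectors
    s_out = np.vstack([so_000, so_045, so_090, so_135])     # rows: observed emitted Stokes vectors
    # Solve s_in @ M.T ~= s_out for M in the least-squares sense.
    m_t, *_ = np.linalg.lstsq(s_in, s_out, rcond=None)
    m = m_t.T
    return m / m[0, 0]   # normalization analogous to equation (10)
```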
- FIGS. 10 and 11 are flowcharts illustrating the acquisition operation of the material parameters. FIGS. 10 and 11 show a case where the light source parameters at the positions of each angle Δa in the azimuth direction and each angle Δb in the zenith direction are used. In the measurement target imaging unit that images the measurement target of the material parameter, the imaging direction is moved in the zenith direction by the angle Δc, and the polarization direction is switched among 0°, 45°, 90°, and 135°. The light source parameters are measured in advance.
- In step ST11, the material parameter acquisition unit initializes the measurement target imaging unit.
- The material parameter acquisition unit calibrates the measurement target imaging unit that images the measurement target of the material parameter, sets the azimuth angle and zenith angle to 0°, and proceeds to step ST12.
- In step ST12, the material parameter acquisition unit initializes the zenith angle.
- The material parameter acquisition unit sets the direction in which the zenith angle of the measurement target imaging unit is 0° as the 0° zenith-angle direction of the imaging unit used for acquiring the light source parameters (hereinafter referred to as the "light source imaging unit"), and proceeds to step ST13.
- In step ST13, the material parameter acquisition unit initializes the azimuth angle.
- The material parameter acquisition unit sets the 0° azimuth direction of the measurement target imaging unit as the 0° azimuth direction of the light source imaging unit, and proceeds to step ST14.
- In step ST14, the material parameter acquisition unit initializes the polarizing plate on the light source side.
- The material parameter acquisition unit sets the polarization direction of the polarizing plate on the light source side to 0° and proceeds to step ST15.
- In step ST15, the material parameter acquisition unit initializes the polarizing plate of the measurement target imaging unit.
- The material parameter acquisition unit sets the polarization direction of the polarizing plate used in the measurement target imaging unit to 0° and proceeds to step ST16.
- In step ST16, the material parameter acquisition unit images the measurement target of the material parameter.
- The measurement target imaging unit generates a polarized captured image by imaging the measurement target, and the process proceeds to step ST17.
- In step ST17, the material parameter acquisition unit rotates the polarizing plate of the measurement target imaging unit by 45°.
- The material parameter acquisition unit rotates the polarization direction of the polarizing plate by 45° and proceeds to step ST18.
- In step ST18, the material parameter acquisition unit determines whether the polarization direction of the measurement target imaging unit is smaller than 180°.
- The material parameter acquisition unit returns to step ST16 when the polarization direction after rotation is smaller than 180°, and proceeds to step ST19 when the polarization direction is 180° or more.
- In step ST19, the material parameter acquisition unit acquires the emitted Stokes vector.
- Since polarized captured images with polarization directions of 0°, 45°, 90°, and 135° have been generated by the processing from step ST16 to step ST18, the material parameter acquisition unit calculates the emitted Stokes vector based on the generated polarized captured images and proceeds to step ST20.
- In step ST20, the material parameter acquisition unit rotates the polarizing plate on the light source side by 45°.
- The material parameter acquisition unit rotates the polarization direction of the polarizing plate by 45° and proceeds to step ST21.
- In step ST21, the material parameter acquisition unit determines whether the polarization direction on the light source side is smaller than 180°.
- The material parameter acquisition unit returns to step ST15 when the polarization direction after rotation is smaller than 180°, and proceeds to step ST22 when the polarization direction is 180° or more.
- In step ST22, the material parameter acquisition unit calculates the polarization reflection characteristics.
- The material parameter acquisition unit calculates the Mueller matrix based on the emitted Stokes vectors obtained when the polarization direction of the polarized light incident on the measurement target is 0°, 45°, 90°, and 135°. That is, the Mueller matrix represented by equation (9) or equation (10) is calculated based on equations (5) to (8), and the process proceeds to step ST23.
- In step ST23, the material parameter acquisition unit saves the material parameter.
- The material parameter acquisition unit generates a material parameter in which the incident direction ωi indicating the light source direction and the emission direction ωo indicating the direction of the measurement target imaging unit are associated with the Mueller matrix calculated in step ST22, stores it in a database unit or the like, and proceeds to step ST24.
- In step ST24, the material parameter acquisition unit moves the light source azimuth by Δa°.
- The material parameter acquisition unit moves the azimuth angle of the light source imaging unit by Δa° and proceeds to step ST25.
- In step ST25, the material parameter acquisition unit determines whether the light source azimuth is smaller than 360°.
- The material parameter acquisition unit returns to step ST14 when the light source azimuth is smaller than 360°, and proceeds to step ST26 when the light source azimuth is 360° or more.
- In step ST26, the material parameter acquisition unit moves the light source zenith angle by Δb°.
- The material parameter acquisition unit moves the zenith angle of the light source imaging unit by Δb° and proceeds to step ST27.
- In step ST27, the material parameter acquisition unit determines whether the light source zenith angle is smaller than 90°.
- The material parameter acquisition unit returns to step ST13 when the light source zenith angle is smaller than 90°, and proceeds to step ST28 when the light source zenith angle is 90° or more. That is, by performing the processing from step ST13 to step ST27, the material parameters for each incident direction, with a resolution of Δa° in the azimuth direction and Δb° in the zenith direction, are stored in the database unit or the like for one emission direction.
- In step ST28, the material parameter acquisition unit moves the zenith angle of the measurement target imaging unit by Δc°.
- The material parameter acquisition unit moves the zenith angle of the measurement target imaging unit by Δc° and proceeds to step ST29.
- In step ST29, the material parameter acquisition unit determines whether the zenith angle of the measurement target imaging unit is smaller than 90°.
- The material parameter acquisition unit returns to step ST12 when the zenith angle of the measurement target imaging unit is smaller than 90°, and ends the process when the zenith angle is 90° or more. As a result, material parameters are stored in the database unit or the like for each incident direction, with a resolution of the angle Δa in the azimuth direction and the angle Δb in the zenith direction, and for each emission direction, with a resolution of the angle Δc in the zenith direction.
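- The resulting store of material parameters can be pictured as a lookup table keyed by quantized incident and emission directions, as in the following sketch; the key layout and step values are illustrative assumptions, not the patent's database format.

```python
# Mueller matrices indexed by the incident direction (azimuth step da, zenith
# step db) and the emission zenith angle (step dc).
def direction_key(az_in_deg, zen_in_deg, zen_out_deg, da=10.0, db=10.0, dc=10.0):
    """Quantize the incident azimuth/zenith and emission zenith angles to the
    measurement resolution so they can be used as a dictionary key."""
    return (round(az_in_deg / da) * da % 360.0,
            round(zen_in_deg / db) * db,
            round(zen_out_deg / dc) * dc)

material_db = {}

def save_material(az_in_deg, zen_in_deg, zen_out_deg, mueller):
    material_db[direction_key(az_in_deg, zen_in_deg, zen_out_deg)] = mueller

def lookup_material(az_in_deg, zen_in_deg, zen_out_deg):
    return material_db.get(direction_key(az_in_deg, zen_in_deg, zen_out_deg))
```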
- For the camera parameters, an existing measurement method may be used; for example, the internal parameters and the external parameters are acquired by using the methods disclosed in JP-A-2001-264037 or JP-A-2008-131176.
- The abnormality detection unit 50 calculates the difference E_I^φ(L, G, M, C) between the polarized rendered image I_f^φ and the polarized captured image I_r^φ.
- The abnormality detection unit 50 may calculate, as the difference E_I^φ(L, G, M, C), a mean square error (MSE: Mean Square Error) as shown in equation (11), for example, or may calculate it by using an arbitrary norm function. Further, PSNR (Peak Signal to Noise Ratio), calculated using the maximum pixel value that the image can take and the mean square error, may be used as the difference.
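- A minimal sketch of these difference measures is shown below (MSE as in equation (11), with an optional PSNR variant); note that a larger PSNR corresponds to a smaller difference, and the peak value is an assumed parameter.

```python
import numpy as np

def image_difference(i_r_phi, i_f_phi, peak=255.0, use_psnr=False):
    """Difference E_I^phi between the polarized captured image and the
    polarized rendered image: mean squared error, or PSNR derived from it
    (peak = maximum possible pixel value)."""
    mse = np.mean((i_r_phi.astype(np.float64) - i_f_phi.astype(np.float64)) ** 2)
    if not use_psnr:
        return mse
    return 10.0 * np.log10(peak ** 2 / mse) if mse > 0 else np.inf
```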
- The polarization rendering setting unit 30 calculates the partial derivative of the difference E_I^φ(L, G, M, C), which is a multivariable function, with respect to each parameter, and uses an existing optimization method (such as the steepest descent method) to calculate, as shown in equation (13), the parameters that minimize the difference E_I^φ(L, G, M, C).
- The polarization rendering setting unit 30 converges the light source parameter L to its optimum value by repeating the calculation of equation (14), converges the geometry parameter G by repeating equation (15), converges the material parameter M by repeating equation (16), and converges the camera parameter C by repeating equation (17).
- The update rates λL, λG, λM, and λC used in equations (14) to (17) are parameters for adjusting the correction amount when the parameters are converged using the calculated partial derivatives.
- The update rates may be set in advance, or may be changeable so that the time required for convergence and the like can be adjusted. The partial differentiation may be performed automatically using a method called automatic differentiation, or may be performed manually by the user.
- In this way, the abnormality detection unit 50 can calculate the difference between the images, and the polarization rendering setting unit 30 can set each parameter necessary for generating the polarized rendered image by performing optimization processing for parameters that are not measured in advance or at the site where abnormality detection is performed.
- The abnormality detection unit 50 calculates, for each pixel, the difference between the polarized rendered image I_f^φ, generated using parameters measured in advance, parameters measured at the site where abnormality detection is performed, and parameters calculated in the optimization processing, and the polarized captured image I_r^φ. A pixel region where the difference E_I^φ(L, G, M, C) is smaller than the determination threshold Jth is determined to be a normal region, and a pixel region where the difference E_I^φ(L, G, M, C) is equal to or larger than the determination threshold Jth is determined to be an abnormal region.
- In this way, the abnormality detection unit 50 can detect, based on the polarization characteristics, the abnormal region occurring in the abnormality detection target.
- The abnormality detection unit 50 may estimate the cause of the detected abnormality based on the parameter attribute information indicating whether a parameter is a parameter measured in advance or a parameter calculated by the optimization processing. For example, if the material parameter M is measured in advance and fixed by the polarization rendering setting unit 30 while the other parameters are automatically estimated by the optimization processing, it may be estimated that the cause of detection as an abnormal region is an abnormality in the material polarization characteristics corresponding to the material parameter M measured and fixed in advance.
- The abnormality detection unit 50 may calculate polarization information (for example, the degree of polarization, the polarization phase, or the normal) from each of the polarized rendered image I_f^φ and the polarized captured image I_r^φ, and output the difference between the polarization information.
- To calculate the polarization information such as the degree of polarization, the polarization phase, and the normal, for example, the method disclosed in International Publication No. WO 2019/116708 may be used.
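- Assuming the usual relations between the Stokes components and the degree of polarization and polarization phase, polarization information of this kind could be computed per pixel as in the following sketch and then differenced between the captured and rendered images; the function name is illustrative.

```python
import numpy as np

def polarization_info(s0, s1, s2):
    """Degree of linear polarization and polarization phase from the Stokes
    components; per-pixel arrays from either the captured or the rendered
    polarization images can be compared directly."""
    dop = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-12)  # degree of polarization
    phase = 0.5 * np.arctan2(s2, s1)                           # polarization phase (radians)
    return dop, phase
```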
- When the material parameter M of the abnormality detection target is measured in advance, and the imaging environment and imaging conditions of the abnormality detection target are set to a state specified in advance, the light source parameter L, the geometry parameter G, and the camera parameter C can also be measured in advance.
- In this case, the polarization rendering setting unit 30 fixes each parameter to the parameter measured in advance.
- The polarization rendering image generation unit 40 generates a polarized rendered image I_f^φ using the parameters fixed by the polarization rendering setting unit 30.
- The abnormality detection unit 50 detects the abnormal region of the abnormality detection target by calculating, on a pixel-by-pixel basis, the difference E_I^φ(L, G, M, C) between the polarized rendered image I_f^φ generated using the parameters fixed by the polarization rendering setting unit 30 and the polarized captured image I_r^φ acquired by the polarized image acquisition unit 20, and comparing it with the determination threshold Jth.
- the light source parameter L, the geometry parameter G, the material parameter M, and the camera parameter C are not limited to the cases where they can be measured in advance.
- If a polarized rendered image I_f^φ is not generated using parameters corresponding to the imaging environment and imaging conditions under which the polarized captured image I_r^φ is acquired, the abnormal region cannot be detected based on the difference E_I^φ(L, G, M, C).
- For parameters that cannot be measured in advance, a polarized rendered image I_f^φ is generated using initial parameter values, the difference E_I^φ(L, G, M, C) is calculated, and the parameter values are updated using the calculated difference.
- The process of generating a polarized rendered image I_f^φ with the updated parameter values, calculating the difference E_I^φ(L, G, M, C), and updating the parameter values using the calculated difference is repeated to optimize the parameters.
- FIG. 12 shows an operation example.
- the operation example shown in FIG. 12 shows a case where an abnormality in the surface coating state is detected for each abnormality detection target, for example, in a production line or the like.
- Case 1 shows a case where the abnormality detection targets have different shapes and an abnormality in the surface coating state is detected for each abnormality detection target.
- the material parameter M to be detected for abnormality is measured in advance.
- Since the light source parameter L, the geometry parameter G, and the camera parameter C are parameters fixed according to the imaging environment and imaging conditions when acquiring the polarized image of the abnormality detection target on the line, the light source parameter L, the geometry parameter G, and the camera parameter C are acquired in the field. By setting the parameters in this way, an abnormal region in the surface coating state can be detected based on the difference between the generated polarized rendered image and the acquired polarized captured image.
- Case 2 shows a case where an abnormality in the surface coating state is detected for each abnormality detection target having the same shape and a different posture.
- the material parameter M to be detected for abnormality is measured in advance.
- Since the light source parameter L and the camera parameter C are parameters fixed according to the imaging environment and imaging conditions when acquiring the polarized image of the abnormality detection target on the line, the light source parameter L and the camera parameter C are acquired in the field.
- the geometry parameter G is automatically set by the optimization process. By setting the parameters in this way, it is possible to detect an abnormal region in the surface coating state based on the difference between the generated polarized light rendered image and the acquired polarized light captured image.
- Case 3 shows a case where an abnormality in the surface coating state is detected for each abnormality detection target having the same shape and posture without obtaining information on the polarized image acquisition unit.
- the material parameter M to be detected for abnormality is measured in advance.
- Since the light source parameter L and the geometry parameter G are parameters fixed according to the imaging environment and imaging conditions when acquiring the polarized image of the abnormality detection target on the line, the light source parameter L and the geometry parameter G are acquired in the field.
- the camera parameter C is automatically set by the optimization process. By setting the parameters in this way, it is possible to detect an abnormal region in the surface coating state based on the difference between the generated polarized light rendered image and the acquired polarized light captured image.
- Case 4 shows a case where an abnormality in the surface coating state is detected for each abnormality detection target having the same shape and posture without obtaining information on the light source.
- the material parameter M to be detected for abnormality is measured in advance.
- Since the geometry parameter G and the camera parameter C are parameters fixed according to the imaging environment and imaging conditions when acquiring the polarized image of the abnormality detection target on the line, the geometry parameter G and the camera parameter C are acquired in the field. Further, if information about the light source cannot be obtained, the light source parameter L is automatically set by the optimization processing. By setting the parameters in this way, an abnormal region in the surface coating state can be detected based on the difference between the generated polarized rendered image and the acquired polarized captured image.
- Case 5 shows a case where an abnormality in the surface coating state is detected for each abnormality detection target having the same shape without obtaining information on the posture and the light source.
- the material parameter M to be detected for abnormality is measured in advance.
- Since the camera parameter C is a parameter fixed according to the imaging environment and imaging conditions when acquiring the polarized image of the abnormality detection target on the line, the camera parameter C is acquired on site. Further, if information about the posture and the light source cannot be obtained, the light source parameter L and the geometry parameter G are automatically set by the optimization processing. By setting the parameters in this way, an abnormal region in the surface coating state can be detected based on the difference between the generated polarized rendered image and the acquired polarized captured image.
- Case 6 shows a case where an abnormality in the surface coating state is detected for each abnormality detection target having the same shape and posture without obtaining information on the light source and the polarized image acquisition unit.
- the material parameter M to be detected for abnormality is measured in advance.
- Since the geometry parameter G is a parameter fixed according to the imaging environment and imaging conditions when acquiring the polarized image of the abnormality detection target on the line, the geometry parameter G is acquired in the field.
- the light source parameter L and the camera parameter C are automatically set by the optimization process when information about the light source and the polarized light image acquisition unit cannot be obtained. By setting the parameters in this way, it is possible to detect an abnormal region in the surface coating state based on the difference between the generated polarized light rendered image and the acquired polarized light captured image.
- Case 7 shows a case where an abnormality in the surface coating state is detected for each abnormality detection target having the same shape without obtaining information on the light source, the polarized image acquisition unit, and the posture.
- the material parameter M to be detected for abnormality is measured in advance.
- the light source parameter L, the geometry parameter G, and the camera parameter C are automatically set by the optimization process when information on the light source, the polarized light image acquisition unit, and the posture cannot be obtained. By setting the parameters in this way, it is possible to detect an abnormal region in the surface coating state based on the difference between the generated polarized light rendered image and the acquired polarized light captured image.
- FIG. 12 is an example, and the operation is not limited to the case shown in FIG.
- For example, a spoiled portion may be detected from the surface condition of a fruit.
- the series of processes described in the specification can be executed by hardware, software, or a composite configuration of both.
- For example, the program recording the processing sequence is installed in a memory of a computer built into dedicated hardware and executed.
- The program can be recorded in advance on a hard disk, an SSD (Solid State Drive), or a ROM (Read Only Memory) serving as a recording medium.
- Alternatively, the program can be temporarily or permanently stored (recorded) on a removable recording medium such as a flexible disc, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a BD (Blu-ray Disc (registered trademark)), a magnetic disc, a semiconductor memory card, or an optical disc. Such removable recording media can be provided as so-called package software.
- The program may also be transferred from a download site to the computer wirelessly or by wire via a network such as a LAN (Local Area Network) or the Internet.
- The computer can receive the program transferred in this way and install it on a recording medium such as a built-in hard disk.
- The information processing device of the present technology can have the following configurations.
- (1) An information processing device including: a polarization rendering setting unit that sets a plurality of parameters used to generate a polarization rendering image of an abnormality detection target; a polarization rendering image generation unit that generates the polarization rendering image of the abnormality detection target based on the parameters set by the polarization rendering setting unit; and an abnormality detection unit that detects an abnormal region of the abnormality detection target based on a difference between a polarization captured image acquired by imaging the abnormality detection target and the polarization rendering image generated by the polarization rendering image generation unit.
- (2) The information processing device according to (1), in which the plurality of parameters have been measured, and the polarization rendering setting unit sets the measured plurality of parameters as the parameters used to generate the polarization rendering image.
- (3) The information processing device according to (1), in which, when some or all of the plurality of parameters have not been measured, the polarization rendering setting unit performs optimization processing on the unmeasured parameters and sets, as the parameters used to generate the polarization rendering image, the measured parameters together with the parameters calculated by the optimization processing or, when there are no measured parameters, the parameters calculated by the optimization processing.
- (4) The information processing device according to (3), in which the parameter optimization processing calculates parameters capable of minimizing the difference between the polarization captured image and the polarization rendering image generated by the polarization rendering image generation unit.
- (5) The information processing device according to (4), in which the polarization rendering setting unit takes, as an optimized parameter value, a parameter value converged by repeatedly updating the parameter using the difference, and makes the convergence characteristic of the parameter adjustable.
- (6) The information processing device according to (3) or (5), in which the abnormality detection unit estimates the cause of the abnormality in the detected abnormal region based on information indicating whether a parameter is a measured parameter or a parameter calculated by the optimization processing.
- (7) The information processing device according to any one of (1) to (6), in which the abnormality detection unit detects the abnormal region of the abnormality detection target based on a difference in polarization information, using polarization information calculated from the polarization captured image and polarization information calculated from the polarization rendering image.
- (8) The information processing device according to any one of (1) to (7), in which the abnormality detection unit detects the abnormal region of the abnormality detection target in units of pixels.
- (9) The information processing device according to any one of (1) to (8), in which the plurality of parameters include a light source parameter relating to a light source, a geometry parameter relating to the abnormality detection target, a material parameter relating to polarization characteristics of the abnormality detection target, and a camera parameter of a polarization captured image acquisition unit that acquires the polarization captured image.
- (10) The information processing device according to (9), in which at least the material parameter is measured in advance, and the polarization rendering setting unit uses the pre-measured parameter as a fixed parameter.
Abstract
Description
An information processing device according to the present technology includes:
a polarization rendering setting unit that sets a plurality of parameters used to generate a polarization rendering image of an abnormality detection target;
a polarization rendering image generation unit that generates the polarization rendering image of the abnormality detection target based on the parameters set by the polarization rendering setting unit; and
an abnormality detection unit that detects an abnormal region of the abnormality detection target based on a difference between a polarization captured image acquired by imaging the abnormality detection target and the polarization rendering image generated by the polarization rendering image generation unit.
An information processing method according to the present technology includes:
setting, by a polarization rendering setting unit, a plurality of parameters used to generate a polarization rendering image of an abnormality detection target;
generating, by a polarization rendering image generation unit, the polarization rendering image of the abnormality detection target based on the parameters set by the polarization rendering setting unit; and
detecting, by an abnormality detection unit, an abnormal region of the abnormality detection target based on a difference between a polarization captured image acquired by imaging the abnormality detection target and the polarization rendering image generated by the polarization rendering image generation unit.
A program according to the present technology causes a computer to execute abnormality detection for an abnormality detection target, the program causing the computer to execute:
a procedure of setting a plurality of parameters used to generate a polarization rendering image of the abnormality detection target;
a procedure of generating the polarization rendering image of the abnormality detection target based on the plurality of set parameters; and
a procedure of detecting an abnormal region of the abnormality detection target based on a difference between a polarization captured image acquired by imaging the abnormality detection target and the generated polarization rendering image.
1. Polarization rendering and the present technology
2. Configuration of the embodiment
3. Operation of the embodiment
3-1. Acquisition of the light source parameter
3-2. Acquisition of the geometry parameter
3-3. Acquisition of the material parameter
3-4. Acquisition of the camera parameter
3-5. Image difference and parameter optimization processing
3-6. Abnormality detection
4. Operation example of the embodiment
In polarization rendering, a rendered image (hereinafter referred to as a "polarization rendering image") is generated using polarization characteristics and the like. For example, using a light source parameter L, a geometry parameter G of a non-light-source object (subject), a material parameter M indicating the polarization characteristics of the non-light-source object, and a camera parameter C, a polarization rendering image I^f_φ is generated based on the function f shown in Equation (1).
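A minimal Python sketch of the interface of Equation (1) is shown below; the Lambertian shading and the cosine modulation by the polarizer angle φ are simplifying assumptions adopted only so the example runs, and all dictionary keys and function names are illustrative rather than the rendering model fixed by this description.

```python
import numpy as np

# Sketch of I^f_phi = f(L, G, M, C): a simplified, assumed model, not the
# patent's rendering function.
def render_polarization_image(L, G, M, C, phi):
    """Return a synthetic polarization image for polarizer angle phi (radians)."""
    normals = G["normals"]                                   # HxWx3 surface normals
    light_dir = L["direction"] / np.linalg.norm(L["direction"])
    shading = np.clip(normals @ light_dir, 0.0, 1.0) * L["intensity"]
    dop, aop = M["degree_of_polarization"], M["angle_of_polarization"]
    # Malus-style modulation of the shading by the polarizer angle phi.
    image = shading * (1.0 + dop * np.cos(2.0 * (phi - aop))) * 0.5
    return C["gain"] * image

# Example with a flat 4x4 surface facing the camera.
G = {"normals": np.tile(np.array([0.0, 0.0, 1.0]), (4, 4, 1))}
L = {"direction": np.array([0.0, 0.0, 1.0]), "intensity": 1.0}
M = {"degree_of_polarization": 0.3, "angle_of_polarization": 0.0}
C = {"gain": 1.0}
I_phi = render_polarization_image(L, G, M, C, phi=np.pi / 4)
```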
FIG. 1 illustrates the configuration of the information processing device. The information processing device 10 includes a polarization captured image acquisition unit 20, a polarization rendering setting unit 30, a polarization rendering image generation unit 40, and an abnormality detection unit 50.
FIG. 4 is a flowchart illustrating the operation of the information processing device. In step ST1, the information processing device acquires a polarization captured image. The polarization captured image acquisition unit 20 of the information processing device 10 images the abnormality detection target, acquires the polarization captured image, and proceeds to step ST2.
Next, acquisition of the light source parameter will be described. A light source parameter acquisition unit that acquires the light source parameter is configured using, for example, an environment imaging unit in which a polarizing plate whose polarization direction can be changed is provided in front of an imaging unit. The light source parameter acquisition unit sets the polarization direction to a plurality of predetermined polarization directions, calculates a Stokes vector based on polarization captured images of the plurality of polarization directions obtained by imaging the light source for each polarization direction, and uses the Stokes vector as the light source parameter.
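The sketch below is one assumed realization of that Stokes-vector computation: the linear Stokes components obtained from intensities measured through the polarizing plate at 0°, 45°, 90°, and 135°. The function name and sample values are illustrative, not prescribed by this description.

```python
import numpy as np

# Linear Stokes components from light-source images taken through a polarizer
# at 0, 45, 90, and 135 degrees (an assumed, standard formulation).
def stokes_from_polarizer_images(i0, i45, i90, i135):
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
    s1 = i0 - i90                       # horizontal vs. vertical component
    s2 = i45 - i135                     # +45 vs. -45 degree component
    return np.stack([s0, s1, s2], axis=-1)

# Example with single-pixel measurements of a partially polarized source.
s = stokes_from_polarizer_images(1.0, 0.8, 0.6, 0.8)
dop = np.hypot(s[..., 1], s[..., 2]) / s[..., 0]   # degree of linear polarization
```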
Next, acquisition of the geometry parameter will be described. When the geometry parameter is acquired before the rendering processing, various existing shape measurement methods may be used. For example, parameters indicating the shape, position, and posture of the abnormality detection target are acquired using a ToF (Time of Flight) sensor, a stereo camera with multiple viewpoints, or the like. When the abnormality detection target has a primitive geometry, for example, when the abnormality detection target is a cone, a cube, a sphere, or the like, the geometry parameter may be given by a mathematical expression indicating the shape, position, and posture of the abnormality detection target. In this case, the rendering processing can be performed without performing shape measurement.
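For a primitive target, such a formula-based geometry parameter can be as simple as the sketch below, which gives the outward surface normal of a sphere from its center; this is only an illustrative assumption of what an analytic geometry description might look like.

```python
import numpy as np

# Outward surface normal of a sphere, written analytically instead of measured.
def sphere_normal(points, center):
    v = points - center
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

surface_point = np.array([[0.0, 0.0, 1.0]])
print(sphere_normal(surface_point, center=np.array([0.0, 0.0, 0.0])))
```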
Next, acquisition of the material parameter will be described. FIG. 9 is a diagram for explaining the polarization characteristics of a material. Light emitted from a light source LT is applied to a measurement target OB via, for example, a polarizing plate PL1, and an imaging unit (hereinafter referred to as a "measurement target imaging unit") CM that images the measurement target for the material parameter images the measurement target OB via, for example, a polarizing plate PL2. The Z direction indicates the zenith direction, and the angle θ is the zenith angle.
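One assumed way to turn such measurements into a material parameter is sketched below: the degree and angle of polarization of the reflected light are estimated from intensities recorded while rotating the polarizing plate PL2. The fitting approach and names are illustrative, not the procedure fixed by this description.

```python
import numpy as np

# Fit I(theta) = a0 + a1*cos(2*theta) + b1*sin(2*theta) from polarizer angles.
def fit_polarization(angles_rad, intensities):
    i = np.asarray(intensities, dtype=float)
    a0 = i.mean()
    a1 = 2.0 * np.mean(i * np.cos(2.0 * angles_rad))
    b1 = 2.0 * np.mean(i * np.sin(2.0 * angles_rad))
    dop = np.hypot(a1, b1) / a0            # degree of polarization
    aop = 0.5 * np.arctan2(b1, a1)         # angle of polarization
    return dop, aop

angles = np.deg2rad([0, 45, 90, 135])
dop, aop = fit_polarization(angles, [1.0, 0.8, 0.6, 0.8])   # dop = 0.25, aop = 0
```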
Next, acquisition of the camera parameter will be described. The camera parameter may be acquired using an existing measurement method. For example, internal parameters and external parameters are acquired using the methods disclosed in JP 2001-264037 A or JP 2008-131176 A.
Next, the image difference and the parameter optimization processing will be described. The abnormality detection unit 50 calculates a difference EIφ(L, G, M, C) between the polarization rendering image I^f_φ and the polarization captured image I^r_φ. The abnormality detection unit 50 may calculate, as the difference EIφ(L, G, M, C), a mean square error (MSE) as shown in Equation (11), for example, or may calculate the difference using an arbitrary norm function as shown in Equation (12). Alternatively, the PSNR (Peak Signal to Noise Ratio) calculated from the maximum pixel value that the image can take and the mean square error may be used as the difference.
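As a concrete aid, the sketch below computes the MSE of Equation (11) and the PSNR derived from it; the helper names are assumptions introduced only for illustration.

```python
import numpy as np

# MSE between rendered and captured polarization images, and the derived PSNR.
def mse(i_rendered, i_captured):
    d = np.asarray(i_rendered, dtype=float) - np.asarray(i_captured, dtype=float)
    return np.mean(d ** 2)

def psnr(i_rendered, i_captured, max_value=255.0):
    e = mse(i_rendered, i_captured)
    return np.inf if e == 0 else 10.0 * np.log10(max_value ** 2 / e)

i_f = np.full((8, 8), 120.0)   # polarization rendering image I^f_phi
i_r = np.full((8, 8), 118.0)   # polarization captured image I^r_phi
print(mse(i_f, i_r), psnr(i_f, i_r))
```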
Next, abnormality detection will be described. When the polarization rendering image I^f_φ and the polarization captured image I^r_φ match, the difference EIφ(L, G, M, C) becomes "0". When the parameters used for rendering are not known, performing the parameter optimization processing makes it possible to approach the imaging environment and imaging conditions under which the polarization captured image I^r_φ was acquired.
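The sketch below illustrates, under simplifying assumptions, the optimization process described here and the repeated parameter update using the difference referred to in configuration (5): a single stand-in parameter is updated until it converges. The renderer, the parameter, and the step size are all illustrative, not part of the disclosure.

```python
import numpy as np

# Repeatedly update an unmeasured parameter using the rendered-vs-captured
# difference until it converges (gradient descent on the MSE, as a stand-in
# for the optimization process).
base_image = np.linspace(0.0, 1.0, 16).reshape(4, 4)
captured = 0.9 * base_image                    # pretend polarization captured image

def render(gain):
    return gain * base_image                   # stand-in for the function f

gain, learning_rate = 1.0, 0.5
for _ in range(200):
    residual = render(gain) - captured         # per-pixel difference
    grad = 2.0 * np.mean(residual * base_image)  # d(MSE)/d(gain)
    gain -= learning_rate * grad               # update using the difference
print(gain)                                    # converges towards 0.9
```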
Next, an operation example of the embodiment will be described. In the information processing device 10, when the material parameter M of the abnormality detection target has been measured in advance, the imaging environment and imaging conditions of the abnormality detection target are in a state designated in advance, and the light source parameter L, the geometry parameter G, and the camera parameter C have been measured in advance, the polarization rendering setting unit 30 fixes each parameter to the pre-measured parameter.
(1) An information processing device including: a polarization rendering setting unit that sets a plurality of parameters used to generate a polarization rendering image of an abnormality detection target;
a polarization rendering image generation unit that generates the polarization rendering image of the abnormality detection target based on the parameters set by the polarization rendering setting unit; and
an abnormality detection unit that detects an abnormal region of the abnormality detection target based on a difference between a polarization captured image acquired by imaging the abnormality detection target and the polarization rendering image generated by the polarization rendering image generation unit.
(2) The information processing device according to (1), in which the plurality of parameters have been measured, and
the polarization rendering setting unit sets the measured plurality of parameters as the parameters used to generate the polarization rendering image.
(3) The information processing device according to (1), in which, when some or all of the plurality of parameters have not been measured,
the polarization rendering setting unit performs optimization processing on the unmeasured parameters and sets, as the parameters used to generate the polarization rendering image, the measured parameters together with the parameters calculated by the optimization processing or, when there are no measured parameters, the parameters calculated by the optimization processing.
(4) The information processing device according to (3), in which the parameter optimization processing calculates parameters capable of minimizing the difference between the polarization captured image and the polarization rendering image generated by the polarization rendering image generation unit.
(5) The information processing device according to (4), in which the polarization rendering setting unit takes, as an optimized parameter value, a parameter value converged by repeatedly updating the parameter using the difference, and makes the convergence characteristic of the parameter adjustable.
(6) The information processing device according to (3) or (5), in which the abnormality detection unit estimates the cause of the abnormality in the detected abnormal region based on information indicating whether a parameter is a measured parameter or a parameter calculated by the optimization processing.
(7) The information processing device according to any one of (1) to (6), in which the abnormality detection unit detects the abnormal region of the abnormality detection target based on a difference in polarization information, using polarization information calculated from the polarization captured image and polarization information calculated from the polarization rendering image.
(8) The information processing device according to any one of (1) to (7), in which the abnormality detection unit detects the abnormal region of the abnormality detection target in units of pixels.
(9) The information processing device according to any one of (1) to (8), in which the plurality of parameters include a light source parameter relating to a light source, a geometry parameter relating to the abnormality detection target, a material parameter relating to polarization characteristics of the abnormality detection target, and a camera parameter of a polarization captured image acquisition unit that acquires the polarization captured image.
(10) The information processing device according to (9), in which at least the material parameter is measured in advance, and the polarization rendering setting unit uses the pre-measured parameter as a fixed parameter.
20: Polarization captured image acquisition unit
30: Polarization rendering setting unit
31: Light source parameter setting unit
32: Geometry parameter setting unit
33: Material parameter setting unit
34: Camera parameter setting unit
40: Polarization rendering image generation unit
50: Abnormality detection unit
Claims (12)
- 1. An information processing device comprising:
a polarization rendering setting unit that sets a plurality of parameters used to generate a polarization rendering image of an abnormality detection target;
a polarization rendering image generation unit that generates the polarization rendering image of the abnormality detection target based on the parameters set by the polarization rendering setting unit; and
an abnormality detection unit that detects an abnormal region of the abnormality detection target based on a difference between a polarization captured image acquired by imaging the abnormality detection target and the polarization rendering image generated by the polarization rendering image generation unit.
- 2. The information processing device according to claim 1, wherein the plurality of parameters have been measured, and
the polarization rendering setting unit sets the measured plurality of parameters as the parameters used to generate the polarization rendering image.
- 3. The information processing device according to claim 1, wherein, when some or all of the plurality of parameters have not been measured,
the polarization rendering setting unit performs optimization processing on the unmeasured parameters and sets, as the parameters used to generate the polarization rendering image, the measured parameters together with the parameters calculated by the optimization processing or, when there are no measured parameters, the parameters calculated by the optimization processing.
- 4. The information processing device according to claim 3, wherein the parameter optimization processing calculates parameters capable of minimizing the difference between the polarization captured image and the polarization rendering image generated by the polarization rendering image generation unit.
- 5. The information processing device according to claim 4, wherein the polarization rendering setting unit takes, as an optimized parameter value, a parameter value converged by repeatedly updating the parameter using the difference, and makes the convergence characteristic of the parameter adjustable.
- 6. The information processing device according to claim 3, wherein the abnormality detection unit estimates the cause of the abnormality in the detected abnormal region based on information indicating whether a parameter is a measured parameter or a parameter calculated by the optimization processing.
- 7. The information processing device according to claim 1, wherein the abnormality detection unit detects the abnormal region of the abnormality detection target based on a difference in polarization information, using polarization information calculated from the polarization captured image and polarization information calculated from the polarization rendering image.
- 8. The information processing device according to claim 1, wherein the abnormality detection unit detects the abnormal region of the abnormality detection target in units of pixels.
- 9. The information processing device according to claim 1, wherein the plurality of parameters include a light source parameter relating to a light source, a geometry parameter relating to the abnormality detection target, a material parameter relating to polarization characteristics of the abnormality detection target, and a camera parameter of a polarization captured image acquisition unit that acquires the polarization captured image.
- 10. The information processing device according to claim 9, wherein at least the material parameter is measured in advance, and the polarization rendering setting unit uses the pre-measured parameter as a fixed parameter.
- 11. An information processing method comprising:
setting, by a polarization rendering setting unit, a plurality of parameters used to generate a polarization rendering image of an abnormality detection target;
generating, by a polarization rendering image generation unit, the polarization rendering image of the abnormality detection target based on the parameters set by the polarization rendering setting unit; and
detecting, by an abnormality detection unit, an abnormal region of the abnormality detection target based on a difference between a polarization captured image acquired by imaging the abnormality detection target and the polarization rendering image generated by the polarization rendering image generation unit.
- 12. A program that causes a computer to execute abnormality detection for an abnormality detection target, the program causing the computer to execute:
a procedure of setting a plurality of parameters used to generate a polarization rendering image of the abnormality detection target;
a procedure of generating the polarization rendering image of the abnormality detection target based on the plurality of set parameters; and
a procedure of detecting an abnormal region of the abnormality detection target based on a difference between a polarization captured image acquired by imaging the abnormality detection target and the generated polarization rendering image.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/922,579 US12437380B2 (en) | 2020-05-14 | 2021-03-30 | Information processing device and information processing method |
| JP2022522553A JPWO2021229943A1 (ja) | 2020-05-14 | 2021-03-30 |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2020085004 | 2020-05-14 | ||
| JP2020-085004 | 2020-05-14 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2021229943A1 true WO2021229943A1 (ja) | 2021-11-18 |
Family
ID=78525694
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/013540 Ceased WO2021229943A1 (ja) | 2020-05-14 | 2021-03-30 | 情報処理装置と情報処理方法およびプログラム |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US12437380B2 (ja) |
| JP (1) | JPWO2021229943A1 (ja) |
| WO (1) | WO2021229943A1 (ja) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN120823211B (zh) * | 2025-09-18 | 2025-12-12 | 上海寅生信息科技有限公司 | 基于UE5像素回读与OpenCV的渲染异常自动化诊断系统 |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2007064888A (ja) * | 2005-09-01 | 2007-03-15 | Tokai Rika Co Ltd | 路面状態検出装置 |
| JP2011150689A (ja) * | 2009-12-25 | 2011-08-04 | Ricoh Co Ltd | 撮像装置、車載用撮像システム、路面外観認識方法及び物体識別装置 |
| JP2017058383A (ja) * | 2014-03-04 | 2017-03-23 | パナソニックIpマネジメント株式会社 | 偏光画像処理装置 |
| WO2019003383A1 (ja) * | 2017-06-29 | 2019-01-03 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置および材質特定方法 |
| US20190336003A1 (en) * | 2018-05-02 | 2019-11-07 | Canfield Scientific, Incorporated | Skin reflectance and oiliness measurement |
| WO2020049816A1 (ja) * | 2018-09-03 | 2020-03-12 | ソニー株式会社 | 情報処理装置と情報処理方法およびプログラム |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001264037A (ja) | 2000-03-22 | 2001-09-26 | Nippon Telegr & Teleph Corp <Ntt> | カメラキャリブレーション方法及び装置及びカメラキャリブレーションプログラムを格納した記憶媒体 |
| JP4743538B2 (ja) | 2006-11-17 | 2011-08-10 | アイシン精機株式会社 | カメラ校正装置 |
| WO2009019886A1 (ja) * | 2007-08-07 | 2009-02-12 | Panasonic Corporation | 法線情報生成装置および法線情報生成方法 |
| US9001326B2 (en) * | 2011-12-13 | 2015-04-07 | Welch Allyn, Inc. | Method and apparatus for observing subsurfaces of a target material |
| JP5960471B2 (ja) | 2012-03-30 | 2016-08-02 | セコム株式会社 | 画像監視装置 |
| JP6576059B2 (ja) | 2015-03-10 | 2019-09-18 | キヤノン株式会社 | 情報処理、情報処理方法、プログラム |
| KR102659810B1 (ko) * | 2015-09-11 | 2024-04-23 | 삼성디스플레이 주식회사 | 결정화도 측정 장치 및 그 측정 방법 |
| JP2018082424A (ja) * | 2016-11-04 | 2018-05-24 | パナソニックIpマネジメント株式会社 | 画像形成装置 |
| US10371998B2 (en) * | 2017-06-29 | 2019-08-06 | Varjo Technologies Oy | Display apparatus and method of displaying using polarizers and optical combiners |
| JP6799155B2 (ja) * | 2017-06-29 | 2020-12-09 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、情報処理システム、および被写体情報特定方法 |
-
2021
- 2021-03-30 US US17/922,579 patent/US12437380B2/en active Active
- 2021-03-30 JP JP2022522553A patent/JPWO2021229943A1/ja active Pending
- 2021-03-30 WO PCT/JP2021/013540 patent/WO2021229943A1/ja not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| US20230196534A1 (en) | 2023-06-22 |
| JPWO2021229943A1 (ja) | 2021-11-18 |
| US12437380B2 (en) | 2025-10-07 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21803371 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2022522553 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 21803371 Country of ref document: EP Kind code of ref document: A1 |
|
| WWG | Wipo information: grant in national office |
Ref document number: 17922579 Country of ref document: US |