
WO2025173500A1 - Image processing device and image processing method - Google Patents

Image processing device and image processing method

Info

Publication number
WO2025173500A1
WO2025173500A1 (PCT/JP2025/002226)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
image
rotation angle
objective lens
captured images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2025/002226
Other languages
French (fr)
Japanese (ja)
Inventor
Yoshiharu Watanabe
Masafumi Shinoda
Daichi Yoshimoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lasertec Corp
Original Assignee
Lasertec Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lasertec Corp filed Critical Lasertec Corp
Priority to JP2025506014A priority Critical patent/JP7748600B1/en
Publication of WO2025173500A1 publication Critical patent/WO2025173500A1/en
Pending legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84Systems specially adapted for particular applications
    • G01N21/88Investigating the presence of flaws or contamination
    • G01N21/95Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/956Inspecting patterns on the surface of objects
    • H10P74/00

Definitions

  • This disclosure relates to an image processing device and an image processing method.
  • Patent Document 1 proposes a method for measuring the peripheral shape of a wafer edge.
  • the image processing device disclosed herein includes a support unit that supports an object rotatably around a rotation axis; an objective lens that focuses light reflected by the object; a drive unit that can change the Z position, which is the relative position between the object and the focusing position of the objective lens in the optical axis direction of the objective lens; an acquisition unit that acquires captured images of the object via the objective lens; a control unit that controls the support unit and the drive unit and causes the acquisition unit to acquire multiple captured images; an identification unit that identifies pixel positions where the brightness is equal to or greater than a predetermined value in multiple captured images that have different Z positions and different rotation angles of the object; a storage unit that stores the identified pixel positions together with the Z position and the rotation angle in each captured image as high-brightness information; and a generation unit that generates image information of the object based on the stored multiple pieces of high-brightness information.
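As a reading aid, the identify / store / generate flow in the paragraph above can be sketched in a few lines of Python. This is a hedged illustration under assumed names and a made-up threshold value, not the patent's implementation:

```python
# Hedged sketch of the identify / store / generate flow described above.
# All names and the threshold value are illustrative assumptions, not
# taken from the patent.
from dataclasses import dataclass

BRIGHTNESS_THRESHOLD = 200  # the "predetermined value" (assumed)

@dataclass(frozen=True)
class HighBrightness:
    """One piece of high-brightness information: pixel position P
    stored together with the Z position and rotation angle."""
    pixel: int
    z: float
    theta: float
    brightness: int

def identify_bright_pixels(image, z, theta):
    """Identify pixel positions whose brightness meets the threshold."""
    return [HighBrightness(p, z, theta, b)
            for p, b in enumerate(image) if b >= BRIGHTNESS_THRESHOLD]

def generate_profile(records):
    """For each (rotation angle, pixel), keep the Z of the brightest
    record; under a confocal system this is the in-focus Z."""
    best = {}
    for r in records:
        key = (r.theta, r.pixel)
        if key not in best or r.brightness > best[key].brightness:
            best[key] = r
    return {k: v.z for k, v in best.items()}
```

Records from captures at different Z positions and rotation angles are pooled, and the profile keeps, per pixel and angle, the Z of the brightest observation.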
  • the control unit causes the support unit to rotate the object around the rotation axis, while causing the drive unit to change the radial position in a second direction opposite to the first direction and keeping the Z position constant at the second position, causing the acquisition unit to acquire multiple captured images of the main surface of the object, wherein the second position is different from the first position, and the difference between the second position and the first position may be within the focal depth of an optical system including the objective lens.
  • the image processing device includes a support unit that supports an object rotatably around a rotation axis; an objective lens that focuses light reflected by the object; a drive unit that can change the Z position, which is the relative position between the object and the focusing position of the objective lens in the optical axis direction of the objective lens; an acquisition unit that acquires captured images of the object via the objective lens; a control unit that controls the support unit and the drive unit and causes the acquisition unit to acquire multiple captured images; an identification unit that identifies pixel positions where the brightness is equal to or greater than a predetermined value in multiple captured images that have different Z positions and different rotation angles of the object; and a generation unit that generates image information of the object based on the pixels where the brightness is equal to or greater than the predetermined value, and the generation unit may generate image information of the object at the specific rotation angle based on pixels where the brightness is equal to or greater than the predetermined value in multiple captured images that have different Z positions at the specific rotation angle.
  • the drive unit can change the optical axis angle, which is the angle of the optical axis of the objective lens with respect to a plane perpendicular to the rotation axis, and the identification unit identifies pixel positions where the brightness is greater than or equal to a predetermined value in the plurality of captured images where the Z position, rotation angle, and optical axis angle are different, and the generation unit may generate the image information of the edge of the object at the specific rotation angle based on the pixels where the brightness is greater than or equal to the predetermined value in the plurality of captured images where the rotation angle is the specific rotation angle, the optical axis angle is a first optical axis angle, and the Z positions are different, and the pixels where the brightness is greater than or equal to the predetermined value in the plurality of captured images where the rotation angle is the specific rotation angle, the optical axis angle is a second optical axis angle, and the Z positions are different.
  • the optical axis angle is the angle of the optical axis of the objective lens with respect to a plane perpendicular to the rotation axis.
  • the image processing method disclosed herein comprises the steps of supporting an object on a support unit so that it can rotate around a rotation axis; focusing light reflected by the object with an objective lens; changing the Z position, which is the relative position between the object and the focusing position of the objective lens in the optical axis direction of the objective lens, with a drive unit; causing an acquisition unit to acquire an image of the object via the objective lens; causing the acquisition unit, under a control unit that controls the support unit and the drive unit, to acquire a plurality of the captured images; causing an identification unit to identify pixel positions, which are positions of pixels where the brightness is equal to or greater than a predetermined value, in a plurality of captured images that have different Z positions and different rotation angles of the object; storing the identified pixel positions, along with the Z positions and rotation angles in each captured image, in a storage unit as high-brightness information; and causing a generation unit to generate image information of the object based on the stored plurality of pieces of high-brightness information.
  • the drive unit moves the object in a direction perpendicular to the rotation axis
  • the control unit may cause the support unit to rotate the object around the rotation axis, while causing the drive unit to change the radial position in a first direction perpendicular to the rotation axis and keeping the Z position constant at the first position, causing the acquisition unit to acquire multiple captured images of the main surface of the object.
  • the image processing method disclosed herein includes the steps of supporting an object on a support unit so that it can rotate around a rotation axis; focusing light reflected by the object with an objective lens; changing the Z position, which is the relative position between the object and the focusing position of the objective lens in the optical axis direction of the objective lens, with a drive unit; causing an acquisition unit to acquire an image of the object via the objective lens; a control unit that controls the support unit and the drive unit causing the acquisition unit to acquire multiple captured images; causing an identification unit to identify pixel positions where the brightness is equal to or greater than a predetermined value in multiple captured images that have different Z positions and different rotation angles of the object; and causing a generation unit to generate image information of the object based on the pixels where the brightness is equal to or greater than the predetermined value.
  • in the step of causing the generation unit to generate image information, the generation unit may generate image information of the object at the specific rotation angle based on pixels where the brightness is equal to or greater than the predetermined value in multiple captured images that have different Z positions at the specific rotation angle.
  • the generation unit may generate the image information of the object at a plurality of specific rotation angles based on pixels whose brightness is equal to or greater than a predetermined value in a plurality of captured images whose rotation angle is a first rotation angle and whose Z positions are different from one another, and pixels whose brightness is equal to or greater than a predetermined value in a plurality of captured images whose rotation angle is a second rotation angle and whose Z positions are different from one another.
  • the identifying unit identifies pixel positions, which are positions of pixels where the brightness is equal to or greater than a predetermined value, in a plurality of the captured images, each of which has a different Z position, rotation angle, and optical axis angle, and causes the generating unit to generate the image information
  • the generating unit may generate the image information of the edge of the object at the specific rotation angle based on pixels where the brightness is equal to or greater than a predetermined value in a plurality of the captured images, each of which has a rotation angle that is the specific rotation angle, a first optical axis angle, and a different Z position, and pixels where the brightness is equal to or greater than a predetermined value in a plurality of the captured images, each of which has a rotation angle that is the specific rotation angle, a second optical axis angle, and a different Z position.
  • in the step of causing the identification unit to identify the pixel positions, the identification unit identifies first-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the plurality of first captured images having different Z positions, and identifies second-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the plurality of second captured images having different Z positions.
  • the generation unit may generate a color image of the edge of the object represented in a plurality of colors based on the first-wavelength high-luminance information and the second-wavelength high-luminance information.
  • This disclosure provides an image processing device and image processing method that can measure information about an object quickly and accurately.
  • FIG. 1 is a configuration diagram illustrating an inspection device according to a first embodiment.
  • FIG. 2 is a plan view illustrating a stage in the inspection device according to the first embodiment.
  • FIG. 3 is a configuration diagram illustrating an imaging unit in the inspection device according to the first embodiment.
  • FIG. 4 is a block diagram illustrating an image processing apparatus according to the first embodiment.
  • FIG. 5 is a diagram illustrating an example of an inspection image acquired by an acquisition unit in the image processing apparatus according to the first embodiment, where the horizontal axis indicates the rotation angle and the vertical axis indicates the pixels capturing an image of the edge of the wafer.
  • FIG. 6 is a diagram illustrating an example of a profile of a wafer edge generated by a generating unit in the image processing apparatus according to the first embodiment.
  • FIG. 7 is a diagram illustrating a profile of a wafer surface of a wafer generated by a generating unit in an image processing apparatus according to a second embodiment.
  • FIG. 8 is a flowchart illustrating an image processing method using the image processing device according to the second embodiment.
  • Fig. 1 is a configuration diagram illustrating an inspection device 1 according to embodiment 1.
  • Fig. 2 is a plan view illustrating a stage 10 in the inspection device 1 according to embodiment 1.
  • Fig. 3 is a configuration diagram illustrating an imaging unit 20 in the inspection device 1 according to embodiment 1.
  • the stage 10 places the wafer WF on it.
  • the stage 10 has a stage surface 11.
  • the stage 10 places the wafer WF flat on the stage surface 11.
  • the back surface of the wafer WF contacts the stage surface 11.
  • the stage 10 may have a predetermined set position on the stage surface 11 where the wafer WF is set. For example, when inspecting the wafer WF, the wafer WF may first be fixed at the set position.
  • the wafer WF has a wafer surface WF1.
  • an XYZ Cartesian coordinate system is introduced.
  • the plane parallel to the stage surface 11 is defined as the XY plane.
  • the direction perpendicular to the stage surface 11 is defined as the Z-axis direction.
  • the stage 10 has, for example, a rotation axis C1.
  • the rotation axis C1 extends, for example, in the Z-axis direction.
  • the rotation axis C1 passes through the wafer WF placed on the stage surface 11. Therefore, the stage 10 rotates the wafer WF around the rotation axis C1.
  • the stage 10 may be connected to a drive unit 12 such as a motor.
  • the drive unit 12 rotates the stage 10 around the rotation axis C1.
  • the rotation angle from a predetermined set position is called θ.
  • the imaging unit 20 captures an inspection image of the edge WFE of the wafer WF.
  • the imaging unit 20 includes, for example, an objective lens 21, a drive unit 22, and a light receiving unit 23.
  • the imaging unit 20 may further include a light source 24, a pinhole 25, a beam splitter 26, an optical element 27, an optical element 28, and a sensor 29. Note that, as long as the imaging unit 20 can capture an inspection image of the edge WFE of the wafer WF, any of the above optical elements may be replaced with other optical elements, or other optical elements may be included in addition to the above optical elements.
  • the imaging unit 20 captures an inspection image of the edge WFE from reflected light R1 reflected by the edge WFE of the wafer WF.
  • the drive unit 22 changes the relative position between the object and the focusing position of the objective lens 21 in the optical axis direction of the objective lens 21.
  • the relative position between the object and the focusing position of the objective lens 21 in the optical axis direction of the objective lens 21 is called the Z position.
  • the drive unit 22 moves the position of the objective lens 21 in the optical axis direction.
  • the drive unit 22 may change the relative position between the object and the focusing position of the objective lens 21 in the optical axis direction of the objective lens 21 by changing the shape, position, or attitude of the optical element, or by changing the focusing distance of the objective lens 21.
  • the light receiving unit 23 receives the reflected light R1 focused by the objective lens 21. As a result, the imaging unit 20 captures an inspection image from the reflected light R1 reflected by the end WFE of the wafer WF.
  • the light receiving unit 23 may include a first light receiving unit 23a and a second light receiving unit 23b. Furthermore, the light receiving unit 23 may include a third light receiving unit 23c in addition to the first light receiving unit 23a and the second light receiving unit 23b.
  • the imaging unit 20 may have a confocal optical system. Therefore, the imaging unit 20 can capture an image so that the edge WFE of the wafer WF is in focus. However, in this embodiment, the imaging unit 20 may move the Z position of the objective lens 21 during imaging without paying attention to the focus. The brightness of the reflected light R1 from a position of the edge WFE that is in focus increases, which makes it possible to identify the in-focus Z position. Therefore, after imaging, data such as the in-focus Z position, brightness information, rotation angle θ, optical axis angle φ, and sampling time of the image are referenced, and a profile of the edge WFE of the wafer WF can be formed from them. In this way, the imaging unit 20 can perform three-dimensional measurement.
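The confocal principle in the paragraph above can be expressed as a one-liner: brightness peaks at the in-focus Z, so the in-focus Z is the argmax of brightness over the Z scan of one pixel. This is an illustrative sketch; the function name is an assumption.

```python
# Illustrative one-liner for the confocal principle: reflected brightness
# peaks when the surface is in focus, so the in-focus Z is the Z position
# with maximum brightness in the scan. Function name is an assumption.
def in_focus_z(z_scan):
    """z_scan: iterable of (z_position, brightness) samples for a pixel."""
    return max(z_scan, key=lambda sample: sample[1])[0]
```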
  • the imaging unit 20 is connected to the image processing device 30 via a communication line that includes at least one of wireless and wired lines. Specifically, the imaging unit 20 is connected in a state where it can transmit information including captured image data, Z position, rotation angle θ, optical axis angle φ, and sampling time to the image processing device 30, and it outputs such information to the image processing device 30.
  • the acquisition unit 31 acquires inspection images of the edge WFE of the wafer WF at multiple Z positions from the reflected light R1 collected by the objective lens 21 by rotating the wafer WF around the rotation axis C1 on the stage 10 and moving the Z position.
  • the inspection image includes pixels of the edge WFE corresponding to the rotation angle θ at each Z position.
  • FIG. 5 is a diagram illustrating an inspection image acquired by the acquisition unit 31 in the image processing device 30 according to embodiment 1, where the horizontal axis indicates the rotation angle θ and the vertical axis indicates the pixels capturing the edge WFE of the wafer WF.
  • the acquisition unit 31 acquires multiple inspection images each with a different Z position.
  • Each inspection image associates the rotation angle θ of the edge WFE of the wafer WF with the pixels capturing the edge WFE at that rotation angle θ.
  • the rotation angle θ in the inspection image may include a range of 0° to 360° so as to correspond to the entire circumference of the edge WFE of the wafer WF, or may include a predetermined partial range.
  • the acquisition unit 31 may acquire inspection images when the optical axis angle φ is changed.
  • the optical axis angle φ of the inspection image may include a range from -90° to +90°.
  • the acquisition unit 31 may acquire the inspection image from the first light receiving unit 23a, which receives light of the first wavelength.
  • the inspection image acquired from the first light receiving unit 23a is called the first inspection image. Therefore, in this case, the acquisition unit 31 acquires the first inspection image.
  • the acquisition unit 31 may acquire a second inspection image and a third inspection image similar to those in FIG. 5.
  • the inspection image acquired from the second light receiving unit 23b that receives light of the second wavelength in the reflected light R1 is called the second inspection image
  • the inspection image acquired from the third light receiving unit 23c that receives light of the third wavelength in the reflected light R1 is called the third inspection image.
  • the first inspection image, second inspection image, and third inspection image may each include multiple inspection images with different Z positions.
  • the identification unit 32 identifies the positions of pixels in the inspection image where the brightness is above a predetermined level.
  • the positions of pixels where the brightness is above a predetermined level are called pixel positions P. Note that some reference numerals have been omitted in Figure 5 to avoid cluttering the illustration.
  • the inspection image includes multiple inspection images with different Z positions.
  • the identification unit 32 may also identify pixel positions P where the brightness is above a predetermined level in multiple inspection images with different optical axis angles φ.
  • the above is an example of a process for identifying pixel positions P where the brightness is above a predetermined level in multiple inspection images with different Z positions and different object rotation angles θ.
  • based on multiple inspection images for each of multiple rotation angles θ, the identifying unit 32 may identify pixel positions P where the luminance is equal to or greater than a predetermined value in multiple inspection images, each having a different Z position and a different rotation angle θ of the object, each image associating Z positions with the pixels capturing the edge at those Z positions.
  • the identifying unit 32 may identify pixel positions P where the quality of an evaluation parameter for a pixel is equal to or greater than a predetermined value.
  • a luminance equal to or greater than a predetermined value is an example of the quality of an evaluation parameter for a pixel being equal to or greater than a predetermined value.
  • the evaluation parameters for a pixel may include, in addition to luminance, parameters obtained by normalizing luminance, etc. Furthermore, identifying pixel positions where the luminance or the quality of an evaluation parameter for a pixel is equal to or greater than a predetermined value may also include identifying pixel positions where the luminance or the quality of an evaluation parameter for a pixel is less than the predetermined value and excluding those pixels from processing.
  • Whether the quality of the brightness or pixel evaluation parameters is above a predetermined level may be determined by using the brightness or pixel evaluation parameters when an image is formed in a state where the object is focused to a predetermined degree as a threshold.
  • a state where the object is focused to a predetermined degree may mean just focus, or may include a state including an amount of defocus that is acceptable for the design.
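The thresholding rule in the last few paragraphs might be sketched as follows, assuming the "predetermined value" is derived from the brightness observed in a reference capture focused to a predetermined degree, optionally relaxed by a design-acceptable defocus margin. The names and the margin parameter are hypothetical.

```python
# Hedged sketch of the thresholding rule above: derive the threshold from
# the brightness of a reference capture focused to a predetermined degree,
# optionally relaxed by an acceptable defocus margin (an assumption).
def calibration_threshold(focused_brightness, defocus_margin=0.0):
    """Return the 'predetermined value' used to classify pixels."""
    return focused_brightness * (1.0 - defocus_margin)

def is_high_brightness(brightness, threshold):
    """True if the pixel counts as high-brightness under the threshold."""
    return brightness >= threshold
```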
  • the identification unit 32 may identify a first-wavelength pixel position P1 in a plurality of first inspection images acquired by the acquisition unit 31, each having a different Z position.
  • the identification unit 32 may identify a second-wavelength pixel position in a plurality of second inspection images acquired by the acquisition unit 31, each having a different Z position, or may identify a third-wavelength pixel position in a plurality of third inspection images, each having a different Z position.
  • a pixel position in the first inspection image where the brightness is greater than or equal to a predetermined value is referred to as a first-wavelength pixel position P1
  • a pixel position in the second inspection image where the brightness is greater than or equal to a predetermined value is referred to as a second-wavelength pixel position P2 (not shown)
  • a pixel position in the third inspection image where the brightness is greater than or equal to a predetermined value is referred to as a third-wavelength pixel position P3 (not shown).
  • the storage unit 33 stores the identified pixel position P together with the Z position, rotation angle θ, and sampling time in each inspection image as high-brightness information.
  • the storage unit 33 may also store the optical axis angle φ together with the pixel position P as high-brightness information.
  • the storage unit 33 may also store high-brightness information for multiple optical axis angles φ.
  • the storage unit 33 may store the first-wavelength pixel position P1 together with the Z position, rotation angle θ, and sampling time as first-wavelength high-brightness information in each first inspection image. Furthermore, the storage unit 33 may store the second-wavelength pixel position P2 together with the Z position, rotation angle θ, and sampling time as second-wavelength high-brightness information in each second inspection image, and may store the third-wavelength pixel position P3 together with the Z position, rotation angle θ, and sampling time as third-wavelength high-brightness information in each third inspection image.
  • Generating image information based on high-brightness information by the generation unit 34 may also include the generation unit 34 obtaining statistical values such as the average and standard deviation based on the brightness at multiple pixel positions P indicated by the multiple pieces of high-brightness information, or obtaining interpolated values based on the brightness at multiple pixel positions P, and outputting this as brightness level information or height information converted into physical height.
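The statistics mentioned above can be sketched with the standard library: a mean and standard deviation over the brightnesses at several pixel positions P, plus a simple linear interpolation between two pixel positions. The function names are assumptions.

```python
# Sketch of the statistical values mentioned above, stdlib only.
import statistics

def brightness_stats(brightnesses):
    """Return (mean, population standard deviation) of the brightnesses
    at multiple pixel positions P."""
    return statistics.fmean(brightnesses), statistics.pstdev(brightnesses)

def interpolate_brightness(p0, b0, p1, b1, p):
    """Linearly interpolate brightness at pixel position p between the
    samples (p0, b0) and (p1, b1)."""
    t = (p - p0) / (p1 - p0)
    return b0 + t * (b1 - b0)
```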
  • FIG. 6 is a diagram illustrating a profile of the edge WFE of the wafer WF generated by the generation unit 34 in the image processing device 30 according to the first embodiment.
  • FIG. 6 also shows a case where the optical axis angle φ takes multiple values φ1 to φ5. Note that an optical axis angle φ of -90° to +90° is one example. Another example is dividing the optical axis angle range of -90° to +90° into φ1 to φ5.
  • the generation unit 34 may generate a profile of the cross section of the edge WFE as image information.
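One way to picture how a cross-section profile could be assembled is to merge, in angle order, the in-focus points contributed at each of the multiple optical axis angles shown in FIG. 6. This is an assumption-laden sketch, not the patent's algorithm; the names and the point representation are made up.

```python
# Assumption-laden sketch: merge per-optical-axis-angle in-focus points
# into one ordered cross-section profile of the edge. points_by_angle
# maps an optical axis angle to its list of (r, z) points.
def edge_cross_section(points_by_angle):
    profile = []
    for angle in sorted(points_by_angle):
        profile.extend(points_by_angle[angle])
    return profile
```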
  • the generation unit 34 may generate a color image of the edge WFE of the wafer WF expressed in multiple colors based on the first wavelength high brightness information and the second wavelength high brightness information.
  • the generation unit 34 may also generate a color image of the edge WFE of the wafer WF expressed in multiple colors based on the first wavelength high brightness information, the second wavelength high brightness information, and the third wavelength high brightness information.
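A toy sketch of combining the per-wavelength high-brightness information into one color value per pixel, assuming (a mapping the text does not specify) that the first, second, and third wavelengths correspond to the R, G, and B channels:

```python
# Toy sketch: assemble an (R, G, B) value per pixel from three
# wavelengths' high-brightness information. Each *_info maps a pixel
# position to its in-focus brightness; missing pixels default to 0.
# The wavelength-to-channel mapping is an assumption.
def color_pixel(first_info, second_info, third_info, pixel):
    return (first_info.get(pixel, 0),
            second_info.get(pixel, 0),
            third_info.get(pixel, 0))
```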
  • the generation unit 34 may generate a reference image of the edge WFE of the wafer WF based on the image information.
  • the reference image is an image of the edge WFE of an ideal wafer WF (a wafer WF that can be said to be defect-free) used to inspect the edge WFE of the wafer WF for defects, etc.
  • the control unit 35 controls the operation of the acquisition unit 31, identification unit 32, storage unit 33, and generation unit 34 in the image processing device 30.
  • the control unit 35 may also control the operation of the drive unit 12 of the stage 10, the drive unit 22 of the imaging unit 20, the light receiving unit 23, and the light source 24 in the inspection device 1.
  • the control unit 35 may inspect the wafer WF for defects, etc. based on the image information.
  • the inspection target wafer WF may be inspected by comparing image information of an ideal wafer WF with image information of the inspection target wafer WF.
  • the inspection target wafer WF may be inspected using a profile, color image, or image of the edge WFE generated based on the image information.
  • the identification unit 32 is caused to identify pixel position P.
  • the control unit 35 causes the identification unit 32 to identify pixel position P where the brightness is equal to or greater than a predetermined value in a plurality of inspection images each having a different Z position.
  • the control unit 35 may cause the identification unit 32 to identify first-wavelength pixel position P1 in a plurality of first inspection images, to identify second-wavelength pixel position P2 in a plurality of second inspection images, and to identify third-wavelength pixel position P3 in a plurality of third inspection images.
  • the control unit 35 may cause the identification unit to identify pixel positions where the brightness is equal to or greater than a predetermined value in a plurality of inspection images each having a different Z position and a different rotation angle θ of the object.
  • the control unit 35 stores the identified pixel position P in the storage unit 33 as high-brightness information, along with the Z position, rotation angle θ, and sampling time in each inspection image.
  • the control unit 35 may store the optical axis angle φ together with the pixel position P in the storage unit 33 as high-brightness information.
  • the control unit 35 may also store high-brightness information for multiple optical axis angles φ in the storage unit 33.
  • the control unit 35 causes the generation unit 34 to generate image information regarding the edge WFE of the wafer WF based on the stored multiple pieces of high-brightness information.
  • the control unit 35 may cause the generation unit 34 to generate image information based on the high-brightness information at multiple optical axis angles φ relative to the wafer WF.
  • the control unit 35 may also cause the generation unit 34 to generate image information regarding the rotation angle θ of the entire circumference of the edge WFE of the wafer WF based on the high-brightness information.
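The full-circumference image information mentioned above amounts to sampling a per-angle result at rotation angles stepping through 0° to 360°. A hedged sketch, where profile_at is a hypothetical callback producing the result for one rotation angle:

```python
# Hedged sketch of full-circumference generation: step the rotation
# angle through 0 to 360 degrees and collect a per-angle result.
# profile_at is a hypothetical callback, not a name from the patent.
def full_circumference(profile_at, step_deg=1.0):
    results = {}
    theta = 0.0
    while theta < 360.0:
        results[theta] = profile_at(theta)
        theta += step_deg
    return results
```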
  • control unit 35 may cause the generation unit 34 to generate a color image of the edge WFE of the wafer WF displayed in multiple colors based on the first-wavelength high-brightness information, second-wavelength high-brightness information, and third-wavelength high-brightness information.
  • Fig. 8 is a flow chart illustrating an inspection method using the inspection device 1 according to the first embodiment.
  • in step S22, the reflected light R1 reflected by the edge WFE of the wafer WF is focused by the objective lens 21.
  • the control unit 35 drives the drive unit 22 to control the position of the objective lens 21 so that the objective lens 21 focuses the reflected light R1.
  • in step S23, the Z position is moved by the drive unit 22.
  • the control unit 35 drives the drive unit 22 to move the Z position in the optical axis direction.
  • the imaging unit 20 captures an inspection image.
  • the control unit 35 causes the light receiving unit 23 to receive reflected light and capture the inspection image.
  • in step S25, the inspection image is subjected to image processing by the image processing device 30.
  • the image processing method using the image processing device 30 is as described above. Note that a step of inspecting the object based on the image information may also be included.
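The steps above (move the Z position, capture an inspection image, repeat over rotation angles, then hand the stack to image processing) can be condensed into a simple acquisition loop. This is an illustrative sketch; capture is a hypothetical stand-in for the imaging unit 20.

```python
# Illustrative condensation of the flowchart's acquisition steps: for
# each rotation angle, step through the Z positions and capture an
# inspection image, collecting the stack that image processing consumes.
# capture(z, theta) is a hypothetical stand-in for the imaging unit.
def measurement_loop(capture, z_positions, rotation_angles):
    stack = []
    for theta in rotation_angles:
        for z in z_positions:
            stack.append((z, theta, capture(z, theta)))
    return stack
```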
  • the image processing device 30 of this embodiment acquires an inspection image of the edge WFE by changing the Z position of the objective lens 21, which collects reflected light R1 from the edge WFE of the wafer WF, while rotating the wafer WF around the rotation axis C1.
  • the image processing device 30 then identifies pixel positions P from the inspection image where the brightness is above a predetermined level, and generates image information of the edge WFE based on these pixel positions P.
  • This allows the image processing device 30 to obtain profiles of the edge WFE for multiple rotation angles θ. Specifically, it is possible to obtain profile information of the edge WFE around the entire periphery of the wafer WF. Therefore, the image processing device 30 can measure information about objects such as the wafer WF quickly and accurately.
  • the image processing device 30 of this embodiment stores data on multiple inspection images acquired by the acquisition unit 31 in the memory unit 33. Therefore, if there is a singularity in the profile of the edge WFE due to a measurement error or the like, corrections such as interpolation can be made to the singularity based on the data surrounding the singularity stored in the memory unit 33, thereby improving the accuracy of the profile of the edge WFE of the wafer WF.
  • the image processing device 30 of this embodiment scans the objective lens 21 in the optical axis direction regardless of the focal length of the objective lens 21. This makes it possible to obtain an image in which the object is in focus. This makes an autofocus function unnecessary.
  • In related devices that capture color images, light receiving elements corresponding to multiple colors are used to obtain the color image. In this case, each light receiving element must be in focus at the same time. If each light receiving element is not in focus at the same time, chromatic aberration will cause the colors to be inaccurate when a color image is generated.
  • the image processing device 30 of this embodiment generates a color image based on the focused first wavelength high brightness information, second wavelength high brightness information, and third wavelength high brightness information at each light receiving unit 23 (first light receiving unit 23a, second light receiving unit 23b, and third light receiving unit 23c) while scanning the objective lens 21 in the optical axis direction.
  • the first inspection image, second inspection image, and third inspection image are combined to generate a color image. This makes it possible to generate a color image quickly and with high accuracy.
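As an illustrative sketch of this combination (assuming each wavelength's inspection images are held as a numpy focus stack of shape (Z, height, width)), each channel can pick its own in-focus brightness before the channels are merged; because each channel selects its best-focus Z independently, longitudinal chromatic aberration does not shift the colors:

```python
import numpy as np

def compose_color(stack_r, stack_g, stack_b):
    """Combine three per-wavelength focus stacks (Z, H, W) into one color
    image (H, W, 3) by taking, per pixel, the brightness at each
    channel's own best-focus Z (its maximum over Z)."""
    channels = [s.max(axis=0) for s in (stack_r, stack_g, stack_b)]
    return np.stack(channels, axis=-1)
```

The stack layout and the max-over-Z rule are assumptions for illustration, not the exact combination method of the generation unit 34.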
  • the image processing device 30 and inspection device 1 of this embodiment use a device equipped with a confocal optical microscope, and can simultaneously generate high-resolution color images and cross-sectional profiles of the wafer WF.
  • the inspection device 1 may be configured to include a confocal optical imaging unit 20, a stage 10 having a rotation axis C1, and multiple light receiving units 23 that detect different wavelengths. While rotating the stage 10 at high speed, scanning is performed by moving the objective lens 21 in the direction of the optical axis C2 relative to the focus position.
  • the image processing device and inspection device 1 of this embodiment can be expected to have the following effects.
  • the first is the cost reduction effect. Because the wafer WF profile information and the inspection image used for defect inspection are captured simultaneously, the number of devices used can be reduced, resulting in reduced labor and costs.
  • the second is to obtain a profile of unevenness that is larger than the focal length.
  • the image processing device 30 of this embodiment can obtain focused images even for wafers WF with large uneven shapes.
  • Recent semiconductor devices tend to be formed three-dimensionally, and many wafers WF have surface unevenness of several tens of μm or more. For this reason, it is difficult to obtain an image in focus over the entire surface with the focal length of a typical confocal optical microscope.
  • the third is the reduction of chromatic aberration in color images.
  • brightness increases where the image is in focus and decreases where it is not. For this reason, if there is a discrepancy in the timing at which multiple cameras achieve focus, the colors will not be displayed correctly.
  • an image in which each light receiving unit 23 is in focus can be obtained by scanning along the optical axis. Therefore, by combining the images of each color later, a color image in which all colors are in focus can be generated.
  • Next, an image processing device according to a second embodiment will be described. In the following, the description of the same configuration as in the first embodiment may be omitted.
  • the image processing device of this embodiment may be called an inspection device.
  • the inspection device 1 of the first embodiment described above may be called an image processing device.
  • the image processing device 30 will be called an image processing unit 30a.
  • the <Image processing device> and <Image processing unit> will be described first, followed by the <Image processing method>.
  • FIG. 9 is a configuration diagram illustrating an image processing device 2 according to the second embodiment.
  • FIG. 10 is a cross-sectional view illustrating an example of an object in the image processing device 2 according to the second embodiment.
  • FIG. 11 is a plan view illustrating an example of an object in the image processing device 2 according to the second embodiment.
  • the image processing device 2 includes a support unit 40, a drive unit 12, a drive unit 22, an imaging unit 20, and an image processing unit 30a.
  • the image processing device 2 processes, for example, captured images of the object.
  • the image processing device 2 may inspect the object. In that case, the captured images include an inspection image.
  • the image processing device 2 may process captured images of the edge of the object and captured images of the main surface of the object.
  • the object includes, for example, a wafer WF.
  • the edge of the object includes the edge WFE of the wafer WF.
  • the main surface of the object includes the wafer surface WF1 of the wafer WF. Note that the main surface of the object does not exclude including the back surface of the wafer WF.
  • the object may be referred to as a wafer WF.
  • the object is not limited to a wafer WF, and may include plate-like objects such as semiconductor chips and printed circuit boards, as long as it has an edge and a main surface, or may be an object other than a plate-like object.
  • an rθ polar coordinate system centered on the rotation axis C1 is introduced for the XY plane in the XYZ Cartesian coordinate system.
  • the radial position r indicates the distance from the rotation axis C1.
  • the rotation angle θ indicates the angle between the X-axis and the −Y-axis direction.
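Assuming the garbled symbols above denote an rθ polar system defined on the XY plane of an XYZ Cartesian system, the conversion between the two coordinate systems can be sketched as follows (the sign convention for θ is an assumption):

```python
import math

def polar_to_xy(r, theta_deg):
    """Convert a point (r, theta) in the polar system centered on the
    rotation axis C1 into XY Cartesian coordinates."""
    t = math.radians(theta_deg)
    return r * math.cos(t), r * math.sin(t)

def xy_to_polar(x, y):
    """Inverse conversion: distance from the rotation axis and rotation
    angle normalized to the range [0, 360)."""
    r = math.hypot(x, y)
    theta = math.degrees(math.atan2(y, x)) % 360.0
    return r, theta
```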
  • the support unit 40 supports the object.
  • the support unit 40 may include, for example, a stage 10.
  • the stage 10 supports a wafer WF placed on a stage surface 11.
  • the stage 10 may be connected to a drive unit 12 such as a motor.
  • the drive unit 12 rotates the stage 10 around a rotation axis C1.
  • the stage 10 may have a three-axis adjustment mechanism or the like that adjusts the position of the stage 10 and the gradient of the stage surface 11.
  • the drive unit 12 may move the stage 10 in a direction perpendicular to the rotation axis C1.
  • the drive unit 12 may also move the stage 10 in a direction parallel to the rotation axis C1.
  • the drive unit 12 may be able to change the relative position between the object and the focusing position of the objective lens 21 in a direction parallel to the rotation axis C1.
  • the support unit 40 supports the object so that it can rotate around the rotation axis C1.
  • the support unit 40 also supports the object so that it can move in a direction perpendicular to the rotation axis C1.
  • the support unit 40 supports the object so that it can move in the X-axis direction.
  • the support unit 40 may sense the amount of movement in the X-axis direction using the sensor 13. The amount of movement of the support unit 40 in the X-axis direction corresponds to the radial position r at which the imaging unit 20 images the wafer surface WF1.
  • the support unit 40 supports the object so that it can move in a direction parallel to the rotation axis C1.
  • the support unit 40 supports the object so that it can move in the Z-axis direction.
  • the support unit 40 may sense the Z position from the amount of movement in the Z-axis direction using the sensor 13.
  • the support unit 40 is not limited to the stage 10, as long as it can support the object so that it can rotate around the rotation axis C1 and so that it can move in a direction perpendicular to and parallel to the rotation axis C1.
  • the support unit 40 is connected to the image processing unit 30a via a communication line that can be either wireless or wired.
  • the support unit 40 is connected in a state where it can transmit information including data on the radial position r, rotation angle ⁇ , and Z position to the image processing unit 30a.
  • the support unit 40 outputs this information.
  • the drive unit 12 rotates the stage 10 around the rotation axis C1. Therefore, the drive unit 12 rotates the object around the rotation axis C1.
  • the drive unit 12 may change the position of the stage 10 by driving a three-axis adjustment mechanism or the like.
  • the drive unit 12 can change the relative position (called the radial position r) between the object and the focusing position of the objective lens 21 in a direction perpendicular to the rotation axis C1.
  • the drive unit 12 moves the stage 10 in the X-axis direction perpendicular to the rotation axis C1. Therefore, the drive unit 12 moves the object in the X-axis direction.
  • the drive unit 12 also moves the stage 10 in the Z-axis direction parallel to the rotation axis C1. Therefore, the drive unit 12 moves the object in the Z-axis direction.
  • the drive unit 12 may fix the Z position when changing the radial position r in a direction perpendicular to the rotation axis C1.
  • the drive unit 12 may fix the position of the objective lens 21 in the Z-axis direction when changing the radial position r in a direction perpendicular to the rotation axis C1.
  • the drive unit 12 may fix the Z position to a first position, a second position, etc. when changing the radial position r in a direction perpendicular to the rotation axis C1.
  • the drive unit 12 may fix the position of the objective lens 21 in the Z-axis direction to a first position, a second position, etc. when changing the radial position r in a direction perpendicular to the rotation axis C1. Specifically, as shown by the support unit 40 on the left side of Figure 11, the drive unit 12 fixes the Z position to the first position when moving the support unit 40 in the −X-axis direction while rotating it. In contrast, as shown by the support unit 40 on the right side of Figure 11, when the drive unit 12 rotates the support unit 40 and moves it in the +X-axis direction, the drive unit 12 fixes the Z position at the second position.
  • the first and second positions may be different in the Z-axis direction. The difference in the Z-axis direction between the first and second positions is preferably within the focal depth of the optical system including the objective lens 21.
  • the drive unit 12 may set the optical axis angle φ to approximately 90°. That is, when the drive unit 12 changes the radial position r in a direction perpendicular to the rotation axis C1, the drive unit 12 may set the optical axis angle φ so that the optical axis C2 of the objective lens 21 and the rotation axis C1 are approximately parallel. Note that, in the above description, the drive unit 12 changes the radial position r in a direction perpendicular to the rotation axis C1, fixes the Z position at the first position, the second position, etc., or sets the optical axis angle φ. However, instead of or in addition to this, the drive unit 22 may perform these operations.
  • the imaging unit 20 captures an image of an object.
  • the imaging unit 20 captures an image of the edge WFE and wafer surface WF1 of the wafer WF.
  • the imaging unit 20 includes, for example, an objective lens 21, a drive unit 22, and a light receiving unit 23.
  • the imaging unit 20 may further include a light source 24, a pinhole 25, a beam splitter 26, an optical element 27, an optical element 28, and a sensor 29. Note that, as long as the imaging unit 20 can capture images of the edge WFE and wafer surface WF1 of the wafer WF, any of the above optical components may be replaced with other optical components, or other optical components may be included in addition to the above optical components.
  • the imaging unit 20 captures an image of the edge WFE from the reflected light R1 reflected by the edge WFE of the wafer WF.
  • the imaging unit 20 also captures an image of the wafer surface WF1 from the reflected light R1 reflected by the wafer surface WF1.
  • this may be simply described as the imaging unit 20 capturing images of the edge WFE and wafer surface WF1 from the reflected light R1 reflected by the edge WFE and wafer surface WF1 of the wafer WF.
  • the reflected light R1 reflected by at least one of the edge WFE and wafer surface WF1 of the wafer WF may be simply described as the reflected light R1 reflected by the edge WFE and wafer surface WF1 of the wafer WF.
  • the objective lens 21 focuses the reflected light R1 reflected by the object. Specifically, for example, the objective lens 21 focuses the reflected light R1 reflected by the edge WFE and wafer surface WF1 of the wafer WF.
  • the reflected light R1 may be illumination light L1 emitted from the light source 24, reflected by at least one of the edge WFE and wafer surface WF1 of the wafer WF.
  • the objective lens 21 has an optical axis C2. The direction of the optical axis C2 is called the optical axis direction.
  • At least one of the drive units 12 and 22 can change the Z position, which is the relative position between the object and the focusing position of the objective lens 21 in the optical axis direction of the objective lens 21.
  • the drive unit 22 moves the position of the objective lens 21 in the optical axis direction.
  • At least one of the drive units 12 and 22 can also change the optical axis angle φ, which is the angle of the optical axis C2 of the objective lens 21 with respect to a plane perpendicular to the rotation axis C1 of the stage 10.
  • at least one of the drive units 12 and 22 can change the radial position r, which is the relative position between the object and the focusing position of the objective lens 21 in a direction perpendicular to the rotation axis C1 of the stage 10.
  • the drive unit 12 changes the radial position r of the objective lens 21 by moving the stage 10 in the X-axis direction.
  • FIGS. 12 and 13 are diagrams illustrating positions G1 and G2 at which the imaging unit 20 captures an image when the support unit 40 rotates in the image processing device 2 according to the second embodiment.
  • information about position G1 at which the imaging unit 20 captures an image includes sampling time t1, radial position r1, and rotation angle θ1.
  • rotation of the support unit 40, such as the stage 10, causes the rotation angle θ to change from θ1 to θ2.
  • movement of the support unit 40, such as the stage 10, in the −X-axis direction causes the radial position r to change from r1 to r2.
  • information about position G2 at which the imaging unit 20 captures an image includes sampling time t2, radial position r2, and rotation angle θ2.
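Under the simplifying assumption of constant rotation and radial feed speeds, the position G sampled at time t can be computed as below; `sampling_position` is an illustrative helper, not part of the disclosed device:

```python
def sampling_position(t, r0, radial_speed, angular_speed_deg):
    """Position G at sampling time t for an assumed constant-speed spiral
    scan: the stage rotates at angular_speed_deg deg/s while the radial
    position changes at radial_speed units/s from the start radius r0."""
    r = r0 + radial_speed * t
    theta = (angular_speed_deg * t) % 360.0  # rotation angle wraps at 360
    return r, theta
```

For example, with the stage rotating at 90 deg/s and moving inward at 0.5 units/s, positions G1 and G2 at successive sampling times differ in both r and θ, as in FIGS. 12 and 13.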
  • the imaging unit 20 may have a confocal optical system. This allows the imaging unit 20 to capture images so that the edge WFE and wafer surface WF1 of the wafer WF are in focus.
  • the imaging unit 20 and the like may move the Z position of the objective lens 21 during imaging without paying attention to focus.
  • the brightness of reflected light R1 from the focused positions of the edge WFE and wafer surface WF1 increases, so the focused Z position can be identified. Accordingly, after imaging, various data such as the focused radial position r, Z position, brightness information, evaluation parameters, rotation angle θ, optical axis angle φ, and sampling time of the image are referenced. Profiles of the edge WFE and wafer surface WF1 of the wafer WF can be formed using the referenced data. In this way, the imaging unit 20 enables three-dimensional measurement.
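One possible way to organize the referenced data is to keep only the samples whose brightness reaches the threshold as high-brightness records; the record layout here is an assumption for illustration, not the storage format of the memory unit 33:

```python
from dataclasses import dataclass

@dataclass
class HighBrightnessRecord:
    """One focused sample: where it was measured and how bright it was."""
    radial_position: float
    z_position: float
    rotation_angle: float
    brightness: float

def collect_high_brightness(samples, threshold):
    """Keep only the (r, Z, theta, brightness) samples whose brightness
    reaches the threshold, i.e. the combinations that were in focus."""
    return [HighBrightnessRecord(*s) for s in samples if s[3] >= threshold]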
  • the imaging unit 20 is connected to the image processing unit 30a via a communication line that includes at least one of wireless and wired connections. Specifically, the imaging unit 20 is connected in a state where it can transmit information including image data, radial position r, Z position, rotation angle θ, optical axis angle φ, sampling time, etc. to the image processing unit 30a. The imaging unit 20 outputs this information to the image processing unit 30a.
  • FIG. 14 is a block diagram illustrating the image processing unit 30a according to the second embodiment.
  • the image processing unit 30a includes an acquisition unit 31, an identification unit 32, a storage unit 33, a generation unit 34, and a control unit 35, similar to the image processing device 30 described above. Note that the image processing unit 30a does not necessarily have to include the storage unit 33.
  • the acquisition unit 31 acquires an image of the object via the objective lens 21.
  • the acquisition unit 31 acquires an image of the end or main surface of the object via the objective lens 21 by changing the Z position while rotating the object around the rotation axis C1 using at least one of the drive units 12 and 22.
  • the acquisition unit 31 acquires multiple images with different Z positions and different rotation angles θ of the object.
  • FIG. 15 is a diagram illustrating an example of an image acquired by the acquisition unit 31 in the image processing unit 30a according to the second embodiment, where the horizontal axis represents the rotation angle θ and the vertical axis represents the pixels capturing an image of an area on the wafer surface WF1 of the wafer WF at a predetermined radial position r.
  • the acquisition unit 31 acquires multiple images with different Z positions and radial positions r. Each image corresponds to the rotation angle θ of position G on the wafer surface WF1 of the wafer WF captured by the imaging unit 20, and the pixels capturing the image of the wafer surface WF1 at the rotation angle θ.
  • the rotation angle θ in the captured image may include a range of 0° to 360° so as to correspond to the entire circumference surrounding the rotation axis C1 of the wafer surface WF1 of the wafer WF, or may include a predetermined partial range.
  • the acquisition unit 31 may acquire multiple captured images with the radial position r changed. For example, the acquisition unit 31 may acquire each captured image when the radial position r is changed from r1 to r3 or more. Furthermore, the acquisition unit 31 may acquire multiple captured images when the Z position is moved in the Z-axis direction at a predetermined distance within the focal depth of the optical system including the objective lens 21, such as from the first position Z1 to the second position Z2 to the third position Z3, etc.
  • For example, when the Z position is the first position Z1, multiple captured images may be acquired from the radial positions r1 to r3 or more; when the Z position is the second position Z2, multiple captured images may be acquired from the radial positions r1 to r3 or more; and when the Z position is the third position Z3, multiple captured images may be acquired from the radial positions r1 to r3 or more.
  • in the captured image, the horizontal axis represents the rotation angle θ and the vertical axis represents the pixels capturing the area on the wafer surface WF1 of the wafer WF at the predetermined radial position r.
  • FIG. 16 is a diagram illustrating the correspondence between the positions of pixels included in each captured image and the positions on the wafer surface WF1 of the wafer WF when the radial position r increases at each sampling time t in the image processing unit 30a according to the second embodiment.
  • the pixels included in the captured image may correspond to positions on the wafer surface WF1 of the wafer WF at a radial position between r01 and r0E and at a rotation angle between 0° and 360°.
  • radial position r01 may be the radial position at which the measurement target area closest to the center of the wafer surface WF1 will be included in the captured image when the radial position r is increased by a small amount from radial position r01 at a rotation angle of 0°.
  • radial position r0E may be the radial position at which the measurement target area closest to the edge WFE of the wafer WF will be included in the captured image when the radial position is decreased by a small amount from radial position r0E at a rotation angle of 0°. This allows the measurement target area on the wafer surface WF1 to be included in the captured image without any deficiencies.
  • In Figure 16, the pixels included in the captured image are arranged in a line extending in one direction, but the arrangement is not limited to this. Also, Figure 16 shows an example where the radial position r increases sequentially; if the radial position r decreases sequentially, the above explanation can be followed in reverse.
  • the acquisition unit 31 can acquire the captured image shown in FIG. 15 based on captured images of the wafer surface WF1 of the wafer WF at multiple sampling times t, as shown in the example of FIG. 16.
  • the acquisition unit 31 may acquire captured images from the first light receiving unit 23a that receives light of a first wavelength.
  • the captured image acquired from the first light receiving unit 23a is referred to as the first captured image.
  • the acquisition unit 31 may acquire captured images from the second light receiving unit 23b that receives light of a second wavelength.
  • the captured image acquired from the second light receiving unit 23b is referred to as the second captured image.
  • the acquisition unit 31 may acquire captured images from the third light receiving unit 23c that receives light of a third wavelength.
  • the captured image acquired from the third light receiving unit 23c is referred to as the third captured image.
  • the acquisition unit 31 may acquire second and third inspection images similar to those in FIG. 5.
  • the first, second, and third captured images may each include multiple inspection images with different radial positions r and Z positions.
  • the control unit 35 controls the support unit 40 and drive units 12 and 22, and causes the acquisition unit 31 to acquire multiple captured images.
  • the control unit 35 also controls the operation of the acquisition unit 31, identification unit 32, memory unit 33, and generation unit 34.
  • the control unit 35 causes the acquisition unit 31 to acquire multiple captured images of the end of the object by changing the Z position using at least one of the drive units 12 and 22 while rotating the object around the rotation axis C1 using the support unit 40.
  • the control unit 35 rotates the object around the rotation axis C1 using the support unit 40 while causing the drive unit 22 to change the radial position r in a first direction perpendicular to the rotation axis and, while maintaining the Z position at the first position, causes the acquisition unit 31 to acquire multiple captured images of the object's main surface. Furthermore, while rotating the object around the rotation axis C1 using the support unit 40, the control unit 35 changes the radial position r in a second direction perpendicular to the rotation axis using the drive unit 22 and, while maintaining the Z position at the second position, causes the acquisition unit 31 to acquire multiple captured images of the object's main surface.
  • the second direction is the opposite direction to the first direction.
  • the second position is a position different from the first position. Furthermore, the difference between the first position and the second position is within the focal depth of the optical system of the imaging unit 20, including the objective lens 21. Note that in this embodiment, the drive unit 12 may be driven instead of or in addition to the drive unit 22.
  • the identification unit 32 identifies pixel position P, which is the position of a pixel where the brightness is above a predetermined level, in a plurality of captured images that each have a different Z position and a different rotation angle θ of the object.
  • the identification unit 32 may also identify pixel position P, which is the position of a pixel where the brightness is above a predetermined level, in a plurality of captured images that each have a different radial position r, Z position, rotation angle θ, and optical axis angle φ. Note that, as described above, brightness above a predetermined level is an example of the quality of the evaluation parameters for a pixel being above a predetermined level.
  • the identification unit 32 may identify a first-wavelength pixel position P1 in a plurality of first captured images acquired by the acquisition unit 31, each having a different radial position r or Z position.
  • the identification unit 32 may identify a second-wavelength pixel position in a plurality of second captured images acquired by the acquisition unit 31, each having a different radial position r or Z position, or may identify a third-wavelength pixel position in a plurality of third captured images, each having a different radial position r or Z position.
  • a pixel position in the first captured image where the brightness is greater than or equal to a predetermined value is referred to as a first-wavelength pixel position P1.
  • a pixel position in the second captured image where the brightness is greater than or equal to a predetermined value is referred to as a second-wavelength pixel position P2 (not shown).
  • a pixel position in the third captured image where the brightness is greater than or equal to a predetermined value is referred to as a third-wavelength pixel position P3 (not shown).
  • the storage unit 33 stores the identified pixel position P together with the Z position and rotation angle θ in each captured image as high-brightness information.
  • the storage unit 33 may also store the identified pixel position P together with the radial position r, Z position, rotation angle θ, optical axis angle φ, and sampling time in each captured image as high-brightness information.
  • the generation unit 34 may generate image information of the object based on multiple pieces of high-brightness information that have been stored.
  • the generation unit 34 may also generate image information of the object based on pixels with a predetermined or higher brightness.
  • the generation unit 34 may generate image information of the object at a specific rotation angle θ based on pixels with a predetermined or higher brightness in multiple captured images that have different Z positions at the specific rotation angle θ.
  • the generation unit 34 may generate image information of the object at multiple specific rotation angles θ based on pixels whose brightness is a predetermined value or higher in multiple captured images whose Z positions are different and whose rotation angle θ is a first rotation angle θ1, and pixels whose brightness is a predetermined value or higher in multiple captured images whose Z positions are different and whose rotation angle θ is a second rotation angle θ2.
  • the generation unit 34 may generate the above-mentioned image information for multiple radial positions r. In this way, the generation unit 34 may generate image information of the main surface of the object.
  • the image information may include a profile of the main surface of the object.
  • FIG. 17 is a diagram illustrating a profile of the wafer surface WF1 of the wafer WF generated by the generation unit 34 in the image processing device 2 according to the second embodiment.
  • FIG. 17 also shows the case where the radial position r is multiple values r1 to r5.
  • the generation unit 34 may generate a profile of the cross section of the wafer surface WF1 of the wafer WF as image information.
  • a pixel whose brightness (goodness of the pixel evaluation parameter) is equal to or greater than a predetermined value can be considered to be appropriately focused on the surface of the edge WFE or the wafer surface WF1 at that Z position, e.g., just focused. Therefore, the generation unit 34 can generate a profile of the cross section of the wafer surface WF1 or the edge WFE of the wafer WF based on the Z position at that time.
  • the generation unit 34 may generate image information about the rotation angle θ of the entire circumference of the main surface of the object based on the high-brightness information or pixels with a brightness equal to or greater than a predetermined value. For example, the generation unit 34 may generate a cross-sectional profile about the rotation angle θ of the entire circumference of the wafer surface WF1 of the wafer WF.
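A minimal sketch of grouping such high-brightness samples into per-angle cross-sectional profiles (the tuple layout of the samples is an assumption for illustration):

```python
from collections import defaultdict

def cross_section_profiles(records):
    """Group focused (theta, r, Z) samples by rotation angle theta so
    that each angle yields one cross-sectional profile of the wafer
    surface. `records` is an iterable of (theta, r, z) tuples for pixels
    whose brightness was above the threshold."""
    profiles = defaultdict(dict)
    for theta, r, z in records:
        profiles[theta][r] = z
    # sort each profile by radial position for plotting
    return {theta: sorted(p.items()) for theta, p in profiles.items()}
```

Iterating theta over the full 0° to 360° range then yields profiles around the entire circumference.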
  • the generation unit 34 may generate image information of the end of the object, as in the first embodiment described above. That is, the generation unit 34 generates image information of the end of the object at a specific rotation angle θ based on pixels whose brightness is above a predetermined level in multiple captured images where the rotation angle θ is the specific rotation angle θ, the optical axis angle φ is a first optical axis angle, and the Z positions are different from each other, and pixels whose brightness is above a predetermined level in multiple captured images where the rotation angle θ is the specific rotation angle θ, the optical axis angle φ is a second optical axis angle, and the Z positions are different from each other.
  • Generating image information based on high-brightness information or pixels whose brightness is above a predetermined level may include the generation unit 34 outputting brightness level information for the brightness of pixels whose brightness is above a predetermined level, or height information indicating the physical height based on the Z position at that time. Furthermore, generating image information based on high-brightness information or pixels with a predetermined or higher brightness may include the generation unit 34 obtaining statistical values such as the average or standard deviation of multiple brightnesses for multiple pixels with a predetermined or higher brightness, or obtaining values interpolated based on those multiple brightnesses, and outputting the result as brightness level information or as height information converted into physical height. The generation unit 34 may also generate a color image of the object represented in multiple colors based on pixels with a predetermined or higher brightness in the first captured image, the second captured image, etc.
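As one illustration of the statistical reduction mentioned above, several above-threshold Z candidates for the same pixel can be reduced to a mean height and a spread estimate; this is a sketch of one possible statistic, not the device's prescribed method:

```python
import statistics

def height_estimate(z_candidates):
    """When several Z positions exceed the brightness threshold for the
    same pixel, reduce them to one height value (mean) plus a spread
    estimate (population standard deviation)."""
    mean = statistics.fmean(z_candidates)
    spread = statistics.pstdev(z_candidates)
    return mean, spread
```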
  • the control unit 35 may inspect the wafer WF for defects, etc. based on the image information.
  • the inspection target wafer WF may be inspected by comparing image information of an ideal wafer WF with image information of the inspection target wafer WF.
  • the inspection target wafer WF may be inspected using a profile, color image, or image of the edge WFE generated based on the image information.
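A comparison-based inspection like the one described can be sketched as follows, assuming the image information of the ideal wafer and the inspection target wafer is available as numpy height or brightness maps; the tolerance-based rule is an illustrative assumption:

```python
import numpy as np

def find_defects(reference, measured, tolerance):
    """Compare the measured map against a reference (e.g. an ideal
    wafer WF), flagging pixels whose deviation exceeds the tolerance
    as defect candidates. Returns an array of (row, col) indices."""
    diff = np.abs(measured - reference)
    return np.argwhere(diff > tolerance)
```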
  • Fig. 18 is a flow chart illustrating an image processing method using the image processing device 2 according to the second embodiment.
  • the reflected light R1 is focused by the objective lens 21.
  • the control unit 35 drives the drive unit 22 to control the position of the objective lens 21 so that the reflected light R1 reflected by the object is focused by the objective lens 21.
  • In step S34, the acquisition unit 31 is caused to acquire the captured image.
  • the control unit 35 controls the acquisition unit 31 to acquire the captured image.
  • the control unit 35 causes the acquisition unit 31 to acquire the captured image of the object via the objective lens 21.
  • the control unit 35 may also cause the acquisition unit 31 to acquire multiple captured images.
  • the control unit 35 may cause the acquisition unit 31 to acquire multiple captured images of the end of the object by rotating the object around the rotation axis C1 using the support unit 40 and changing the Z position using at least one of the drive units 12 and 22.
  • the control unit 35 may rotate the object around the rotation axis C1 using the support unit 40 while changing the radial position in a first direction perpendicular to the rotation axis using the drive unit and keeping the Z position constant at the first position, causing the acquisition unit 31 to acquire multiple captured images of the main surface of the object.
  • the control unit 35 may rotate the object around the rotation axis C1 using the support unit 40 while changing the radial position in a second direction perpendicular to the rotation axis using the drive unit and keeping the Z position constant at the second position, causing the acquisition unit 31 to acquire multiple captured images of the main surface of the object.
  • in step S35, the control unit 35 causes the identification unit 32 to identify the pixel position P, which is the position of a pixel whose brightness is equal to or greater than a predetermined value, in multiple captured images each having a different Z position and a different rotation angle θ of the object.
  • the image processing method of this embodiment may include, between steps S35 and S36, a step of storing the identified pixel position P in the storage unit 33 as high-brightness information, together with the radial position r, the Z position, the rotation angle θ, the optical axis angle, and the sampling time of each captured image.
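The acquisition-and-identification steps above (S34, S35, and the storing step) can be sketched as follows; the frame layout and field names are assumptions chosen for illustration, not the device's actual data format:

```python
def collect_high_brightness_info(frames, threshold):
    """Scan captured frames and record the high-brightness information.

    For each pixel at or above `threshold`, records its position
    together with the frame's Z position and rotation angle, which is
    what the storage step keeps for later image-information generation.

    `frames` is assumed to be an iterable of dicts with keys 'z',
    'theta', and 'image' (a 2-D list of brightness values); a real
    device would stream these from the camera instead.
    """
    records = []
    for frame in frames:
        for row, line in enumerate(frame["image"]):
            for col, brightness in enumerate(line):
                if brightness >= threshold:
                    records.append({
                        "pixel": (row, col),
                        "z": frame["z"],
                        "theta": frame["theta"],
                        "brightness": brightness,
                    })
    return records

frames = [
    {"z": 0.0, "theta": 0, "image": [[10, 200], [30, 40]]},
    {"z": 0.5, "theta": 90, "image": [[150, 20], [10, 10]]},
]
info = collect_high_brightness_info(frames, threshold=100)
```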
  • (Appendix 1) An image processing device comprising: an acquisition unit that acquires inspection images of an end of an object at a plurality of Z positions from reflected light collected by an objective lens, by rotating the object around a rotation axis on a stage having the rotation axis while moving the Z position, which is the position of the objective lens, in its optical axis direction, that collects the reflected light reflected at the end of the object; an identifying unit that identifies pixel positions where the luminance is equal to or greater than a predetermined value in the plurality of inspection images, each having a different Z position and a different rotation angle of the object; a storage unit that stores each identified pixel position, together with the Z position and the rotation angle of the corresponding inspection image, as high-brightness information; and a generating unit that generates image information of the end of the object based on the plurality of stored pieces of high-brightness information.
  • (Appendix 2) The image processing device according to Appendix 1, wherein the storage unit further stores, as part of the high-brightness information, an optical axis angle, which is the angle of the optical axis of the objective lens with respect to a plane perpendicular to the rotation axis.
  • (Appendix 3) The image processing device according to Appendix 2, wherein the object includes a plate-like object, the storage unit stores the high-brightness information at a plurality of the optical axis angles, and the generation unit generates the image information based on the high-brightness information at the plurality of optical axis angles with respect to the object.
  • (Appendix 4) The image processing device in which the generation unit generates the image information for the rotation angle over the entire circumference of the end of the object based on the high-brightness information.
  • (Appendix 5) The image processing device according to Appendix 1, wherein the acquisition unit acquires first inspection images from a first light receiving unit that receives light of a first wavelength corresponding to a first color in the reflected light, and acquires second inspection images from a second light receiving unit that receives light of a second wavelength corresponding to a second color in the reflected light; the identification unit identifies first-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than the predetermined value, in the plurality of first inspection images each having a different Z position, and identifies second-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than the predetermined value, in the plurality of second inspection images each having a different Z position; the storage unit stores each first-wavelength pixel position, together with the Z position and the rotation angle of the corresponding first inspection image, as first-wavelength high-brightness information, and stores each second-wavelength pixel position, together with the Z position and the rotation angle of the corresponding second inspection image, as second-wavelength high-brightness information; and the generation unit generates a color image of the end of the object, represented in a plurality of colors, based on the first-wavelength high-brightness information and the second-wavelength high-brightness information.
  • (Appendix 6) The image processing device according to Appendix 5, wherein the acquisition unit further acquires third inspection images from a third light receiving unit that receives light of a third wavelength corresponding to a third color in the reflected light; the identification unit further identifies third-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than the predetermined value, in the third inspection images each having a different Z position; the storage unit further stores each third-wavelength pixel position, together with the Z position and the rotation angle of the corresponding third inspection image, as third-wavelength high-brightness information; and the generation unit generates a color image of the end of the object, represented in a plurality of colors, based on the first-wavelength, second-wavelength, and third-wavelength high-brightness information.
  • (Appendix 7) The image processing device according to Appendix 1, wherein the object includes a wafer and the end includes an edge of the wafer.
  • (Appendix 8) The image processing device according to any one of Appendixes 1 to 7, further comprising a control unit that inspects the object based on the image information.
  • (Appendix 9) An image processing method comprising: a first step of rotating an object around a rotation axis on a stage having the rotation axis while moving a Z position, which is the position, in the optical axis direction, of an objective lens that collects reflected light reflected at an end of the object, and causing an acquisition unit to acquire inspection images of the end of the object at a plurality of Z positions from the reflected light collected by the objective lens; a second step of causing an identifying unit to identify pixel positions where the luminance is equal to or greater than a predetermined value in the plurality of inspection images, each having a different Z position and a different rotation angle of the object; a third step of storing each identified pixel position, together with the Z position and the rotation angle of the corresponding inspection image, in a storage unit as high-brightness information; and a fourth step of causing a generation unit to generate image information of the end of the object based on the plurality of stored pieces of high-brightness information.
  • (Appendix 10) The image processing method according to Appendix 9, wherein, in the third step, an optical axis angle, which is the angle of the optical axis of the objective lens with respect to a plane perpendicular to the rotation axis, is further stored as part of the high-brightness information.
  • (Appendix 11) The image processing method according to Appendix 10, wherein the object includes a plate-like object; in the third step, the high-brightness information is stored at a plurality of the optical axis angles; and in the fourth step, the image information is generated based on the high-brightness information at the plurality of optical axis angles with respect to the object.
  • (Appendix 12) The image processing method in which, in the fourth step, the image information is generated for the rotation angle over the entire circumference of the end of the object based on the high-brightness information.

Landscapes

  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Microscopes, Condensers (AREA)
  • Image Input (AREA)
  • Image Processing (AREA)

Abstract

This image processing device (2) according to the present disclosure comprises: a support unit (40) that rotatably supports an object around a rotation axis (C1); an objective lens (21) that condenses reflected light (R1) reflected by the object; drive units (12, 22) that can change a Z position, which is the relative position of the objective lens (21) with respect to the object in the optical axis (C2) direction of the objective lens (21); an acquisition unit (31) that acquires a captured image of the object via the objective lens (21); a control unit (35) that causes the acquisition unit (31) to acquire a plurality of captured images; a specification unit (32) that specifies, in a plurality of captured images having different Z positions and different rotation angles (theta) of the object, pixel positions, each being the position of a pixel having a luminance greater than or equal to a prescribed luminance; a storage unit (33) that stores each specified pixel position (P) as high-luminance information together with the Z position and the rotation angle (theta) of the corresponding captured image; and a generation unit (34) that generates image information of the object on the basis of the high-luminance information.

Description

Image processing device and image processing method

 The present disclosure relates to an image processing device and an image processing method.

 For example, Patent Document 1 proposes a method for measuring the circumferential shape of a wafer edge.

Japanese Patent No. 6644282

 Thus, there is a demand for a method of measuring information about an object, such as the circumferential shape of a wafer edge, quickly and accurately.

 The present disclosure has been made to solve this problem, and aims to provide an image processing device and an image processing method capable of measuring information about an object quickly and accurately.

 The image processing device according to the present disclosure includes: a support unit that supports an object rotatably around a rotation axis; an objective lens that focuses light reflected by the object; a drive unit that can change the Z position, which is the relative position between the object and the focusing position of the objective lens in the optical axis direction of the objective lens; an acquisition unit that acquires captured images of the object via the objective lens; a control unit that controls the support unit and the drive unit and causes the acquisition unit to acquire multiple captured images; an identification unit that identifies pixel positions, which are positions of pixels whose brightness is equal to or greater than a predetermined value, in multiple captured images having different Z positions and different rotation angles of the object; a storage unit that stores the identified pixel positions, together with the Z position and the rotation angle of each captured image, as high-brightness information; and a generation unit that generates image information of the object based on the stored pieces of high-brightness information.

 In the above image processing device, the control unit may cause the acquisition unit to acquire multiple captured images of the end of the object by rotating the object around the rotation axis using the support unit and changing the Z position using the drive unit.

 In the above image processing device, the drive unit can change the radial position, which is the relative position between the object and the focusing position of the objective lens, in a direction perpendicular to the rotation axis, and the control unit may rotate the object around the rotation axis using the support unit while changing the radial position in a first direction perpendicular to the rotation axis using the drive unit and keeping the Z position constant at the first position, causing the acquisition unit to acquire multiple captured images of the main surface of the object.

 In the above image processing device, the control unit may rotate the object around the rotation axis using the support unit while causing the drive unit to change the radial position in a second direction opposite to the first direction and keeping the Z position constant at a second position, causing the acquisition unit to acquire multiple captured images of the main surface of the object; the second position is different from the first position, and the difference between the second position and the first position may be within the focal depth of an optical system including the objective lens.

 The image processing device according to the present disclosure includes: a support unit that supports an object rotatably around a rotation axis; an objective lens that focuses light reflected by the object; a drive unit that can change the Z position, which is the relative position between the object and the focusing position of the objective lens in the optical axis direction of the objective lens; an acquisition unit that acquires captured images of the object via the objective lens; a control unit that controls the support unit and the drive unit and causes the acquisition unit to acquire multiple captured images; an identification unit that identifies pixel positions, which are positions of pixels whose brightness is equal to or greater than a predetermined value, in multiple captured images having different Z positions and different rotation angles of the object; and a generation unit that generates image information of the object based on the pixels whose brightness is equal to or greater than the predetermined value. The generation unit may generate image information of the object at a specific rotation angle based on the pixels whose brightness is equal to or greater than the predetermined value in multiple captured images that share the specific rotation angle but have different Z positions.
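The per-rotation-angle generation described above, which combines captured images taken at different Z positions for one rotation angle, can be sketched as a simple search over the Z stack; the 1-D row layout and the "first Z to reach the threshold" rule are illustrative simplifications, not the device's actual algorithm:

```python
def profile_at_angle(z_stack, threshold):
    """Combine captured images taken at different Z positions for one
    rotation angle.

    For each pixel column, keeps the Z position at which the brightness
    first reaches `threshold` (a simple stand-in for the surface height
    at that pixel). Returns one height per column, with None where no
    Z position was bright enough.

    `z_stack` is a list of (z, row_of_brightness_values) pairs, a
    hypothetical 1-D simplification of the 2-D captured images.
    """
    width = len(z_stack[0][1])
    heights = [None] * width
    for z, row in sorted(z_stack):  # scan Z from low to high
        for col, brightness in enumerate(row):
            if heights[col] is None and brightness >= threshold:
                heights[col] = z
    return heights

stack = [
    (0.0, [20, 180, 10]),
    (1.0, [160, 90, 15]),
    (2.0, [40, 30, 210]),
]
heights = profile_at_angle(stack, threshold=100)
```

Repeating this for each rotation angle yields the image information over multiple specific rotation angles, as in the paragraph that follows.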

 In the above image processing device, the generation unit may generate the image information of the object at a plurality of specific rotation angles based on pixels whose brightness is equal to or greater than the predetermined value in a plurality of captured images in which the rotation angle is a first rotation angle and the Z positions differ from one another, and pixels whose brightness is equal to or greater than the predetermined value in a plurality of captured images in which the rotation angle is a second rotation angle and the Z positions differ from one another.

 In the above image processing device, the drive unit can change the optical axis angle, which is the angle of the optical axis of the objective lens with respect to a plane perpendicular to the rotation axis; the identification unit identifies pixel positions where the brightness is equal to or greater than the predetermined value in the plurality of captured images in which the Z position, the rotation angle, and the optical axis angle differ; and the generation unit may generate the image information of the end of the object at the specific rotation angle based on the pixels whose brightness is equal to or greater than the predetermined value in the plurality of captured images in which the rotation angle is the specific rotation angle, the optical axis angle is a first optical axis angle, and the Z positions differ from one another, and the pixels whose brightness is equal to or greater than the predetermined value in the plurality of captured images in which the rotation angle is the specific rotation angle, the optical axis angle is a second optical axis angle, and the Z positions differ from one another.

 In the above image processing device, the acquisition unit acquires first captured images from a first light receiving unit that receives light of a first wavelength corresponding to a first color in the reflected light, and acquires second captured images from a second light receiving unit that receives light of a second wavelength corresponding to a second color in the reflected light; the identification unit identifies first-wavelength pixel positions, which are pixel positions whose brightness is equal to or greater than the predetermined value, in the first captured images having different Z positions, and identifies second-wavelength pixel positions, which are pixel positions whose brightness is equal to or greater than the predetermined value, in the second captured images having different Z positions; and the generation unit may generate a color image of the object, represented in multiple colors, based on the pixels in the first captured images and the second captured images whose brightness is equal to or greater than the predetermined value.
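A rough sketch of combining per-wavelength captured images into a color image, as described above; the equal-length 1-D channel layout and the masking rule are illustrative assumptions, and real images from the first and second light receiving units would be 2-D:

```python
def color_image_from_channels(red, green, blue, threshold):
    """Combine per-wavelength brightness channels into a color image.

    A pixel is kept in a channel only where its brightness is at or
    above `threshold`; other pixels are zeroed. The masked channels
    are then zipped into (R, G, B) triples.
    """
    def mask(channel):
        return [b if b >= threshold else 0 for b in channel]
    return list(zip(mask(red), mask(green), mask(blue)))

rgb = color_image_from_channels([200, 50], [120, 130], [10, 240], threshold=100)
```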

 The image processing method according to the present disclosure includes the steps of: supporting an object on a support unit so that it can rotate around a rotation axis; focusing light reflected by the object with an objective lens; causing a drive unit to change the Z position, which is the relative position between the object and the focusing position of the objective lens in the optical axis direction of the objective lens; causing an acquisition unit to acquire a captured image of the object via the objective lens; causing a control unit that controls the support unit and the drive unit to have the acquisition unit acquire a plurality of captured images; causing an identification unit to identify pixel positions, which are positions of pixels whose brightness is equal to or greater than a predetermined value, in a plurality of captured images having different Z positions and different rotation angles of the object; storing the identified pixel positions, together with the Z position and the rotation angle of each captured image, in a storage unit as high-brightness information; and causing a generation unit to generate image information of the object based on the stored pieces of high-brightness information.

 In the above image processing method, in the step of having the acquisition unit acquire the captured images, the control unit may cause the acquisition unit to acquire multiple captured images of the end of the object by rotating the object around the rotation axis using the support unit and changing the Z position using the drive unit.

 In the above image processing method, in the step of supporting the object on the support unit, the drive unit moves the object in a direction perpendicular to the rotation axis; and in the step of causing the acquisition unit to acquire the captured images, the control unit may rotate the object around the rotation axis using the support unit while changing the radial position in a first direction perpendicular to the rotation axis using the drive unit and keeping the Z position constant at the first position, causing the acquisition unit to acquire multiple captured images of the main surface of the object.

 In the above image processing method, in the step of causing the acquisition unit to acquire the captured images, the control unit rotates the object around the rotation axis using the support unit while causing the drive unit to change the radial position in a second direction opposite to the first direction and keeping the Z position constant at a second position, causing the acquisition unit to acquire multiple captured images of the main surface of the object; the second position is different from the first position, and the difference between the second position and the first position may be within the focal depth of an optical system including the objective lens.

 The image processing method according to the present disclosure includes the steps of: supporting an object on a support unit so that it can rotate around a rotation axis; focusing light reflected by the object with an objective lens; causing a drive unit to change the Z position, which is the relative position between the object and the focusing position of the objective lens in the optical axis direction of the objective lens; causing an acquisition unit to acquire a captured image of the object via the objective lens; causing a control unit that controls the support unit and the drive unit to have the acquisition unit acquire a plurality of captured images; causing an identification unit to identify pixel positions, which are positions of pixels whose brightness is equal to or greater than a predetermined value, in a plurality of captured images having different Z positions and different rotation angles of the object; and causing a generation unit to generate image information of the object based on the pixels whose brightness is equal to or greater than the predetermined value. In the step of causing the generation unit to generate image information, the generation unit generates image information of the object at a specific rotation angle based on the pixels whose brightness is equal to or greater than the predetermined value in multiple captured images that share the specific rotation angle but have different Z positions.

 In the above image processing method, in the step of causing the generation unit to generate image information, the generation unit may generate the image information of the object at a plurality of specific rotation angles based on pixels whose brightness is equal to or greater than the predetermined value in a plurality of captured images in which the rotation angle is a first rotation angle and the Z positions differ from one another, and pixels whose brightness is equal to or greater than the predetermined value in a plurality of captured images in which the rotation angle is a second rotation angle and the Z positions differ from one another.

 In the above image processing method, in the step of causing the drive unit to change the Z position, the drive unit changes the optical axis angle, which is the angle of the optical axis of the objective lens with respect to a plane perpendicular to the rotation axis; in the step of causing the identification unit to identify the pixel positions, the identification unit identifies pixel positions, which are positions of pixels whose brightness is equal to or greater than a predetermined value, in a plurality of captured images each having a different Z position, rotation angle, and optical axis angle; and in the step of causing the generation unit to generate the image information, the generation unit may generate the image information of the end of the object at the specific rotation angle based on pixels whose brightness is equal to or greater than the predetermined value in a plurality of captured images in which the rotation angle is the specific rotation angle, the optical axis angle is a first optical axis angle, and the Z positions differ from one another, and pixels whose brightness is equal to or greater than the predetermined value in a plurality of captured images in which the rotation angle is the specific rotation angle, the optical axis angle is a second optical axis angle, and the Z positions differ from one another.

 In the above image processing method, in the step of acquiring the captured images, the control unit causes the acquisition unit to acquire first captured images from a first light receiving unit that receives light of a first wavelength corresponding to a first color in the reflected light, and to acquire second captured images from a second light receiving unit that receives light of a second wavelength corresponding to a second color in the reflected light; in the step of causing the identification unit to identify the pixel positions, the identification unit identifies first-wavelength pixel positions, which are pixel positions whose brightness is equal to or greater than the predetermined value, in the plurality of first captured images having different Z positions, and identifies second-wavelength pixel positions, which are pixel positions whose brightness is equal to or greater than the predetermined value, in the plurality of second captured images having different Z positions; and in the step of causing the generation unit to generate the image, the generation unit may generate a color image of the end of the object, represented in a plurality of colors, based on the first-wavelength high-brightness information and the second-wavelength high-brightness information.

 According to the present disclosure, it is possible to provide an image processing device and an image processing method capable of measuring information about an object quickly and accurately.

A configuration diagram illustrating an inspection device according to the first embodiment.
A plan view illustrating a stage in the inspection device according to the first embodiment.
A configuration diagram illustrating an imaging unit in the inspection device according to the first embodiment.
A block diagram illustrating an image processing device according to the first embodiment.
A diagram illustrating an inspection image acquired by the acquisition unit in the image processing device according to the first embodiment, in which the horizontal axis indicates the rotation angle and the vertical axis indicates the pixels capturing the edge of the wafer.
A diagram illustrating a profile of the wafer edge generated by the generation unit in the image processing device according to the first embodiment.
A flowchart illustrating an image processing method using the image processing device according to the first embodiment.
A flowchart illustrating an inspection method using the inspection device according to the first embodiment.
A configuration diagram illustrating an image processing device according to the second embodiment.
A cross-sectional view illustrating an example of an object in the image processing device according to the second embodiment.
A plan view illustrating an example of an object in the image processing device according to the second embodiment.
A diagram illustrating positions at which the imaging unit captures images while the support unit rotates, in the image processing device according to the second embodiment.
A diagram illustrating positions at which the imaging unit captures images while the support unit rotates, in the image processing device according to the second embodiment.
A block diagram illustrating an image processing unit according to the second embodiment.
A diagram illustrating a captured image acquired by the acquisition unit in the image processing unit according to the second embodiment, in which the horizontal axis indicates the rotation angle and the vertical axis indicates the pixels capturing the wafer surface of the wafer.
A diagram explaining the correspondence between the positions of pixels included in each captured image and positions on the wafer surface when the radial position increases at each sampling time, in the image processing unit according to the second embodiment.
A diagram illustrating a profile of the wafer surface generated by the generation unit in the image processing device according to the second embodiment.
A flowchart illustrating an image processing method using the image processing device according to the second embodiment.

The specific configuration of this embodiment will be described below with reference to the drawings. The following description presents preferred embodiments of the present disclosure, and the scope of the present disclosure is not limited to the embodiments below. In the following description, elements denoted by the same reference numerals are substantially identical.

(Embodiment 1)
An image processing device and an image processing method according to the first embodiment will be described. First, under <Inspection device>, an inspection device is described as an example of an optical device. Next, under <Image processing device>, an image processing device provided in the optical device is described. Then, under <Image processing method> and <Inspection method>, an image processing method using the image processing device and an inspection method, as an example of a method of using the optical device, are described.

Note that the image processing device and image processing method, which are examples of the present disclosure, may be used with an inspection device as described in the following embodiments, but are not limited to this. For example, they may be used as a device (a review device) that displays, on a display or the like, an image (a captured image) obtained by illuminating a sample.

<Inspection device>
FIG. 1 is a configuration diagram illustrating an inspection device 1 according to the first embodiment. FIG. 2 is a plan view illustrating a stage 10 in the inspection device 1 according to the first embodiment. FIG. 3 is a configuration diagram illustrating an imaging unit 20 in the inspection device 1 according to the first embodiment.

As shown in FIGS. 1 to 3, the inspection device 1 includes a stage 10, an imaging unit 20, and an image processing device 30. The inspection device 1 inspects an object; for example, it inspects the edge of the object.

The object includes, for example, a wafer WF. In that case, the edge of the object includes the edge WFE of the wafer WF. In the following description, the object is a wafer WF. Note that, as long as it has an edge, the object is not limited to a wafer WF and may include a plate-like object such as a semiconductor chip or a printed circuit board, or an object other than a plate-like object.

The wafer WF is placed on the stage 10. The stage 10 has a stage surface 11, on which the wafer WF is placed flat; the back surface of the wafer WF contacts the stage surface 11. The stage 10 may have a predetermined set position on the stage surface 11 at which the wafer WF is set. For example, when inspecting the wafer WF, the wafer WF may first be fixed at the set position. The wafer WF has a wafer surface WF1. Here, for convenience in explaining the inspection device 1, an αβγ orthogonal coordinate system is introduced: the plane parallel to the stage surface 11 is the αβ plane, and the direction perpendicular to the stage surface 11 is the γ-axis direction.

The stage 10 has, for example, a rotation axis C1. The rotation axis C1 extends, for example, in the γ-axis direction and passes through the wafer WF placed on the stage surface 11. The stage 10 thus rotates the wafer WF around the rotation axis C1. For example, the stage 10 may be connected to a drive unit 12 such as a motor, which rotates the stage 10 around the rotation axis C1. When the wafer WF rotates on the stage 10, the rotation angle from the predetermined set position is denoted θ.

The stage 10 may have a three-axis adjustment mechanism or the like that adjusts the position of the stage 10 and the tilt of the stage surface 11. The stage 10 may include a sensor 13, such as an encoder, that senses the rotation angle θ from the predetermined set position. Furthermore, instead of the drive unit 22 of the objective lens 21 described below, the drive unit 12 may move the position of the objective lens 21 relative to the wafer WF.

The stage 10 is connected to the image processing device 30 via a communication line that is at least one of wireless and wired. Specifically, the stage 10 is connected so that it can transmit information including data on the rotation angle θ to the image processing device 30, and it outputs information such as the sensed rotation angle θ to the image processing device 30.

The imaging unit 20 captures an inspection image of the edge WFE of the wafer WF. The imaging unit 20 includes, for example, an objective lens 21, a drive unit 22, and a light receiving unit 23. The imaging unit 20 may further include a light source 24, a pinhole 25, a beam splitter 26, an optical element 27, an optical element 28, and a sensor 29. Note that, as long as the imaging unit 20 can capture an inspection image of the edge WFE of the wafer WF, any of the above optical members may be replaced with another optical member, and other optical members may be included in addition to the above. The imaging unit 20 captures the inspection image of the edge WFE from reflected light R1 reflected at the edge WFE of the wafer WF.

The objective lens 21 focuses the reflected light R1 reflected at the edge WFE of the wafer WF. The reflected light R1 may be illumination light L1 emitted from the light source 24 and reflected at the edge WFE of the wafer WF. The objective lens 21 has an optical axis C2, and the direction of the optical axis C2 is referred to as the optical axis direction.

The drive unit 22 changes the relative position, in the optical axis direction of the objective lens 21, between the object and the focusing position of the objective lens 21. This relative position is referred to as the Z position. For example, the drive unit 22 moves the position of the objective lens 21 in the optical axis direction. The drive unit 22 may also change this relative position by changing the shape, position, or attitude of an optical element, by changing the focusing distance of the objective lens 21, or the like. For simplicity, the following description uses the example in which the drive unit 22 changes the relative position (Z position) by moving the objective lens 21 in the optical axis direction. The drive unit 22 also changes the angle of the optical axis C2 of the objective lens 21 relative to a plane perpendicular to the rotation axis C1 of the stage 10. This angle is referred to as the optical axis angle Φ.

The light receiving unit 23 receives the reflected light R1 focused by the objective lens 21. The imaging unit 20 thereby captures an inspection image from the reflected light R1 reflected at the edge WFE of the wafer WF. The light receiving unit 23 may include a first light receiving unit 23a and a second light receiving unit 23b, and may further include a third light receiving unit 23c in addition to these.

The first light receiving unit 23a receives light of a first wavelength corresponding to a first color of the reflected light R1. The second light receiving unit 23b receives light of a second wavelength corresponding to a second color of the reflected light R1. The third light receiving unit 23c receives light of a third wavelength corresponding to a third color of the reflected light R1. The first, second, and third colors may include, for example, red, blue, and green. In this case, the imaging unit 20 may include optical elements 27 and 28 having a spectral function, such as dichroic mirrors.

The optical elements 27 and 28 separate the light of the first wavelength, the light of the second wavelength, and the light of the third wavelength in the reflected light R1. Note that, as long as they can perform this separation, the optical elements 27 and 28 are not limited to dichroic mirrors and may be other optical elements such as bandpass filters.

The pinhole 25 is arranged at the exit of the light source 24 from which the illumination light L1 is emitted. A pinhole 25 is also arranged on the light receiving surface of the light receiving unit 23 (the first light receiving unit 23a, the second light receiving unit 23b, and the third light receiving unit 23c). The beam splitter 26 reflects the illumination light L1 emitted from the light source 24 toward the wafer WF, and transmits the reflected light R1 reflected at the edge WFE of the wafer WF toward the light receiving unit 23.

The imaging unit 20 may have a confocal optical system, and can therefore capture an image with the edge WFE of the wafer WF in focus. In this embodiment, however, the imaging unit 20 may move the Z position of the objective lens 21 during imaging without regard to focus. The brightness of the reflected light R1 from an in-focus position on the edge WFE becomes high, which makes it possible to identify the in-focus Z position. Therefore, after imaging, various data such as the in-focus Z position, brightness information, rotation angle θ, optical axis angle Φ, and the sampling time of each image are referenced, and a profile of the edge WFE of the wafer WF can be formed from them. In this way, the imaging unit 20 enables three-dimensional measurement.
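The confocal principle described above — the in-focus Z position is the one where the reflected brightness peaks — can be sketched in a few lines. This is an illustrative sketch only, not the device's actual implementation; the data layout and function name are assumptions.

```python
def focus_z_per_angle(brightness_by_z):
    """For each rotation angle, pick the Z position whose brightness is
    highest -- a stand-in for identifying the in-focus Z position.

    brightness_by_z: dict mapping Z position -> {angle: brightness}.
    Returns a dict mapping angle -> (best_z, peak_brightness).
    """
    best = {}
    for z, row in brightness_by_z.items():
        for angle, b in row.items():
            if angle not in best or b > best[angle][1]:
                best[angle] = (z, b)
    return best

# Example: three Z slices, two angles; angle 0 peaks at Z=2, angle 90 at Z=1.
stack = {
    1: {0: 10, 90: 80},
    2: {0: 95, 90: 40},
    3: {0: 30, 90: 20},
}
print(focus_z_per_angle(stack))  # {0: (2, 95), 90: (1, 80)}
```

The peak-brightness Z found per angle is what the later profile-generation step would consume.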

The imaging unit 20 may include a sensor 29 that senses the Z position moved from a predetermined set position. The sensor 29 may also sense the optical axis angle Φ changed from a predetermined set position.

The imaging unit 20 is connected to the image processing device 30 via a communication line that is at least one of wireless and wired. Specifically, the imaging unit 20 is connected so that it can transmit information including image data, the Z position, the rotation angle θ, the optical axis angle Φ, and the sampling time to the image processing device 30, and it outputs such information to the image processing device 30.

<Image processing device>
Next, the image processing device 30 will be described. FIG. 4 is a block diagram illustrating the image processing device 30 according to the first embodiment. As shown in FIG. 4, the image processing device 30 includes an acquisition unit 31, an identification unit 32, a storage unit 33, a generation unit 34, and a control unit 35, which function as acquisition means, identification means, storage means, generation means, and control means, respectively. The image processing device 30 is an information processing device including a computer, such as a PC, a server, or a smartphone.

The acquisition unit 31 acquires inspection images of the edge WFE of the wafer WF at a plurality of Z positions from the reflected light R1 focused by the objective lens 21, while the stage 10 rotates the wafer WF around the rotation axis C1 and the Z position is moved. Each inspection image includes pixels of the edge WFE associated with the rotation angle θ at that Z position.

FIG. 5 is a diagram illustrating inspection images acquired by the acquisition unit 31 in the image processing device 30 according to the first embodiment; the horizontal axis indicates the rotation angle θ, and the vertical axis indicates the pixels that imaged the edge WFE of the wafer WF. As shown in FIG. 5, the acquisition unit 31 acquires a plurality of inspection images, each at a different Z position. Each inspection image associates the rotation angle θ of the edge WFE of the wafer WF with the pixels that imaged the edge WFE at that rotation angle θ. The rotation angle θ in an inspection image may cover the range of 0° to 360°, corresponding to the entire circumference of the edge WFE of the wafer WF, or a predetermined partial range.
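As a toy illustration of the data the acquisition unit handles, an inspection image can be modeled as a mapping from rotation angle θ to the pixel row that imaged the edge, and a scan as a set of such images keyed by Z position. The names, the sinusoidal wobble, and the Z-independence of the image content are simplifying assumptions of this sketch, not taken from the original.

```python
import math

def make_inspection_image(edge_height_px, n_angles=8):
    """Hypothetical inspection image: maps each rotation angle theta
    (degrees) to the pixel row that imaged the wafer edge.  The row is
    modeled as a fixed row plus a small sinusoidal wobble with angle."""
    image = {}
    for i in range(n_angles):
        theta = i * 360.0 / n_angles
        image[theta] = edge_height_px + round(2 * math.sin(math.radians(theta)))
    return image

# One image per Z position; making the image content independent of Z
# is purely a simplification for this sketch.
stack = {z: make_inspection_image(edge_height_px=50) for z in (0, 5, 10)}
print(sorted(stack))   # [0, 5, 10]
print(stack[0][90.0])  # 52  (pixel row of the edge at theta = 90 degrees)
```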

FIG. 5 shows the case where the optical axis angle Φ is Φ = Φ1. The acquisition unit 31 may also acquire inspection images with the optical axis angle Φ changed. For example, the acquisition unit 31 may acquire an inspection image at Φ = +90°, as when the edge WFE of the wafer WF is imaged from the +γ-axis direction, and an inspection image at Φ = -90°, as when the edge WFE is imaged from the -γ-axis direction. Thus, the optical axis angle Φ of the inspection images may cover the range of -90° to +90°.

The acquisition unit 31 may acquire an inspection image from the first light receiving unit 23a, which receives the light of the first wavelength. An inspection image acquired from the first light receiving unit 23a is referred to as a first inspection image. In this case, therefore, the acquisition unit 31 acquires the first inspection image.

For example, suppose that the acquisition unit 31 acquires a first inspection image as shown in FIG. 5. In this case, the acquisition unit 31 may also acquire a second inspection image and a third inspection image similar to FIG. 5. Here, an inspection image acquired from the second light receiving unit 23b, which receives the light of the second wavelength in the reflected light R1, is referred to as a second inspection image, and an inspection image acquired from the third light receiving unit 23c, which receives the light of the third wavelength in the reflected light R1, is referred to as a third inspection image. The first, second, and third inspection images may each include a plurality of inspection images at different Z positions.

As shown in FIG. 5, the identification unit 32 identifies the positions of pixels in the inspection images whose brightness is equal to or greater than a predetermined value. Such a pixel position is referred to as a pixel position P. In FIG. 5, some reference numerals are omitted to keep the figure uncluttered. As described above, the inspection images include a plurality of inspection images at different Z positions; for example, FIG. 5 shows the cases Z = Z1, Z2, and Z3. The identification unit 32 may identify pixel positions P whose brightness is equal to or greater than the predetermined value in the inspection images at Z = Z1, Z2, and Z3, and may also identify such pixel positions P in a plurality of inspection images with different optical axis angles Φ. The above is one example of a process for identifying pixel positions P whose brightness is equal to or greater than a predetermined value in a plurality of inspection images that differ in Z position and in the rotation angle θ of the object. The identification unit 32 may identify such pixel positions P based on a plurality of inspection images, one for each of a plurality of rotation angles θ, where each inspection image associates Z positions with the pixels that imaged the edge at those Z positions. The identification unit 32 may also identify pixel positions P at which the goodness of an evaluation parameter of the pixel is equal to or greater than a predetermined level; a brightness equal to or greater than a predetermined value is one example of this. The evaluation parameter of a pixel may include, in addition to brightness, a parameter obtained by normalizing brightness, or the like. Identifying pixel positions whose brightness or evaluation-parameter goodness is equal to or greater than a predetermined level may also include identifying pixel positions whose brightness or evaluation-parameter goodness is below the predetermined level and excluding those pixels from processing.
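The identification step above — keeping the pixel positions whose brightness meets a threshold and excluding the rest from processing — can be sketched as follows. The data layout and threshold value are illustrative assumptions.

```python
def high_brightness_positions(image, threshold):
    """Return the (theta, pixel_row) positions whose brightness is at or
    above the threshold; positions below it are excluded from further
    processing.  image: dict mapping (theta, pixel_row) -> brightness."""
    return {pos for pos, b in image.items() if b >= threshold}

img = {(0, 50): 200, (0, 51): 40, (90, 52): 180, (180, 50): 90}
print(sorted(high_brightness_positions(img, threshold=100)))
# [(0, 50), (90, 52)] -- only the bright (in-focus) pixels survive
```

A normalized-brightness variant would differ only in the value compared against the threshold.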

Whether the brightness or the evaluation parameter of a pixel is sufficiently good may be determined using, as a threshold, the brightness or evaluation parameter obtained when an image is formed with the object focused to a predetermined degree. Here, being focused to a predetermined degree may mean just focus, or may include a state with an amount of defocus that is acceptable by design.

The identification unit 32 may identify first-wavelength pixel positions P1 in the plurality of first inspection images, acquired by the acquisition unit 31, at different Z positions. The identification unit 32 may likewise identify second-wavelength pixel positions in the plurality of second inspection images at different Z positions, and third-wavelength pixel positions in the plurality of third inspection images at different Z positions. Here, a pixel position whose brightness is equal to or greater than the predetermined value in a first inspection image is referred to as a first-wavelength pixel position P1; in a second inspection image, as a second-wavelength pixel position P2 (not shown); and in a third inspection image, as a third-wavelength pixel position P3 (not shown).

The storage unit 33 stores the identified pixel positions P, together with the Z position, rotation angle θ, and sampling time of each inspection image, as high-brightness information. The storage unit 33 may also store the optical axis angle Φ together with the pixel positions P as part of the high-brightness information, and may store high-brightness information for a plurality of optical axis angles Φ.
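One natural representation of a stored high-brightness entry is a small record holding the pixel position P together with the Z position, rotation angle, optical axis angle, and sampling time. The field names below are illustrative, not taken from the original.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HighBrightnessRecord:
    """One high-brightness entry: an identified pixel position P plus the
    Z position, rotation angle theta, optical axis angle phi, and sampling
    time t at which it was observed (field names are hypothetical)."""
    pixel_row: int
    z: float
    theta: float
    phi: float
    t: float

store = []
store.append(HighBrightnessRecord(pixel_row=50, z=2.0, theta=0.0, phi=30.0, t=0.001))
store.append(HighBrightnessRecord(pixel_row=52, z=1.5, theta=90.0, phi=30.0, t=0.002))
print(len(store))      # 2
print(store[0].theta)  # 0.0
```

Keeping the records immutable (`frozen=True`) makes them safe to share between the identification and generation steps.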

The storage unit 33 may store the first-wavelength pixel positions P1, together with the Z position, rotation angle θ, and sampling time, as first-wavelength high-brightness information for each first inspection image. Likewise, the storage unit 33 may store the second-wavelength pixel positions P2 as second-wavelength high-brightness information for each second inspection image, and the third-wavelength pixel positions P3 as third-wavelength high-brightness information for each third inspection image, each together with the Z position, rotation angle θ, and sampling time.

The generation unit 34 generates image information of the edge WFE of the wafer WF based on the plurality of stored pieces of high-brightness information. The image information may include, for example, a profile of the edge WFE. Generating image information based on the high-brightness information may include the generation unit 34 outputting brightness level information based on the brightness at a pixel position P indicated by the high-brightness information, or height information indicating the physical height based on the Z position at that time. It may also include the generation unit 34 obtaining statistics such as the mean or standard deviation of the brightness at a plurality of pixel positions P indicated by a plurality of pieces of high-brightness information, or obtaining values interpolated from the brightness at a plurality of pixel positions P, and outputting these as brightness level information or as height information converted into physical height.
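The statistics mentioned above (mean and standard deviation of the brightness at several identified pixel positions P) can be sketched as follows; the function name and data layout are assumptions.

```python
import math

def brightness_statistics(brightness_at_positions):
    """Population mean and standard deviation of the brightness values
    recorded at several identified pixel positions P -- a simple stand-in
    for the statistics the generation unit may compute."""
    values = list(brightness_at_positions.values())
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return mean, math.sqrt(var)

samples = {(0, 50): 200.0, (90, 52): 180.0, (180, 50): 220.0}
mean, std = brightness_statistics(samples)
print(mean)  # 200.0
print(round(std, 2))
```

Interpolation between neighboring pixel positions would follow the same pattern, operating on the same mapping.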

FIG. 6 is a diagram illustrating a profile of the edge WFE of the wafer WF generated by the generation unit 34 in the image processing device 30 according to the first embodiment. FIG. 6 shows the case where the rotation angle θ is θ = θ1 and the optical axis angle Φ takes a plurality of values Φ1 to Φ5. Note that an optical axis angle Φ range of -90° to +90°, and dividing that range into Φ1 to Φ5, are both merely examples. As shown in FIG. 6, the generation unit 34 may generate, as image information, a profile of the cross section of the edge WFE. As described above, a pixel whose brightness (the goodness of its evaluation parameter) is equal to or greater than the predetermined value may be regarded as appropriately focused on the edge WFE at that Z position, for example at just focus. The generation unit 34 can then generate the cross-sectional profile of the edge WFE based on the corresponding Z positions.

The generation unit 34 may generate, based on the high-brightness information, image information for the rotation angle θ over the entire circumference of the edge WFE of the wafer WF. For example, the generation unit 34 may generate cross-sectional profiles for the rotation angle θ over the entire circumference of the edge WFE of the wafer WF.

The generation unit 34 may also generate image information based on high-brightness information at a plurality of optical axis angles Φ relative to the wafer WF. For example, the generation unit 34 may generate a cross-sectional profile of the edge WFE based on high-brightness information at a plurality of optical axis angles Φ. By generating a profile over the optical axis angle range Φ = -90° to +90° relative to the plane perpendicular to the rotation axis C1, the generation unit 34 can generate a profile of the edge WFE from the back surface to the front surface of the wafer WF. Furthermore, by generating such cross-sectional profiles for the rotation angle θ over the entire circumference of the edge WFE, the generation unit 34 can generate cross-sectional profiles of the edge WFE from the front surface to the back surface of the wafer WF over its entire circumference.
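One way to visualize how in-focus Z positions at several optical axis angles Φ become cross-section points is the polar conversion below. It assumes, purely for illustration, that the objective approaches the edge radially so the focused surface lies at a distance r0 - Z along direction Φ; r0 and the function are hypothetical.

```python
import math

def edge_profile_points(focus_by_phi, r0):
    """Convert in-focus Z positions at several optical-axis angles phi
    (degrees, -90..+90 covering back surface to front surface) into
    cross-section (x, y) points, under the simplifying assumption that
    the focused surface lies at distance r0 - z along direction phi."""
    points = []
    for phi_deg, z in sorted(focus_by_phi.items()):
        r = r0 - z
        a = math.radians(phi_deg)
        points.append((round(r * math.cos(a), 3), round(r * math.sin(a), 3)))
    return points

profile = edge_profile_points({-90: 0.0, 0: 1.0, 90: 0.0}, r0=10.0)
print(profile)  # [(0.0, -10.0), (9.0, 0.0), (0.0, 10.0)]
```

Sweeping the rotation angle θ and repeating this conversion would build the full-circumference profile described above.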

The generation unit 34 may generate a color image of the edge WFE of the wafer WF, expressed in a plurality of colors, based on the first-wavelength high-brightness information and the second-wavelength high-brightness information, or based on the first-, second-, and third-wavelength high-brightness information. The generation unit 34 may also generate a reference image of the edge WFE of the wafer WF based on the image information. Here, a reference image is an image of the edge WFE of an ideal wafer WF (a wafer WF that can be regarded as defect-free), used to inspect the edge WFE of a wafer WF for defects and the like.
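Combining the per-wavelength high-brightness information into a color image amounts to merging the three channels position by position. The sketch below is illustrative; the data layout is an assumption.

```python
def compose_color_image(red, green, blue):
    """Merge per-wavelength brightness maps into one RGB image.
    Each input maps (theta, pixel_row) -> brightness; a position missing
    from a channel defaults to 0.  Purely illustrative of combining the
    first-, second-, and third-wavelength information."""
    keys = set(red) | set(green) | set(blue)
    return {k: (red.get(k, 0), green.get(k, 0), blue.get(k, 0)) for k in keys}

rgb = compose_color_image({(0, 50): 200}, {(0, 50): 120}, {(0, 51): 90})
print(rgb[(0, 50)])  # (200, 120, 0)
print(rgb[(0, 51)])  # (0, 0, 90)
```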

The control unit 35 controls the operation of the acquisition unit 31, the identification unit 32, the storage unit 33, and the generation unit 34 in the image processing device 30. The control unit 35 may also control the operation of the drive unit 12 of the stage 10, the drive unit 22 of the imaging unit 20, the light receiving unit 23, and the light source 24 in the inspection device 1. The control unit 35 may inspect a wafer WF for defects and the like based on the image information; for example, it may inspect a target wafer WF by comparing the image information of an ideal wafer WF with that of the target wafer WF. Instead of the image information itself, the profile of the edge WFE, the color image, or the image of the edge WFE generated from the image information may be used for the inspection.
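The comparison-based inspection mentioned above can be sketched as flagging positions where the measured profile deviates from an ideal (defect-free) reference beyond a tolerance. The function, data layout, and tolerance value are illustrative assumptions.

```python
def find_defects(reference, measured, tolerance):
    """Flag rotation angles where the measured edge profile deviates
    from an ideal (defect-free) reference by more than a tolerance.
    Inputs map rotation angle theta -> height."""
    return [theta for theta in reference
            if abs(measured.get(theta, 0.0) - reference[theta]) > tolerance]

ref = {0: 1.0, 90: 1.0, 180: 1.0, 270: 1.0}
meas = {0: 1.02, 90: 0.70, 180: 1.01, 270: 1.0}
print(find_defects(ref, meas, tolerance=0.1))  # [90]
```

The same comparison could equally be applied per wavelength channel when a color reference image is available.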

 <画像処理方法>
 次に、本実施形態に係る画像処理装置30を用いた画像処理方法を説明する。図7は、実施形態1に係る画像処理装置30を用いた画像処理方法を例示したフローチャート図である。
<Image processing method>
Next, an image processing method using the image processing device 30 according to this embodiment will be described. Fig. 7 is a flowchart illustrating an image processing method using the image processing device 30 according to the first embodiment.

 まず、図7のステップS11に示すように、検査画像を取得部31に取得させる。具体的には、制御部35は、ステージ10でウェーハWFを回転軸C1の周りで回転させながら、Z位置を移動させることにより、対物レンズ21で集光された反射光R1から複数のZ位置におけるウェーハWFのエッジWFEの検査画像を取得部31に取得させる。ステップS11において、制御部35は、反射光R1における第1波長の光を受光する第1受光部23aから第1検査画像を取得部31に取得させ、第2波長の光を受光する第2受光部23bから第2検査画像を取得部31に取得させ、第3波長の光を受光する第3受光部23cから第3検査画像を取得部31に取得させてもよい。 First, as shown in step S11 of FIG. 7, the control unit 35 causes the acquisition unit 31 to acquire inspection images. Specifically, the control unit 35 causes the stage 10 to rotate the wafer WF around the rotation axis C1 while moving the Z position, thereby causing the acquisition unit 31 to acquire inspection images of the edge WFE of the wafer WF at multiple Z positions from the reflected light R1 collected by the objective lens 21. In step S11, the control unit 35 may cause the acquisition unit 31 to acquire a first inspection image from the first light receiving unit 23a that receives light of a first wavelength in the reflected light R1, to acquire a second inspection image from the second light receiving unit 23b that receives light of a second wavelength, and to acquire a third inspection image from the third light receiving unit 23c that receives light of a third wavelength.

 次に、ステップS12に示すように、画素位置Pを特定部32に特定させる。具体的には、制御部35は、Z位置がそれぞれ異なる複数の検査画像において、輝度が所定以上となる画素位置Pを特定部32に特定させる。ステップS12において、制御部35は、複数の第1検査画像において第1波長画素位置P1を特定部32に特定させ、複数の第2検査画像において第2波長画素位置P2を特定部32に特定させ、複数の第3検査画像において第3波長画素位置P3を特定部32に特定させてもよい。制御部35は、Z位置がそれぞれ異なり、且つ、対象物の回転角度θがそれぞれ異なる複数の検査画像において、輝度が所定以上となる画素の位置である画素位置を特定部に特定させてもよい。 Next, as shown in step S12, the identification unit 32 is caused to identify pixel position P. Specifically, the control unit 35 causes the identification unit 32 to identify pixel position P where the brightness is equal to or greater than a predetermined value in a plurality of inspection images each having a different Z position. In step S12, the control unit 35 may cause the identification unit 32 to identify first-wavelength pixel position P1 in a plurality of first inspection images, to identify second-wavelength pixel position P2 in a plurality of second inspection images, and to identify third-wavelength pixel position P3 in a plurality of third inspection images. The control unit 35 may cause the identification unit to identify pixel positions where the brightness is equal to or greater than a predetermined value in a plurality of inspection images each having a different Z position and a different rotation angle θ of the object.
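The identification in step S12 (finding pixel positions P whose brightness meets a predetermined value in each Z-position image) can be sketched as follows. The brightness values and the threshold of 50 are illustrative assumptions, not values fixed by this disclosure.

```python
import numpy as np

# Focal stack for a small line of pixels: rows are Z positions,
# columns are pixel positions (illustrative brightness values).
stack = np.array([
    [10, 12, 11,  9],   # Z index 0
    [15, 80, 14, 10],   # Z index 1: pixel 1 reaches focus here
    [11, 30, 90, 12],   # Z index 2: pixel 2 reaches focus here
], dtype=np.float64)

THRESHOLD = 50.0  # the "predetermined value" (an assumed figure)

# Pixel positions P whose brightness is at or above the threshold,
# recorded per Z index, as in step S12.
high_pixels = {z: np.nonzero(row >= THRESHOLD)[0].tolist()
               for z, row in enumerate(stack)}
```

In a confocal system the in-focus position yields the brightest response, so each pixel crosses the threshold at the Z index where it is in focus.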

 次に、ステップS13に示すように、高輝度時情報を記憶部33に記憶させる。具体的には、制御部35は、特定した画素位置Pを、各検査画像におけるZ位置、回転角度θ及びサンプリング時刻とともに、高輝度時情報として記憶部33に記憶させる。ステップS13において、制御部35は、高輝度時情報として、光軸角度Φを画素位置Pとともに記憶部33に記憶させてもよい。また、制御部35は、複数の光軸角度Φにおける高輝度時情報を記憶部33に記憶させてもよい。 Next, as shown in step S13, the high-brightness information is stored in the memory unit 33. Specifically, the control unit 35 stores the identified pixel position P in the memory unit 33 as high-brightness information, along with the Z position, rotation angle θ, and sampling time in each inspection image. In step S13, the control unit 35 may store the optical axis angle Φ together with the pixel position P in the memory unit 33 as high-brightness information. The control unit 35 may also store high-brightness information for multiple optical axis angles Φ in the memory unit 33.

 さらに、制御部35は、第1波長画素位置P1を、各第1検査画像における第1波長高輝度時情報として、Z位置、回転角度θ及びサンプリング時刻とともに記憶部33に記憶させ、第2波長画素位置P2を、各第2検査画像における第2波長高輝度時情報として、Z位置、回転角度θ及びサンプリング時刻とともに記憶部33に記憶させ、第3波長画素位置P3を、各第3検査画像における第3波長高輝度時情報として、Z位置、回転角度θ及びサンプリング時刻とともに記憶部33に記憶させてもよい。 Furthermore, the control unit 35 may store the first-wavelength pixel position P1 in the memory unit 33 together with the Z position, rotation angle θ, and sampling time as information when the first wavelength is high in each first inspection image; store the second-wavelength pixel position P2 in the memory unit 33 together with the Z position, rotation angle θ, and sampling time as information when the second wavelength is high in each second inspection image; and store the third-wavelength pixel position P3 in the memory unit 33 together with the Z position, rotation angle θ, and sampling time as information when the third wavelength is high in each third inspection image.
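One way to picture the stored "high-brightness information" (pixel position P together with the Z position, rotation angle θ, sampling time, and optionally the optical axis angle Φ, per wavelength) is a simple record type. The field names, units, and sample values below are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HighBrightnessRecord:
    # One high-brightness entry; field names are hypothetical.
    wavelength: str        # e.g. "first", "second", "third"
    pixel_position: tuple  # (row, col) of pixel position P
    z_position: float      # Z position (relative focus position)
    rotation_angle: float  # rotation angle theta [deg]
    sampling_time: float   # sampling time t [s]
    optical_axis_angle: float = 90.0  # optional phi [deg]

records = [
    HighBrightnessRecord("first",  (12, 34), 5.0, 10.0, 0.001),
    HighBrightnessRecord("second", (12, 35), 5.2, 10.0, 0.001),
]
```

Storing the raw records, rather than only a finished profile, is what later enables the interpolation of singular points described further below.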

 次に、ステップS14に示すように、イメージ情報を生成部34に生成させる。具体的には、制御部35は、記憶した複数の高輝度時情報に基づいて、ウェーハWFのエッジWFEに関するイメージ情報を生成部34に生成させる。ステップS14において、制御部35は、ウェーハWFに対する複数の光軸角度Φにおける高輝度時情報に基づいて、イメージ情報を生成部34に生成させてもよい。また、制御部35は、高輝度時情報に基づいて、ウェーハWFのエッジWFEにおける全周の回転角度θについてのイメージ情報を生成部34に生成させてもよい。さらに、制御部35は、ステップS14において、第1波長高輝度時情報、第2波長高輝度時情報及び第3波長高輝度時情報に基づいて、複数の色で表されたウェーハWFのエッジWFEのカラーイメージを生成部34に生成させてもよい。 Next, as shown in step S14, the generation unit 34 is caused to generate image information. Specifically, the control unit 35 causes the generation unit 34 to generate image information regarding the edge WFE of the wafer WF based on the stored multiple pieces of high-brightness information. In step S14, the control unit 35 may cause the generation unit 34 to generate image information based on the high-brightness information at multiple optical axis angles Φ relative to the wafer WF. The control unit 35 may also cause the generation unit 34 to generate image information for all rotation angles θ around the entire circumference of the edge WFE of the wafer WF based on the high-brightness information. Furthermore, in step S14, the control unit 35 may cause the generation unit 34 to generate a color image of the edge WFE of the wafer WF expressed in multiple colors based on the first-wavelength high-brightness information, second-wavelength high-brightness information, and third-wavelength high-brightness information.
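The color-image generation in step S14 can be sketched as below: because the in-focus sample in a confocal focal stack is the brightest one, taking the per-pixel maximum over Z yields an all-in-focus image per wavelength, and the three channels are then combined. The stack values, the 0.5/0.25 channel scaling, and the function name are illustrative assumptions.

```python
import numpy as np

def all_in_focus(stack):
    # Per-pixel maximum over the Z axis: in a confocal system the
    # brightest sample in the focal stack is the in-focus one.
    return stack.max(axis=0)

# Tiny 2x2 focal stacks (3 Z positions) for three wavelengths (toy data).
stack_r = np.array([[[10, 20], [30,  5]],
                    [[90, 25], [35, 60]],
                    [[15, 70], [80, 10]]], dtype=np.float64)
stack_g = stack_r * 0.5
stack_b = stack_r * 0.25

# Each channel may reach focus at its own Z position; combining the
# per-channel all-in-focus images avoids chromatic-aberration artifacts.
color = np.stack([all_in_focus(stack_r),
                  all_in_focus(stack_g),
                  all_in_focus(stack_b)], axis=-1)
```

The key design point, as the text explains later, is that each wavelength is allowed to focus at a different Z and the channels are merged only afterward.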

 <検査方法>
 次に、本実施形態に係る検査装置1を用いた検査方法を説明する。図8は、実施形態1に係る検査装置1を用いた検査方法を例示したフローチャート図である。
<Testing method>
Next, a description will be given of an inspection method using the inspection device 1 according to this embodiment. Fig. 8 is a flowchart illustrating an inspection method using the inspection device 1 according to the first embodiment.

 図8のステップS21に示すように、ステージ10でウェーハWFを回転させる。具体的には、例えば、制御部35は、ステージ10の駆動部12を駆動させて、ステージ10上のウェーハWFを回転軸C1の周りで回転させる。 As shown in step S21 of FIG. 8, the wafer WF is rotated on the stage 10. Specifically, for example, the control unit 35 drives the drive unit 12 of the stage 10 to rotate the wafer WF on the stage 10 around the rotation axis C1.

 次に、ステップS22に示すように、ウェーハWFのエッジWFEで反射した反射光R1を対物レンズ21で集光する。具体的には、例えば、制御部35は、駆動部22を駆動させて、対物レンズ21が反射光R1を集光するように、対物レンズ21の位置を制御させる。 Next, as shown in step S22, the reflected light R1 reflected by the edge WFE of the wafer WF is focused by the objective lens 21. Specifically, for example, the control unit 35 drives the drive unit 22 to control the position of the objective lens 21 so that the objective lens 21 focuses the reflected light R1.

 次に、ステップS23に示すように、Z位置を駆動部22で移動させる。具体的には、例えば、制御部35は、駆動部22を駆動させて、光軸方向にZ位置を移動させる。 Next, as shown in step S23, the Z position is moved by the drive unit 22. Specifically, for example, the control unit 35 drives the drive unit 22 to move the Z position in the optical axis direction.

 次に、ステップS24に示すように、検査画像を撮像部20に撮像させる。例えば、制御部35は、受光部23に反射光を受光させて検査画像を撮像させる。 Next, as shown in step S24, the imaging unit 20 captures an inspection image. For example, the control unit 35 causes the light receiving unit 23 to receive reflected light and capture the inspection image.

 次に、ステップS25に示すように、検査画像を画像処理装置30に画像処理させる。画像処理装置30を用いた画像処理方法は、前述したとおりである。なお、イメージ情報に基づいて、対象物の検査を行うステップを備えてもよい。 Next, as shown in step S25, the inspection image is subjected to image processing by the image processing device 30. The image processing method using the image processing device 30 is as described above. Note that a step of inspecting the object based on the image information may also be included.

 次に、本実施形態の効果を説明する。本実施形態の画像処理装置30は、ウェーハWFを回転軸C1の周りで回転させながら、ウェーハWFのエッジWFEからの反射光R1を集光する対物レンズ21のZ位置を変更させてエッジWFEの検査画像を取得する。そして、画像処理装置30は、検査画像から輝度が所定以上となる画素位置Pを特定し、当該画素位置Pに基づいて、エッジWFEのイメージ情報を生成する。これにより、画像処理装置30は、複数の回転角度θごとのエッジWFEのプロファイルを得ることができる。具体的には、ウェーハWFの全周囲のエッジWFEのプロファイル情報を得ることができる。よって、画像処理装置30は、ウェーハWF等の対象物の情報を、高速かつ精度よく計測することができる。 Next, the effects of this embodiment will be described. The image processing device 30 of this embodiment acquires an inspection image of the edge WFE by changing the Z position of the objective lens 21, which collects reflected light R1 from the edge WFE of the wafer WF, while rotating the wafer WF around the rotation axis C1. The image processing device 30 then identifies pixel positions P from the inspection image where the brightness is above a predetermined level, and generates image information of the edge WFE based on these pixel positions P. This allows the image processing device 30 to obtain profiles of the edge WFE for multiple rotation angles θ. Specifically, it is possible to obtain profile information of the edge WFE around the entire periphery of the wafer WF. Therefore, the image processing device 30 can measure information about objects such as the wafer WF quickly and accurately.

 ステージ10を回転させながら、共焦点光学系によってフォーカスが合った位置から、ウェーハWFのエッジWFEの位置を測定することにより、エッジWFEのプロファイルを形成する技術がある。しかしながら、この方法では、任意の回転角度θにおけるエッジWFEの位置の測定が終了した後に、当該回転角度θにおけるデータは、その後の回転角度θにおけるエッジWFEの位置の測定に利用されない。 There is a technique for forming a profile of the edge WFE by measuring the position of the edge WFE of the wafer WF from the position focused by a confocal optical system while rotating the stage 10. However, with this method, after measurement of the position of the edge WFE at a given rotation angle θ is completed, the data at that rotation angle θ is not used to measure the position of the edge WFE at subsequent rotation angles θ.

 これに対して、本実施形態の画像処理装置30は、取得部31が取得した複数の検査画像のデータを記憶部33に記憶させている。よって、エッジWFEのプロファイルに、測定誤差等による特異点があった場合に、記憶部33に記憶された特異点の周囲のデータに基づいて、特異点に対して補間等の補正を行うことができ、ウェーハWFのエッジWFEのプロファイルの精度を向上させることができる。 In contrast, the image processing device 30 of this embodiment stores data on multiple inspection images acquired by the acquisition unit 31 in the memory unit 33. Therefore, if there is a singularity in the profile of the edge WFE due to a measurement error or the like, corrections such as interpolation can be made to the singularity based on the data surrounding the singularity stored in the memory unit 33, thereby improving the accuracy of the profile of the edge WFE of the wafer WF.
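The correction of singular points from surrounding stored data can be sketched as follows. The outlier test (a jump away from both neighbours) and the tolerance value are illustrative assumptions; the disclosure only requires that the singular point be corrected, e.g. by interpolation, from the surrounding data.

```python
import numpy as np

def repair_singular_points(profile, max_jump=5.0):
    # Replace isolated outliers ("singular points") in an edge profile by
    # the mean of their neighbours; max_jump is an assumed tolerance.
    p = np.asarray(profile, dtype=np.float64)
    out = p.copy()
    for i in range(1, len(p) - 1):
        # A point is singular if it jumps away from BOTH neighbours.
        if abs(p[i] - p[i - 1]) > max_jump and abs(p[i] - p[i + 1]) > max_jump:
            out[i] = 0.5 * (p[i - 1] + p[i + 1])  # interpolate from stored data
    return out

profile = [10.0, 10.1, 42.0, 10.3, 10.4]   # 42.0 mimics a measurement error
repaired = repair_singular_points(profile)
```

Testing against both neighbours (rather than scanning and repairing in place) keeps valid points next to an outlier from being falsely corrected.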

 近年の半導体装置は、3次元に積層され、集積度が向上されている。これに伴い、ウェーハWFの表面形状の凹凸が大きくなる場合がある。このような半導体装置が形成されたウェーハWFを共焦点光学系で測定した場合には、焦点距離(Depth of Field)の範囲のみフォーカスが合う。よって、半導体装置におけるフォーカスが合っている部分が限定的となる場合がある。また、この場合には、オートフォーカスの機能を十分に発揮できない場合がある。焦点距離の範囲を広くする方法も考えられるが、その場合には、解像度が低減する恐れがある。 In recent years, semiconductor devices have been stacked three-dimensionally, increasing the degree of integration. As a result, the unevenness of the surface shape of the wafer WF can become greater. When a wafer WF on which such semiconductor devices are formed is measured using a confocal optical system, only the range within the depth of field is in focus. As a result, the portion of the semiconductor device that is in focus can be limited. In this case, the autofocus function may not be fully utilized. One option is to widen the depth of field, but this could result in a reduction in resolution.

 これに対して、本実施形態の画像処理装置30は、対物レンズ21の焦点距離に関係なく、対物レンズ21を光軸方向にスキャンする。よって、対象物にフォーカスが合った画像を取得することができる。これにより、オートフォーカス機能を不要とすることができる。 In contrast, the image processing device 30 of this embodiment scans the objective lens 21 in the optical axis direction regardless of the focal length of the objective lens 21. This makes it possible to obtain an image in which the object is in focus. This makes an autofocus function unnecessary.

 カラー画像を撮像する関連する装置では、複数の色に対応した受光部を用いて、カラー画像を取得する。その際に、各受光部のフォーカスが同時に合う必要がある。各受光部のフォーカスが同時に合わないと、カラー画像を生成した場合に、色収差によって、正しい色にならない。 In related devices that capture color images, light receiving units corresponding to multiple colors are used to obtain the color image. In this case, each light receiving unit must be in focus at the same time. If the light receiving units are not all in focus at the same time, chromatic aberration will cause the colors to be inaccurate when a color image is generated.

 これに対して、本実施形態の画像処理装置30は、対物レンズ21の光軸方向へのスキャンをする中で、各受光部23(第1受光部23a、第2受光部23b及び第3受光部23c)において、それぞれフォーカスが合った第1波長高輝度時情報、第2波長高輝度時情報及び第3波長高輝度時情報に基づいてカラーイメージを生成する。つまり、第1検査画像、第2検査画像及び第3検査画像を合成してカラーイメージ画像を生成する。これにより、高速かつ高精度にカラーイメージを生成することができる。 In contrast, the image processing device 30 of this embodiment generates a color image based on the focused first wavelength high brightness information, second wavelength high brightness information, and third wavelength high brightness information at each light receiving unit 23 (first light receiving unit 23a, second light receiving unit 23b, and third light receiving unit 23c) while scanning the objective lens 21 in the optical axis direction. In other words, the first inspection image, second inspection image, and third inspection image are combined to generate a color image. This makes it possible to generate a color image quickly and with high accuracy.

 一般的に、半導体の製造プロセスにおいて、歩留まりを低下させる要因を迅速に解析できることは、重要な要素と言える。例えば、ウェーハWFにおける歩留まり低下要因を探す際に、キラーとなる欠陥の特定、ウェーハそのもののプロファイル形状の計測といった様々な情報が必要であり、これらを検査・計測しようとすると、複数の装置を用いることになり、時間とコストを多く消費することになる。 Generally, being able to quickly analyze factors that reduce yield is an important element in the semiconductor manufacturing process. For example, when searching for factors that reduce yield in wafer WF, various information is required, such as identifying killer defects and measuring the profile shape of the wafer itself. Inspecting and measuring this information requires the use of multiple pieces of equipment, which is time-consuming and costly.

 本実施形態の画像処理装置30及び検査装置1は、共焦点光学系の顕微鏡を実装した装置を用いて、高精細なカラーイメージの生成とウェーハWFの断面プロファイルの生成を同時に行うことができる。具体的には、上述したように、検査装置1は、構成としては、共焦点光学系の撮像部20と、回転軸C1を有するステージ10と、異なる波長を検出する複数の受光部23を備えてもよい。ステージ10を高速に回転させながら、対物レンズ21をフォーカスポジションに対し、光軸C2の方向に移動させながらスキャンを行う。 The image processing device 30 and inspection device 1 of this embodiment use a device equipped with a confocal optical microscope, and can simultaneously generate high-resolution color images and cross-sectional profiles of the wafer WF. Specifically, as described above, the inspection device 1 may be configured to include a confocal optical imaging unit 20, a stage 10 having a rotation axis C1, and multiple light receiving units 23 that detect different wavelengths. While rotating the stage 10 at high speed, scanning is performed by moving the objective lens 21 in the direction of the optical axis C2 relative to the focus position.

 この時に、複数の受光部23でフォーカスが合ったタイミングの輝度情報、対物レンズ21のZ位置の位置情報、ステージ10の回転角度θの角度情報を参照することで、ウェーハWFの全周のプロファイル、高精細な共焦点光学系の全焦点イメージを取得することができる。さらに、複数の波長(例えば、RGB)の全焦点イメージを結合することで、カラーイメージの生成、並びに、Z位置の位置情報から画像の諧調へ変換した高さイメージの生成も同時に可能である。 At this time, by referencing the brightness information at the timing when focus is achieved on the multiple light receiving units 23, the position information of the Z position of the objective lens 21, and the angle information of the rotation angle θ of the stage 10, it is possible to obtain a profile of the entire circumference of the wafer WF and a high-resolution all-in-focus image from the confocal optical system. Furthermore, by combining all-in-focus images of multiple wavelengths (e.g., RGB), it is possible to generate a color image, as well as a height image in which the Z position information is converted into image gradation, all at the same time.
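The height-image generation mentioned above (converting the per-pixel in-focus Z position into image gradation) can be sketched as follows. The focal-stack values, Z positions, and the linear mapping to 8-bit tones are illustrative assumptions.

```python
import numpy as np

def height_image(stack, z_positions):
    # Per-pixel Z of maximum brightness (the in-focus Z in a confocal
    # system), mapped linearly to 8-bit gradation.
    stack = np.asarray(stack, dtype=np.float64)
    z = np.asarray(z_positions, dtype=np.float64)
    best = stack.argmax(axis=0)          # index of in-focus Z per pixel
    height = z[best]                     # physical Z per pixel
    lo, hi = z.min(), z.max()
    return np.round(255 * (height - lo) / (hi - lo)).astype(np.uint8)

# Toy 2x2 focal stack with 3 Z positions (illustrative values).
stack = np.array([[[ 9, 80], [10, 11]],
                  [[70, 20], [12, 90]],
                  [[15, 10], [95, 13]]])
z_positions = [0.0, 5.0, 10.0]
h = height_image(stack, z_positions)
```

Each gray level in the result thus encodes a surface height, giving a profile image alongside the all-in-focus brightness image.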

 このような構成により、本実施形態の画像処理装置及び検査装置1は、以下の効果を期待することができる。 With this configuration, the image processing device and inspection device 1 of this embodiment can be expected to have the following effects.

 一つ目は、コスト低減効果である。ウェーハWFのプロファイル情報と欠陥検査に用いる検査画像の撮像を同時に行うために、使用する装置の数を低減することができ、工数とコストを削減することができる。 The first is the cost reduction effect. Because the wafer WF profile information and the inspection image used for defect inspection are captured simultaneously, the number of devices used can be reduced, resulting in reduced labor and costs.

 二つ目は、焦点距離よりも大きい凹凸のプロファイルを得ることである。本実施形態の画像処理装置30は、形状の凹凸が大きいウェーハWFに対しても、フォーカスの合った画像を得ることができる。近年の半導体デバイスは、そのプロセスが3次元に形成される傾向にあり、ウェーハWF上の表面の凹凸が数十μm以上のものも多くある。このため、通常の共焦点光学系の顕微鏡における焦点距離では、全面にフォーカスが合った画像を得ることが困難である。全面にフォーカスが合った画像を得るために、解像度を犠牲にして、焦点距離を広い方式にすることが考えられる。しかしながら、その場合には、解像度が低下する。本実施形態では、解像度が高く全面にフォーカスがあった画像を得ることができる。 The second is obtaining a profile of unevenness larger than the depth of field. The image processing device 30 of this embodiment can obtain focused images even for wafers WF with large surface unevenness. Recent semiconductor devices tend to be formed three-dimensionally, and many wafers WF have surface unevenness of several tens of μm or more. For this reason, it is difficult to obtain an image in focus over the entire surface within the depth of field of a typical confocal optical microscope. To obtain an image in focus over the entire surface, one conceivable approach is to widen the depth of field at the expense of resolution. In that case, however, the resolution decreases. In this embodiment, an image that has high resolution and is in focus over the entire surface can be obtained.

 三つ目は、カラー画像における色収差の低減である。共焦点光学系においては、焦点が合っているところで輝度が高くなり、焦点が合っていないところで輝度が低くなる。このため、複数のカメラにおいて、フォーカスが合うタイミングにズレがあると、正しい色として表現されない。本実施形態では、光軸方向のスキャンをした中で各受光部23のフォーカスが合ったイメージを得ることができる。よって、後から各色の画像を結合することにより、全色で焦点の合ったカラー画像を生成することができる。 The third is the reduction of chromatic aberration in color images. In a confocal optical system, brightness increases where the image is in focus and decreases where it is not. For this reason, if there is a discrepancy in the timing at which multiple cameras achieve focus, the colors will not be displayed correctly. In this embodiment, an image in which each light receiving unit 23 is in focus can be obtained by scanning along the optical axis. Therefore, by combining the images of each color afterward, a color image in which all colors are in focus can be generated.

 (実施形態2)
 次に、実施形態2に係る画像処理装置を説明する。以下では、実施形態1と同様の構成については、説明を省略する場合がある。本実施形態の画像処理装置を検査装置と呼んでもよい。また、前述の実施形態1の検査装置1を画像処理装置と呼んでもよい。その場合には、画像処理装置30を画像処理部30aと呼ぶ。以下で、<画像処理装置>及び<画像処理部>を説明した後で、<画像処理方法>を説明する。
(Embodiment 2)
Next, an image processing device according to a second embodiment will be described. In the following, the description of the same configuration as in the first embodiment may be omitted. The image processing device of this embodiment may be called an inspection device. Furthermore, the inspection device 1 of the first embodiment described above may be called an image processing device. In that case, the image processing device 30 will be called an image processing unit 30a. Below, the <image processing device> and <image processing unit> will be described first, followed by the <image processing method>.

 <画像処理装置>
 図9は、実施形態2に係る画像処理装置2を例示した構成図である。図10は、実施形態2に係る画像処理装置2において、対象物を例示した断面図である。図11は、実施形態2に係る画像処理装置2において、対象物を例示した平面図である。図9~11に示すように、画像処理装置2は、支持部40、駆動部12、駆動部22、撮像部20及び画像処理部30aを備えている。画像処理装置2は、例えば、対象物の撮像画像を処理する。画像処理装置2は、対象物を検査してもよい。その場合には、撮像画像は、検査画像を含む。画像処理装置2は、対象物の端部の撮像画像及び対象物の主面の撮像画像を処理してもよい。
<Image processing device>
FIG. 9 is a configuration diagram illustrating an image processing device 2 according to the second embodiment. FIG. 10 is a cross-sectional view illustrating an example of an object in the image processing device 2 according to the second embodiment. FIG. 11 is a plan view illustrating an example of an object in the image processing device 2 according to the second embodiment. As shown in FIGS. 9 to 11, the image processing device 2 includes a support unit 40, a drive unit 12, a drive unit 22, an imaging unit 20, and an image processing unit 30a. The image processing device 2 processes, for example, captured images of the object. The image processing device 2 may inspect the object. In that case, the captured images include an inspection image. The image processing device 2 may process captured images of the edge of the object and captured images of the main surface of the object.

 対象物は、例えば、ウェーハWFを含む。その場合には、対象物の端部は、ウェーハWFのエッジWFEを含む。また、対象物の主面は、ウェーハWFのウェーハ面WF1を含む。なお、対象物の主面は、ウェーハWFの裏面を含むことを排除しない。以下の説明では、対象物を、ウェーハWFとする場合がある。なお、対象物は、端部及び主面を有していれば、ウェーハWFに限らず、半導体チップ、プリント基板等の板状物体を含んでもよいし、板状物体以外の物体でもよい。 The object includes, for example, a wafer WF. In that case, the edge of the object includes the edge WFE of the wafer WF. The main surface of the object includes the wafer surface WF1 of the wafer WF. Note that this does not exclude the main surface of the object from including the back surface of the wafer WF. In the following description, the object may be referred to as a wafer WF. Note that the object is not limited to a wafer WF as long as it has an edge and a main surface; it may include plate-like objects such as semiconductor chips and printed circuit boards, or may be an object other than a plate-like object.

 ここで、図11に示すように、画像処理装置2の説明の便宜のために、αβγ直交座標軸系におけるαβ面に対して、回転軸C1を中心としたrθ極座標軸系を導入する。半径位置rは、回転軸C1からの距離を示す。回転角度θは、α軸の-α軸方向とのなす角を示す。 Here, as shown in FIG. 11, for ease of explanation of the image processing device 2, an rθ polar coordinate system centered on the rotation axis C1 is introduced for the αβ plane of the αβγ Cartesian coordinate system. The radial position r indicates the distance from the rotation axis C1. The rotation angle θ indicates the angle formed with the -α direction of the α axis.
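Under the stated convention (θ measured from the -α direction of the α axis), the conversion from the rθ polar coordinates to the αβ Cartesian coordinates can be sketched as below. The sign convention for increasing θ (toward +β) is an assumption, since the text does not specify the rotation sense.

```python
import math

def polar_to_cartesian(r, theta_deg):
    # Convert stage-plane polar coordinates (r, theta) to (alpha, beta).
    # Assumed convention: theta is measured from the -alpha direction of
    # the alpha axis, increasing toward +beta.
    t = math.radians(theta_deg)
    alpha = -r * math.cos(t)
    beta = r * math.sin(t)
    return alpha, beta

a0, b0 = polar_to_cartesian(2.0, 0.0)     # theta = 0: on the -alpha side
a90, b90 = polar_to_cartesian(2.0, 90.0)  # theta = 90 deg: on the +beta side
```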

 支持部40は、対象物を支持する。支持部40は、例えば、ステージ10を含んでもよい。ステージ10は、ステージ面11に載置されたウェーハWFを支持する。ステージ10は、モータ等の駆動部12に接続してもよい。駆動部12は、ステージ10を回転軸C1の周りで回転させる。ステージ10は、ステージ10の位置及びステージ面11の勾配を調整する3軸調整機構等を有してもよい。駆動部12は、ステージ10を回転軸C1に垂直な方向に移動させてもよい。また、駆動部12は、ステージ10を回転軸C1に平行な方向に移動させてもよい。駆動部12は、回転軸C1に平行な方向に、対象物と対物レンズ21の集光位置との相対位置を変更可能であってもよい。 The support unit 40 supports the object. The support unit 40 may include, for example, a stage 10. The stage 10 supports a wafer WF placed on a stage surface 11. The stage 10 may be connected to a drive unit 12 such as a motor. The drive unit 12 rotates the stage 10 around a rotation axis C1. The stage 10 may have a three-axis adjustment mechanism or the like that adjusts the position of the stage 10 and the gradient of the stage surface 11. The drive unit 12 may move the stage 10 in a direction perpendicular to the rotation axis C1. The drive unit 12 may also move the stage 10 in a direction parallel to the rotation axis C1. The drive unit 12 may be able to change the relative position between the object and the focusing position of the objective lens 21 in a direction parallel to the rotation axis C1.

 したがって、支持部40は、対象物を回転軸C1の回りで回転可能に支持する。また、支持部40は、対象物を回転軸C1に垂直な方向に移動可能に支持する。例えば、支持部40は、対象物をα軸方向に移動可能に支持する。支持部40は、α軸方向の移動量をセンサ13によってセンシングしてもよい。なお、支持部40のα軸方向の移動量は、撮像部20がウェーハ面WF1を撮像する位置の半径位置rに対応する。 Therefore, the support unit 40 supports the object so that it can rotate around the rotation axis C1. The support unit 40 also supports the object so that it can move in a direction perpendicular to the rotation axis C1. For example, the support unit 40 supports the object so that it can move in the α-axis direction. The support unit 40 may sense the amount of movement in the α-axis direction using the sensor 13. The amount of movement of the support unit 40 in the α-axis direction corresponds to the radial position r at which the imaging unit 20 images the wafer surface WF1.

 また、支持部40は、対象物を回転軸C1に平行な方向に移動可能に支持する。例えば、支持部40は、対象物をγ軸方向に移動可能に支持する。支持部40は、対象物の主面が撮像される場合に、γ軸方向の移動量からZ位置をセンサ13によってセンシングしてもよい。なお、支持部40は、対象物を回転軸C1の回りで回転可能に支持するとともに、回転軸C1に垂直な方向及び回転軸C1に平行な方向に移動可能に支持することができれば、ステージ10に限らない。 Furthermore, the support unit 40 supports the object so that it can move in a direction parallel to the rotation axis C1. For example, the support unit 40 supports the object so that it can move in the γ-axis direction. When the main surface of the object is imaged, the support unit 40 may sense the Z position from the amount of movement in the γ-axis direction using the sensor 13. Note that the support unit 40 is not limited to the stage 10, as long as it can support the object so that it can rotate around the rotation axis C1 and so that it can move in a direction perpendicular to and parallel to the rotation axis C1.

 支持部40は、無線及び有線の少なくとも何れかを含む通信回線で画像処理部30aに接続する。支持部40は、半径位置r、回転角度θ及びZ位置のデータを含む情報を画像処理部30aに伝達可能な状態で接続されている。支持部40は、これらの情報を出力する。 The support unit 40 is connected to the image processing unit 30a via a communication line that can be either wireless or wired. The support unit 40 is connected in a state where it can transmit information including data on the radial position r, rotation angle θ, and Z position to the image processing unit 30a. The support unit 40 outputs this information.

 駆動部12は、ステージ10を回転軸C1の周りで回転させる。したがって、駆動部12は、対象物を回転軸C1の周りで回転させる。駆動部12は、3軸調整機構等を駆動させて、ステージ10の位置を変更させてもよい。駆動部12は、回転軸C1に垂直な方向における、対象物と対物レンズ21の集光位置との相対位置(これを半径位置rと呼ぶ。)を変更可能である。例えば、駆動部12は、回転軸C1に垂直なα軸方向にステージ10を移動させる。したがって、駆動部12は、対象物をα軸方向に移動させる。また、駆動部12は、回転軸C1に平行なγ軸方向にステージ10を移動させる。したがって、駆動部12は、対象物をγ軸方向に移動させる。 The drive unit 12 rotates the stage 10 around the rotation axis C1. Therefore, the drive unit 12 rotates the object around the rotation axis C1. The drive unit 12 may change the position of the stage 10 by driving a three-axis adjustment mechanism or the like. The drive unit 12 can change the relative position (called the radial position r) between the object and the focusing position of the objective lens 21 in a direction perpendicular to the rotation axis C1. For example, the drive unit 12 moves the stage 10 in the α-axis direction perpendicular to the rotation axis C1. Therefore, the drive unit 12 moves the object in the α-axis direction. The drive unit 12 also moves the stage 10 in the γ-axis direction parallel to the rotation axis C1. Therefore, the drive unit 12 moves the object in the γ-axis direction.

 図10に示したZスキャンのように、駆動部12は、回転軸C1に垂直な方向に半径位置rを変更させる場合には、Z位置を固定してもよい。一例として、駆動部12は、回転軸C1に垂直な方向に半径位置rを変更させる場合には、対物レンズ21のγ軸方向における位置を固定してもよい。駆動部12は、回転軸C1に垂直な方向に半径位置rを変更させる場合には、Z位置を第1位置及び第2位置等に固定してもよい。例えば、駆動部12は、回転軸C1に垂直な方向に半径位置rを変更させる場合には、対物レンズ21のγ軸方向における位置を、第1位置及び第2位置等に固定してもよい。具体的には、図11の左側の支持部40が示すように、駆動部12は、支持部40を回転させながら、-α軸方向に移動させる際には、Z位置を第1位置に固定する。これに対して、図11の右側の支持部40が示すように、駆動部12は、支持部40を回転させながら、+α軸方向に移動させる際には、Z位置を第2位置に固定する。第1位置と第2位置とは、γ軸方向の位置が異なってよい。第1位置と第2位置とのγ軸方向における差は、対物レンズ21を含む光学系の焦点深度以内であることが好ましい。駆動部12は、回転軸C1に垂直な方向に半径位置rを変更させる場合には、光軸角度Φを略90°としてよい。すなわち、駆動部12は、回転軸C1に垂直な方向に半径位置rを変更させる場合には、対物レンズ21の光軸C2と回転軸C1とが略平行になるように光軸角度Φを設定してもよい。なお、上記では、駆動部12によって、回転軸C1に垂直な方向に半径位置rを変更され、Z位置が第1位置及び第2位置等に固定され、あるいは、光軸角度Φが設定されるとしたが、これに代えてまたはこれに合わせて、駆動部22によって、これらが行われてもよい。 As in the Z scan shown in FIG. 10, the drive unit 12 may fix the Z position when changing the radial position r in a direction perpendicular to the rotation axis C1. As an example, the drive unit 12 may fix the position of the objective lens 21 in the γ-axis direction when changing the radial position r in a direction perpendicular to the rotation axis C1. The drive unit 12 may fix the Z position to a first position, a second position, etc. when changing the radial position r in a direction perpendicular to the rotation axis C1. For example, the drive unit 12 may fix the position of the objective lens 21 in the γ-axis direction to a first position, a second position, etc. when changing the radial position r in a direction perpendicular to the rotation axis C1. Specifically, as shown by the support unit 40 on the left side of FIG. 11, the drive unit 12 fixes the Z position at the first position when rotating the support unit 40 while moving it in the -α-axis direction. In contrast, as shown by the support unit 40 on the right side of FIG. 11, the drive unit 12 fixes the Z position at the second position when rotating the support unit 40 while moving it in the +α-axis direction. The first and second positions may differ in the γ-axis direction. The difference between the first and second positions in the γ-axis direction is preferably within the focal depth of the optical system including the objective lens 21. When changing the radial position r in a direction perpendicular to the rotation axis C1, the drive unit 12 may set the optical axis angle Φ to approximately 90°. That is, when changing the radial position r in a direction perpendicular to the rotation axis C1, the drive unit 12 may set the optical axis angle Φ so that the optical axis C2 of the objective lens 21 and the rotation axis C1 are approximately parallel. Note that, in the above description, the drive unit 12 changes the radial position r in a direction perpendicular to the rotation axis C1, fixes the Z position at the first position, the second position, etc., or sets the optical axis angle Φ; however, instead of or in addition to this, the drive unit 22 may perform these operations.

 撮像部20は、対象物の撮像画像を撮像する。撮像部20は、ウェーハWFのエッジWFE及びウェーハ面WF1の撮像画像を撮像する。撮像部20は、例えば、対物レンズ21、駆動部22及び受光部23を含む。前述したように、撮像部20は、さらに、光源24、ピンホール25、ビームスプリッタ26、光学素子27、光学素子28、センサ29を含んでもよい。なお、撮像部20は、ウェーハWFのエッジWFE及びウェーハ面WF1の撮像画像を撮像することができれば、上記の光学部材のいずれかを他の光学部材と置き換えてもよいし、上記の光学部材に加えて、さらに、他の光学部材を含んでもよい。撮像部20は、ウェーハWFの端部WFEで反射した反射光R1からエッジWFEの撮像画像を撮像する。また、撮像部20は、ウェーハ面WF1で反射した反射光R1からウェーハ面WF1の撮像画像を撮像する。このことを、簡便のために単に、撮像部20は、ウェーハWFの端部WFE及びウェーハ面WF1で反射した反射光R1からエッジWFE及びウェーハ面WF1の撮像画像を撮像する、とまとめて記載する場合がある。同様に、ウェーハWFのエッジWFE及びウェーハ面WF1の少なくともいずれかで反射した反射光R1を、簡便のために単に、ウェーハWFのエッジWFE及びウェーハ面WF1で反射した反射光R1とまとめて記載する場合がある。 The imaging unit 20 captures an image of an object. The imaging unit 20 captures an image of the edge WFE and wafer surface WF1 of the wafer WF. The imaging unit 20 includes, for example, an objective lens 21, a drive unit 22, and a light receiving unit 23. As described above, the imaging unit 20 may further include a light source 24, a pinhole 25, a beam splitter 26, an optical element 27, an optical element 28, and a sensor 29. Note that, as long as the imaging unit 20 can capture images of the edge WFE and wafer surface WF1 of the wafer WF, any of the above optical components may be replaced with other optical components, or other optical components may be included in addition to the above optical components. The imaging unit 20 captures an image of the edge WFE from the reflected light R1 reflected by the end WFE of the wafer WF. The imaging unit 20 also captures an image of the wafer surface WF1 from the reflected light R1 reflected by the wafer surface WF1. For simplicity, this may be simply described as the imaging unit 20 capturing images of the edge WFE and wafer surface WF1 from the reflected light R1 reflected by the edge WFE and wafer surface WF1 of the wafer WF. 
Similarly, for simplicity, the reflected light R1 reflected by at least one of the edge WFE and wafer surface WF1 of the wafer WF may be simply described as the reflected light R1 reflected by the edge WFE and wafer surface WF1 of the wafer WF.

 対物レンズ21は、対象物で反射した反射光R1を集光する。具体的には、例えば、対物レンズ21は、ウェーハWFのエッジWFE及びウェーハ面WF1で反射した反射光R1を集光する。反射光R1は、光源24から出射した照明光L1がウェーハWFのエッジWFE及びウェーハ面WF1の少なくともいずれかで反射したものでもよい。対物レンズ21は、光軸C2を有している。光軸C2の方向を光軸方向と呼ぶ。 The objective lens 21 focuses the reflected light R1 reflected by the object. Specifically, for example, the objective lens 21 focuses the reflected light R1 reflected by the edge WFE and wafer surface WF1 of the wafer WF. The reflected light R1 may be illumination light L1 emitted from the light source 24, reflected by at least one of the edge WFE and wafer surface WF1 of the wafer WF. The objective lens 21 has an optical axis C2. The direction of the optical axis C2 is called the optical axis direction.

 駆動部12及び駆動部22の少なくともいずれかは、対物レンズ21の光軸方向における、対象物と対物レンズ21の集光位置との相対位置であるZ位置を変更可能である。例えば、駆動部22は、光軸方向における対物レンズ21の位置を移動させる。また、駆動部12及び駆動部22の少なくともいずれかは、ステージ10の回転軸C1に直交する面に対する対物レンズ21の光軸C2の角度である光軸角度Φを変更可能である。さらに、駆動部12及び駆動部22の少なくとも何れかは、ステージ10の回転軸C1に垂直な方向における対象物と対物レンズ21の集光位置との相対位置である半径位置rを変更可能である。例えば、駆動部12は、ステージ10をα軸方向に移動させることにより、対物レンズ21の半径位置rを変更させる。 At least one of the drive units 12 and 22 can change the Z position, which is the relative position between the object and the focusing position of the objective lens 21 in the optical axis direction of the objective lens 21. For example, the drive unit 22 moves the position of the objective lens 21 in the optical axis direction. At least one of the drive units 12 and 22 can also change the optical axis angle Φ, which is the angle of the optical axis C2 of the objective lens 21 with respect to a plane perpendicular to the rotation axis C1 of the stage 10. Furthermore, at least one of the drive units 12 and 22 can change the radial position r, which is the relative position between the object and the focusing position of the objective lens 21 in a direction perpendicular to the rotation axis C1 of the stage 10. For example, the drive unit 12 changes the radial position r of the objective lens 21 by moving the stage 10 in the α-axis direction.

 図12及び図13は、実施形態2に係る画像処理装置2において、支持部40の回転時に撮像部20が撮像画像を撮像する位置G1及びG2を例示した図である。図12に示すように、サンプリング時刻t=t1において、撮像部20が撮像画像を撮像する位置G1の情報は、サンプリング時刻t1、半径位置r1及び回転角度θ1を含む。図13に示すように、サンプリング時刻t=t2において、ステージ10等の支持部40の回転によって、回転角度θがθ1からθ2に変化する。それとともに、ステージ10等の支持部40の-α軸方向への移動によって、半径位置rがr1からr2に変化する。撮像部20が撮像画像を撮像する位置G2の情報は、サンプリング時刻t2、半径位置r2及び回転角度θ2を含む。 12 and 13 are diagrams illustrating positions G1 and G2 at which the imaging unit 20 captures an image when the support unit 40 rotates in an image processing device 2 according to embodiment 2. As shown in FIG. 12, at sampling time t=t1, information about position G1 at which the imaging unit 20 captures an image includes sampling time t1, radial position r1, and rotation angle θ1. As shown in FIG. 13, at sampling time t=t2, rotation of the support unit 40, such as the stage 10, causes the rotation angle θ to change from θ1 to θ2. At the same time, movement of the support unit 40, such as the stage 10, in the -α-axis direction causes the radial position r to change from r1 to r2. Information about position G2 at which the imaging unit 20 captures an image includes sampling time t2, radial position r2, and rotation angle θ2.

 撮像部20は、共焦点光学系を有してもよい。これにより、撮像部20は、ウェーハWFのエッジWFE及びウェーハ面WF1にフォーカスが合うように撮像することができる。しかしながら、本実施形態では、撮像部20等は、撮像時においては、フォーカスに着目せずに、対物レンズ21のZ位置を移動させてよい。フォーカスが合ったエッジWFE及びウェーハ面WF1の位置からの反射光R1の輝度は大きくなる。よって、フォーカスが合ったZ位置を特定することができる。そこで、撮像後において、フォーカスが合った半径位置r、Z位置、輝度情報、評価パラメータ、回転角度θ、光軸角度Φ、撮像したサンプリング時刻等の各種データ情報を参照する。参照した各種データ情報により、ウェーハWFのエッジWFE及びウェーハ面WF1のプロファイルを形成することができる。このようにして、撮像部20は、3次元測定を可能とすることができる。 The imaging unit 20 may have a confocal optical system. This allows the imaging unit 20 to capture images so that the edge WFE and wafer surface WF1 of the wafer WF are in focus. However, in this embodiment, the imaging unit 20 and the like may move the Z position of the objective lens 21 during imaging without paying attention to focus. The brightness of reflected light R1 from the focused positions of the edge WFE and wafer surface WF1 increases. Therefore, the focused Z position can be identified. Therefore, after imaging, various data information such as the focused radial position r, Z position, brightness information, evaluation parameters, rotation angle θ, optical axis angle Φ, and sampling time of the image are referenced. Profiles of the edge WFE and wafer surface WF1 of the wafer WF can be formed using the referenced various data information. In this way, the imaging unit 20 enables three-dimensional measurement.
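　輝度が最大となるZ位置をフォーカスの合ったZ位置として特定するという上記の考え方は、例えば次のように表せる。数値は仮のものである。 The idea above, that the in-focus Z position is the Z position where brightness peaks, can be expressed as follows; the numerical values are hypothetical.

```python
# 各Z位置で測った輝度から、フォーカスの合ったZ位置を特定する説明用スケッチ。
# Sketch: identify the in-focus Z position as the one with peak brightness.
brightness_by_z = {0.0: 20, 0.5: 180, 1.0: 250, 1.5: 90}  # 仮の測定値 / hypothetical samples

# 輝度が最大となるZ位置 / the Z position with maximum brightness
focused_z = max(brightness_by_z, key=brightness_by_z.get)
print(focused_z)  # 1.0
```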

 撮像部20は、無線及び有線の少なくとも何れかを含む通信回線で画像処理部30aに接続する。具体的には、撮像部20は、画像データ、半径位置r、Z位置、回転角度θ、光軸角度Φ、サンプリング時刻等のデータを含む情報を画像処理部30aに伝達可能な状態で接続されている。撮像部20は、画像処理部30aに対して、これらの情報を出力する。 The imaging unit 20 is connected to the image processing unit 30a via a communication line that includes at least one of wireless and wired connections. Specifically, the imaging unit 20 is connected in a state where it can transmit information including image data, radial position r, Z position, rotation angle θ, optical axis angle Φ, sampling time, etc. to the image processing unit 30a. The imaging unit 20 outputs this information to the image processing unit 30a.

 <画像処理部>
 次に、画像処理部30aを説明する。図14は、実施形態2に係る画像処理部30aを例示したブロック図である。図14に示すように、画像処理部30aは、前述の画像処理装置30と同様に、取得部31、特定部32、記憶部33、生成部34及び制御部35を備えている。なお、画像処理部30aは、記憶部33を含まなくてもよい。
<Image processing unit>
Next, the image processing unit 30a will be described. Fig. 14 is a block diagram illustrating the image processing unit 30a according to the second embodiment. As shown in Fig. 14, the image processing unit 30a includes an acquisition unit 31, an identification unit 32, a storage unit 33, a generation unit 34, and a control unit 35, similar to the image processing device 30 described above. Note that the image processing unit 30a does not necessarily have to include the storage unit 33.

 取得部31は、対物レンズ21を介して、対象物の撮像画像を取得する。取得部31は、駆動部12及び22の少なくともいずれかで、対象物を回転軸C1の周りで回転させながら、Z位置を変更させることにより、対物レンズ21を介して対象物の端部または主面の撮像画像を取得する。取得部31は、Z位置がそれぞれ異なり、且つ、対象物の回転角度θがそれぞれ異なる複数の撮像画像を取得する。 The acquisition unit 31 acquires an image of the object via the objective lens 21. The acquisition unit 31 acquires an image of the end or main surface of the object via the objective lens 21 by changing the Z position while rotating the object around the rotation axis C1 using at least one of the drive units 12 and 22. The acquisition unit 31 acquires multiple images with different Z positions and different rotation angles θ of the object.

 図15は、実施形態2に係る画像処理部30aにおいて、取得部31が取得した撮像画像を例示した図であり、横軸は、回転角度θを示し、縦軸は、所定の半径位置rにおけるウェーハWFのウェーハ面WF1上の領域を撮像した画素を示す。図15に示すように、取得部31は、Z位置及び半径位置rがそれぞれ異なる複数の撮像画像を取得する。各撮像画像は、ウェーハWFのウェーハ面WF1における撮像部20が撮像する位置Gの回転角度θと、回転角度θにおけるウェーハ面WF1を撮像した画素と、を対応させている。撮像画像における回転角度θは、ウェーハWFのウェーハ面WF1の中心軸C1を囲む全周に対応するように、0°~360°の範囲を含んでもよいし、所定の一部の範囲を含んでもよい。 FIG. 15 is a diagram illustrating an example of an image acquired by the acquisition unit 31 in the image processing unit 30a according to embodiment 2, where the horizontal axis represents the rotation angle θ and the vertical axis represents the pixels capturing an image of an area on the wafer surface WF1 of the wafer WF at a predetermined radial position r. As shown in FIG. 15, the acquisition unit 31 acquires multiple images with different Z positions and radial positions r. Each image corresponds to the rotation angle θ of position G on the wafer surface WF1 of the wafer WF captured by the imaging unit 20, and the pixels capturing the image of the wafer surface WF1 at the rotation angle θ. The rotation angle θ in the captured image may include a range of 0° to 360° so as to correspond to the entire circumference surrounding the central axis C1 of the wafer surface WF1 of the wafer WF, or may include a predetermined partial range.

 取得部31は、Z位置がZ=Z1の場合において、半径位置rを変化させた複数の撮像画像を取得してもよい。例えば、取得部31は、半径位置r1~r3以上までの半径位置rを変化させた場合の各撮像画像を取得してもよい。そして、さらに、取得部31は、Z位置として、第1位置Z1、第2位置Z2、第3位置Z3、・・のように、対物レンズ21を含む光学系の焦点深度以内の所定の距離を空けてγ軸方向に移動させた場合の複数の撮像画像を取得してもよい。つまり、Z位置が第1位置Z1の場合において、半径位置r1~r3以上の複数の撮像画像、Z位置が第2位置Z2の場合において、半径位置r1~r3以上の複数の撮像画像、Z位置が第3位置Z3の場合において、半径位置r1~r3以上の複数の撮像画像を取得してもよい。各撮像画像は、横軸に、回転角度θを示し、縦軸に、所定の半径位置rにおけるウェーハWFのウェーハ面WF1上の領域を撮像した画素を示す。 When the Z position is Z=Z1, the acquisition unit 31 may acquire multiple captured images with the radial position r changed. For example, the acquisition unit 31 may acquire each captured image when the radial position r is changed from r1 to r3 or more. Furthermore, the acquisition unit 31 may acquire multiple captured images when the Z position is moved in the γ-axis direction at a predetermined distance within the focal depth of the optical system including the objective lens 21, such as from the first position Z1 to the second position Z2 to the third position Z3, etc. In other words, when the Z position is the first position Z1, multiple captured images may be acquired from the radial positions r1 to r3 or more; when the Z position is the second position Z2, multiple captured images may be acquired from the radial positions r1 to r3 or more; and when the Z position is the third position Z3, multiple captured images may be acquired from the radial positions r1 to r3 or more. For each captured image, the horizontal axis represents the rotation angle θ, and the vertical axis represents the pixels capturing the area on the wafer surface WF1 of the wafer WF at the predetermined radial position r.
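　上記の走査順(各Z位置において半径位置rを変化させる)は、例えば次のような二重ループとして整理できる。以下は説明用のスケッチであり、acquire関数は実際の撮像処理の代わりに置いた仮のものである。 The scan order above (sweeping the radial position r at each Z position) can be organized as the following nested loop. This is an illustrative sketch; the acquire function is a hypothetical stand-in for the actual imaging.

```python
# Z位置ごとに半径位置rを変化させて撮像画像を取得する走査順の説明用スケッチ。
# Sketch of the scan order: for each Z position, sweep the radial position r.
z_positions = ["Z1", "Z2", "Z3"]
radial_positions = ["r1", "r2", "r3"]

def acquire(z, r):
    # 実機では撮像部20からの画像取得に相当(ここでは識別子の組を返すだけ)
    # Stands in for image acquisition by imaging unit 20
    return (z, r)

images = [acquire(z, r) for z in z_positions for r in radial_positions]
print(len(images))  # 9
```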

 図16は、実施形態2に係る画像処理部30aにおいて、サンプリング時刻tごとに半径位置rが増加する場合における、各撮像画像に含まれる画素の位置とウェーハWFのウェーハ面WF1上の位置との対応関係を説明する図である。以下で、r01<r02<r03<r0N<r0Eである。 FIG. 16 is a diagram illustrating the correspondence between the positions of pixels included in each captured image and the positions on the wafer surface WF1 of the wafer WF when the radial position r increases at each sampling time t in the image processing unit 30a according to the second embodiment. Here, r01<r02<r03<r0N<r0E.

 サンプリング時刻t=t11において、撮像画像に含まれる画素は、半径位置r=r01及び回転角度θ=0°におけるウェーハWFのウェーハ面WF1上の位置に対応している。その後のサンプリング時刻t=t13において、撮像画像に含まれる画素は、半径位置r=r02及び回転角度θ=360°(0°でもある)におけるウェーハWFのウェーハ面WF1上の位置に対応している。時刻t11と時刻t13の間のサンプリング時刻t=t12において、撮像画像に含まれる画素は、半径位置がr01とr02との間である半径位置及び回転角度が0°から360°までの角度におけるウェーハWFのウェーハ面WF1上の位置に対応してよい。 At sampling time t=t11, the pixels included in the captured image correspond to positions on the wafer surface WF1 of the wafer WF at radial position r=r01 and rotation angle θ=0°. At a subsequent sampling time t=t13, the pixels included in the captured image correspond to positions on the wafer surface WF1 of the wafer WF at radial position r=r02 and rotation angle θ=360° (which is also 0°). At sampling time t=t12 between times t11 and t13, the pixels included in the captured image may correspond to positions on the wafer surface WF1 of the wafer WF at a radial position between r01 and r02 and at a rotation angle between 0° and 360°.

 サンプリング時刻t=t21(t13と同じ時刻でもよい)において、撮像画像に含まれる画素は、半径位置r=r02及び回転角度θ=0°におけるウェーハWFのウェーハ面WF1上の位置に対応している。その後のサンプリング時刻t=t23において、撮像画像に含まれる画素は、半径位置r=r03及び回転角度θ=360°(0°でもある)におけるウェーハWFのウェーハ面WF1上の位置に対応している。 At sampling time t = t21 (which may be the same time as t13), the pixels included in the captured image correspond to positions on the wafer surface WF1 of the wafer WF at radial position r = r02 and rotation angle θ = 0°. At the subsequent sampling time t = t23, the pixels included in the captured image correspond to positions on the wafer surface WF1 of the wafer WF at radial position r = r03 and rotation angle θ = 360° (which is also 0°).

 サンプリング時刻t=tN1において、撮像画像に含まれる画素は、半径位置r=r0N及び回転角度θ=0°におけるウェーハWFのウェーハ面WF1上の位置に対応している。その後のサンプリング時刻t=tN3において、撮像画像に含まれる画素は半径位置r=r0E及び回転角度θ=360°(0°でもある)におけるウェーハWFのウェーハ面WF1上の位置に対応している。 At sampling time t = tN1, the pixels included in the captured image correspond to positions on the wafer surface WF1 of the wafer WF at radial position r = r0N and rotation angle θ = 0°. At a subsequent sampling time t = tN3, the pixels included in the captured image correspond to positions on the wafer surface WF1 of the wafer WF at radial position r = r0E and rotation angle θ = 360° (which is also 0°).

 半径位置r01は、一例として、回転角度0°において半径位置rを半径位置r01から微小量増加させた場合に、最もウェーハ面WF1の中央近傍にある計測対象領域が撮像画像に含まれることとなる半径位置であってよい。半径位置r0Eは、一例として、回転角度0°において半径位置を半径位置r0Eから微小量減少させた場合に、最もウェーハWFのエッジWFEの近傍にある計測対象領域が撮像画像に含まれることとなる半径位置であってよい。これにより、ウェーハ面WF1上の計測対象領域を不足なく撮像画像に含めることができる。 As an example, radial position r01 may be the radial position at which the measurement target area closest to the center of the wafer surface WF1 will be included in the captured image when the radial position r is increased by a small amount from radial position r01 at a rotation angle of 0°. As an example, radial position r0E may be the radial position at which the measurement target area closest to the edge WFE of the wafer WF will be included in the captured image when the radial position is decreased by a small amount from radial position r0E at a rotation angle of 0°. This allows the measurement target area on the wafer surface WF1 to be included in the captured image without any deficiencies.

 簡便のために、図16では、撮像画像に含まれる画素を一方向に延びるライン状に配置しているがこれに限らない。また、図16は、半径位置rが順次増加する場合の例を示したが、半径位置rが順次減少する場合には、上記の説明を逆に辿ればよい。 For simplicity, in Figure 16, the pixels included in the captured image are arranged in a line extending in one direction, but this is not limited to this. Also, Figure 16 shows an example where the radial position r increases sequentially, but if the radial position r decreases sequentially, the above explanation can be followed in reverse.

 取得部31は、図16で示す例により、複数のサンプリング時刻tでウェーハWFのウェーハ面WF1を撮像した撮像画像に基づいて、図15に例示した撮像画像を取得することが可能である。 The acquisition unit 31 can acquire the captured image shown in FIG. 15 based on captured images of the wafer surface WF1 of the wafer WF at multiple sampling times t, as shown in the example of FIG. 16.
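　サンプリング時刻tから半径位置r及び回転角度θを求める上記の対応付けは、1周の間の回転及び半径方向の移動が等速であると仮定すれば、例えば次の線形補間で表せる(仮定に基づく説明用スケッチ)。 The correspondence above, from sampling time t to radial position r and rotation angle θ, can be expressed by the following linear interpolation, under the assumption that the rotation and the radial movement are at uniform speed over one revolution (an illustrative sketch based on that assumption).

```python
# サンプリング時刻tから(r, θ)を求める線形補間の説明用スケッチ。
# Sketch: linear interpolation of (r, θ) over one revolution, assuming uniform motion.
def position_at(t, t_start, t_end, r_start, r_end):
    """1周分の区間[t_start, t_end]内の時刻tにおける(r, θ)を返す。
    Returns (r, θ) at time t within one revolution [t_start, t_end]."""
    frac = (t - t_start) / (t_end - t_start)
    r = r_start + frac * (r_end - r_start)
    theta = 360.0 * frac  # 1周で0°から360°まで / 0° to 360° per revolution
    return r, theta

r, theta = position_at(t=0.5, t_start=0.0, t_end=1.0, r_start=1.0, r_end=2.0)
print(r, theta)  # 1.5 180.0
```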

 取得部31は、第1波長の光を受光する第1受光部23aから撮像画像を取得してもよい。第1受光部23aから取得した撮像画像を第1撮像画像と呼ぶ。取得部31は、第2波長の光を受光する第2受光部23bから撮像画像を取得してもよい。第2受光部23bから取得した撮像画像を第2撮像画像と呼ぶ。取得部31は、第3波長の光を受光する第3受光部23cから撮像画像を取得してもよい。第3受光部23cから取得した撮像画像を第3撮像画像と呼ぶ。取得部31は、図5と同様な第2検査画像及び第3検査画像を取得してもよい。第1撮像画像、第2撮像画像及び第3撮像画像は、それぞれ、半径位置r及びZ位置が異なる複数の検査画像を含んでもよい。 The acquisition unit 31 may acquire captured images from the first light receiving unit 23a that receives light of a first wavelength. The captured image acquired from the first light receiving unit 23a is referred to as the first captured image. The acquisition unit 31 may acquire captured images from the second light receiving unit 23b that receives light of a second wavelength. The captured image acquired from the second light receiving unit 23b is referred to as the second captured image. The acquisition unit 31 may acquire captured images from the third light receiving unit 23c that receives light of a third wavelength. The captured image acquired from the third light receiving unit 23c is referred to as the third captured image. The acquisition unit 31 may acquire second and third inspection images similar to those in FIG. 5. The first, second, and third captured images may each include multiple inspection images with different radial positions r and Z positions.

 制御部35は、支持部40及び駆動部12、22を制御し、取得部31に複数の撮像画像を取得させる。また、制御部35は、取得部31、特定部32、記憶部33及び生成部34の動作を制御する。前述の実施形態1では、制御部35は、支持部40により対象物を回転軸C1の回りで回転させながら、駆動部12及び駆動部22の少なくともいずれかによりZ位置を変更させることで、取得部31に対象物の端部の撮像画像を複数取得させる。 The control unit 35 controls the support unit 40 and drive units 12 and 22, and causes the acquisition unit 31 to acquire multiple captured images. The control unit 35 also controls the operation of the acquisition unit 31, identification unit 32, memory unit 33, and generation unit 34. In the above-described first embodiment, the control unit 35 causes the acquisition unit 31 to acquire multiple captured images of the end of the object by changing the Z position using at least one of the drive units 12 and 22 while rotating the object around the rotation axis C1 using the support unit 40.

　一方、本実施形態において、制御部35は、支持部40により対象物を回転軸C1の回りで回転させながら、駆動部22により半径位置rを回転軸に垂直な第1方向に変更させ、且つ、Z位置を第1位置で一定にさせて、取得部31に対象物の主面の複数の撮像画像を取得させる。また、制御部35は、支持部40により対象物を回転軸C1の回りで回転させながら、駆動部22により半径位置rを回転軸に垂直な第2方向に変更させ、且つ、Z位置を第2位置で一定にさせて、取得部31に対象物の主面の複数の撮像画像を取得させる。ここで、第2方向は、第1方向と反対方向である。第2位置は、第1位置と異なる位置である。また、第1位置と第2位置との差は、対物レンズ21を含む撮像部20の光学系の焦点深度以内である。なお、本実施形態は、駆動部22の代わりに、または、駆動部22に加えて、駆動部12を駆動させてもよい。 In contrast, in this embodiment, the control unit 35 rotates the object around the rotation axis C1 using the support unit 40, while causing the drive unit 22 to change the radial position r in a first direction perpendicular to the rotation axis, and while maintaining the Z position at the first position, causing the acquisition unit 31 to acquire multiple captured images of the object's main surface. Furthermore, while rotating the object around the rotation axis C1 using the support unit 40, the control unit 35 changes the radial position r in a second direction perpendicular to the rotation axis using the drive unit 22, and while maintaining the Z position at the second position, causes the acquisition unit 31 to acquire multiple captured images of the object's main surface. Here, the second direction is the opposite direction to the first direction. The second position is a position different from the first position. Furthermore, the difference between the first position and the second position is within the focal depth of the optical system of the imaging unit 20, including the objective lens 21. Note that in this embodiment, the drive unit 12 may be driven instead of or in addition to the drive unit 22.

 特定部32は、Z位置がそれぞれ異なり、且つ、対象物の回転角度θがそれぞれ異なる複数の撮像画像において、輝度が所定以上となる画素の位置である画素位置Pを特定する。特定部32は、半径位置r、Z位置、回転角度θ及び光軸角度Φがそれぞれ異なる複数の撮像画像において、輝度が所定以上となる画素の位置である画素位置Pを特定してもよい。なお、前述したように、輝度が所定以上であることは、画素についての評価パラメータの良好さが所定以上であることの一例である。 The identification unit 32 identifies pixel position P, which is the position of a pixel where the brightness is above a predetermined level, in a plurality of captured images that each have a different Z position and a different rotation angle θ of the object. The identification unit 32 may also identify pixel position P, which is the position of a pixel where the brightness is above a predetermined level, in a plurality of captured images that each have a different radial position r, Z position, rotation angle θ, and optical axis angle Φ. Note that, as described above, brightness above a predetermined level is an example of the quality of the evaluation parameters for a pixel being above a predetermined level.
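　輝度が所定以上となる画素位置Pの特定は、例えば次のようなしきい値処理として表せる。しきい値及び画素値は仮のものである。 Identifying pixel positions P whose brightness is at or above a predetermined level can be expressed as the following thresholding; the threshold and pixel values are hypothetical.

```python
# 輝度が所定以上となる画素位置Pを特定する説明用スケッチ。
# Sketch of identifying pixel positions P whose brightness exceeds a threshold.
THRESHOLD = 200  # 仮のしきい値 / hypothetical threshold

image = [
    [10, 250, 30],
    [220, 15, 205],
]

pixel_positions = [
    (row, col)
    for row, line in enumerate(image)
    for col, value in enumerate(line)
    if value >= THRESHOLD
]
print(pixel_positions)  # [(0, 1), (1, 0), (1, 2)]
```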

 特定部32は、取得部31によって取得された半径位置rまたはZ位置がそれぞれ異なる複数の第1撮像画像において第1波長画素位置P1を特定してもよい。特定部32は、取得部31によって取得された半径位置rまたはZ位置がそれぞれ異なる複数の第2撮像画像において第2波長画素位置を特定してもよいし、半径位置rまたはZ位置がそれぞれ異なる複数の第3撮像画像において第3波長画素位置を特定してもよい。ここで、第1撮像画像において輝度が所定以上となる画素位置を第1波長画素位置P1と呼び、第2撮像画像において輝度が所定以上となる画素位置を第2波長画素位置P2(図示省略)と呼び、第3撮像画像において輝度が所定以上となる画素位置を第3波長画素位置P3(図示省略)と呼ぶ。 The identification unit 32 may identify a first-wavelength pixel position P1 in a plurality of first captured images acquired by the acquisition unit 31, each having a different radial position r or Z position. The identification unit 32 may identify a second-wavelength pixel position in a plurality of second captured images acquired by the acquisition unit 31, each having a different radial position r or Z position, or may identify a third-wavelength pixel position in a plurality of third captured images, each having a different radial position r or Z position. Here, a pixel position in the first captured image where the brightness is greater than or equal to a predetermined value is referred to as a first-wavelength pixel position P1, a pixel position in the second captured image where the brightness is greater than or equal to a predetermined value is referred to as a second-wavelength pixel position P2 (not shown), and a pixel position in the third captured image where the brightness is greater than or equal to a predetermined value is referred to as a third-wavelength pixel position P3 (not shown).

 記憶部33は、特定した画素位置Pを、各撮像画像におけるZ位置及び回転角度θとともに高輝度時情報として記憶する。記憶部33は、特定した画素位置Pを、各撮像画像における半径位置r、Z位置、回転角度θ、光軸角度Φ及びサンプリング時刻とともに高輝度時情報として記憶してもよい。 The storage unit 33 stores the identified pixel position P together with the Z position and rotation angle θ in each captured image as high-brightness information. The storage unit 33 may also store the identified pixel position P together with the radial position r, Z position, rotation angle θ, optical axis angle Φ, and sampling time in each captured image as high-brightness information.
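　高輝度時情報として記憶される項目の組は、例えば次のようなレコードとして整理できる。フィールド名は説明のための仮のものである。 The set of items stored as high-brightness information can be organized as a record such as the following; the field names are hypothetical, for illustration only.

```python
# 高輝度時情報として記憶する項目の説明用スケッチ(フィールド名は仮)。
# Sketch of a high-brightness-information record (field names are hypothetical).
from dataclasses import dataclass

@dataclass(frozen=True)
class HighBrightnessRecord:
    pixel_position: tuple  # 画素位置P / pixel position P
    r: float               # 半径位置 / radial position
    z: float               # Z位置 / Z position
    theta: float           # 回転角度θ / rotation angle
    phi: float             # 光軸角度Φ / optical axis angle
    t: float               # サンプリング時刻 / sampling time

store = []  # 記憶部33に相当 / stands in for storage unit 33
store.append(HighBrightnessRecord((0, 1), r=1.0, z=0.5, theta=90.0, phi=45.0, t=0.1))
print(len(store))  # 1
```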

 生成部34は、記憶した複数の高輝度時情報に基づいて、対象物のイメージ情報を生成してもよい。また、生成部34は、輝度が所定以上となる画素に基づいて、対象物のイメージ情報を生成してもよい。具体的には、生成部34は、回転角度θが特定の回転角度θであってZ位置が互いに異なる複数の撮像画像において、それぞれ輝度が所定以上となる画素に基づいて、特定の回転角度θにおける対象物のイメージ情報を生成してもよい。 The generation unit 34 may generate image information of the object based on multiple pieces of high-brightness information that have been stored. The generation unit 34 may also generate image information of the object based on pixels with a predetermined or higher brightness. Specifically, the generation unit 34 may generate image information of the object at a specific rotation angle θ based on pixels with a predetermined or higher brightness in multiple captured images that have different Z positions at a specific rotation angle θ.

 つまり、生成部34は、回転角度θが第1の回転角度θ1であって、Z位置が互いに異なる複数の撮像画像において、それぞれ輝度が所定以上となる画素と、回転角度θが第2の回転角度θ2であって、Z位置が互いに異なる複数の撮像画像において、それぞれ輝度が所定以上となる画素と、に基づいて、複数の特定の回転角度θにおける対象物のイメージ情報を生成してもよい。生成部34は、上述したイメージ情報を、複数の半径位置rについて生成してもよい。このようにして、生成部34は、対象物の主面のイメージ情報を生成してもよい。イメージ情報は、対象物の主面のプロファイルを含んでもよい。 In other words, the generation unit 34 may generate image information of the object at multiple specific rotation angles θ based on pixels whose brightness is a predetermined value or higher in multiple captured images whose Z positions are different and whose rotation angle θ is a first rotation angle θ1, and pixels whose brightness is a predetermined value or higher in multiple captured images whose Z positions are different and whose rotation angle θ is a second rotation angle θ2. The generation unit 34 may generate the above-mentioned image information for multiple radial positions r. In this way, the generation unit 34 may generate image information of the main surface of the object. The image information may include a profile of the main surface of the object.

 図17は、実施形態2に係る画像処理装置2において、生成部34が生成したウェーハWFのウェーハ面WF1のプロファイルを例示した図である。図17では、回転角度θがθ=θ1の場合を示している。また、図17では、半径位置rが複数のr1~r5の場合を示している。図17に示すように、生成部34は、イメージ情報として、ウェーハWFのウェーハ面WF1の断面におけるプロファイルを生成してもよい。前述のように輝度(画素の評価パラメータの良好さ)が所定以上である画素は、そのZ位置において、エッジWFEやウェーハ面WF1の表面に対して適切にフォーカスされ、例えばジャストフォーカスされているものと考えてもよい。従って、生成部34は、このときのZ位置に基づいて、ウェーハWFのウェーハ面WF1やエッジWFEの断面におけるプロファイルを生成することができる。 FIG. 17 is a diagram illustrating a profile of the wafer surface WF1 of the wafer WF generated by the generation unit 34 in the image processing device 2 according to the second embodiment. FIG. 17 shows the case where the rotation angle θ is θ=θ1. FIG. 17 also shows the case where the radial position r is multiple values r1 to r5. As shown in FIG. 17, the generation unit 34 may generate a profile of the cross section of the wafer surface WF1 of the wafer WF as image information. As described above, a pixel whose brightness (goodness of the pixel evaluation parameter) is equal to or greater than a predetermined value can be considered to be appropriately focused on the surface of the edge WFE or the wafer surface WF1 at that Z position, e.g., just focused. Therefore, the generation unit 34 can generate a profile of the cross section of the wafer surface WF1 or the edge WFE of the wafer WF based on the Z position at that time.
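　特定の回転角度θにおけるプロファイルの生成は、例えば、各半径位置rについて輝度が最大となるZ位置を選び、それを並べることとして表せる。以下は説明用のスケッチであり、データは仮のものである。 Generating the profile at a specific rotation angle θ can be expressed as, for example, selecting at each radial position r the Z position with maximum brightness; the data below are hypothetical.

```python
# 特定の回転角度θにおいて、各半径位置rでフォーカスの合ったZ位置を並べて
# 断面プロファイルを得る説明用スケッチ(データは仮)。
# Sketch: for a given θ, the profile is the in-focus Z at each radial position.
records = [
    # (r, z, 輝度) — θ=θ1における仮の高輝度時データ / hypothetical data at θ = θ1
    (1.0, 0.10, 240), (1.0, 0.20, 90),
    (2.0, 0.12, 80),  (2.0, 0.22, 230),
    (3.0, 0.30, 250),
]

profile = {}
for r, z, b in records:
    # 各rについて輝度最大のZを保持 / keep the Z with peak brightness for each r
    if r not in profile or b > profile[r][1]:
        profile[r] = (z, b)

heights = {r: z for r, (z, _) in profile.items()}
print(heights)  # {1.0: 0.1, 2.0: 0.22, 3.0: 0.3}
```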

 生成部34は、高輝度時情報または輝度が所定以上となる画素に基づいて、対象物の主面における全周の回転角度θについてのイメージ情報を生成してもよい。例えば、生成部34は、ウェーハWFのウェーハ面WF1における全周の回転角度θについての断面のプロファイルを生成してもよい。 The generation unit 34 may generate image information about the rotation angle θ of the entire circumference of the main surface of the object based on the high-brightness information or pixels with a brightness equal to or greater than a predetermined value. For example, the generation unit 34 may generate a cross-sectional profile about the rotation angle θ of the entire circumference of the wafer surface WF1 of the wafer WF.

　なお、生成部34は、前述した実施形態1のように、対象物の端部のイメージ情報を生成してもよい。すなわち、生成部34は、回転角度θが特定の回転角度θであり、光軸角度Φが第1の光軸角度であってZ位置が互いに異なる複数の撮像画像において、それぞれ輝度が所定以上となる画素と、回転角度θが特定の回転角度θであり、光軸角度Φが第2の光軸角度であってZ位置が互いに異なる複数の撮像画像において、それぞれ輝度が所定以上となる画素と、に基づいて、特定の回転角度θにおける対象物の端部のイメージ情報を生成する。生成部34が、高輝度時情報または輝度が所定以上となる画素に基づいて、イメージ情報を生成することには、生成部34が、輝度が所定以上である画素における輝度について、輝度の高低情報や、このときのZ位置に基づく物理的な高さを示す高低情報を出力することが含まれてよい。また、生成部34が、高輝度時情報または輝度が所定以上となる画素に基づいて、イメージ情報を生成することには、生成部34が、輝度が所定以上である複数の画素における複数の輝度について、その平均や標準偏差などの統計値を取得し、あるいは、複数の輝度をもとに内挿補間した値を取得し、これを輝度における高低情報として、あるいは物理的な高さに変換した高低情報として、出力することが含まれてよい。生成部は、第1撮像画像及び第2撮像画像等において、それぞれ輝度が所定以上となる画素に基づいて、複数の色で表された対象物のカラーイメージを生成してもよい。 The generation unit 34 may generate image information of the end of the object, as in the first embodiment described above. That is, the generation unit 34 generates image information of the end of the object at a specific rotation angle θ based on pixels whose brightness is above a predetermined level in multiple captured images where the rotation angle θ is a specific rotation angle θ, the optical axis angle Φ is a first optical axis angle, and the Z positions are different from each other, and pixels whose brightness is above a predetermined level in multiple captured images where the rotation angle θ is a specific rotation angle θ, the optical axis angle Φ is a second optical axis angle, and the Z positions are different from each other. Generating image information based on high-brightness information or pixels whose brightness is above a predetermined level may include the generation unit 34 outputting brightness level information for the brightness of pixels whose brightness is above a predetermined level, or height information indicating the physical height based on the Z position at that time.
Furthermore, the generation unit 34 generating image information based on high-brightness information or pixels with a predetermined or higher brightness may include the generation unit 34 obtaining statistical values such as the average or standard deviation of multiple brightnesses for multiple pixels with a predetermined or higher brightness, or obtaining values interpolated based on the multiple brightnesses, and outputting this as brightness level information or height information converted into physical height. The generation unit may generate a color image of the object represented in multiple colors based on pixels with a predetermined or higher brightness in the first captured image, the second captured image, etc.
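　複数の輝度から平均や標準偏差などの統計値を取得する処理は、例えば次のように表せる。数値は仮のものである。 Obtaining statistics such as the mean and standard deviation from multiple brightness values can be expressed as follows; the values are hypothetical.

```python
# 輝度が所定以上である複数画素の輝度から統計値を得る説明用スケッチ。
# Sketch of taking statistics over qualifying brightness values.
import statistics

brightnesses = [210, 230, 250, 240]  # 仮の値 / hypothetical values
mean = statistics.mean(brightnesses)      # 平均 / mean
stdev = statistics.pstdev(brightnesses)   # 母標準偏差 / population standard deviation
print(mean)  # 232.5
```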

 制御部35は、イメージ情報に基づいて、ウェーハWFについての欠陥等の検査を行ってもよい。例えば理想的なウェーハWFのイメージ情報と、検査対象ウェーハWFのイメージ情報とを比較することにより検査対象ウェーハWFを検査してよい。イメージ情報に代えて、イメージ情報に基づき生成されたエッジWFEのプロファイル、カラーイメージ、エッジWFEの画像を用いて、検査対象ウェーハWFを検査してよい。 The control unit 35 may inspect the wafer WF for defects, etc. based on the image information. For example, the inspection target wafer WF may be inspected by comparing image information of an ideal wafer WF with image information of the inspection target wafer WF. Instead of image information, the inspection target wafer WF may be inspected using a profile, color image, or image of the edge WFE generated based on the image information.
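　理想的なウェーハWFのイメージ情報と検査対象ウェーハWFのイメージ情報との比較による検査は、例えば次のように表せる。許容差及びプロファイル値は仮のものである。 Inspection by comparing the image information of an ideal wafer WF with that of the inspection target wafer WF can be expressed as follows; the tolerance and profile values are hypothetical.

```python
# 理想プロファイルと測定プロファイルを比較して欠陥候補の半径位置を検出する
# 説明用スケッチ(許容差は仮)。
# Sketch: compare the measured profile against an ideal one (tolerance assumed).
TOLERANCE = 0.05  # 仮の許容差 / hypothetical tolerance

ideal = {1.0: 0.10, 2.0: 0.20, 3.0: 0.30}     # 理想的なウェーハWF / ideal wafer
measured = {1.0: 0.11, 2.0: 0.35, 3.0: 0.30}  # 検査対象ウェーハWF / target wafer

defects = [r for r in ideal if abs(measured[r] - ideal[r]) > TOLERANCE]
print(defects)  # [2.0]
```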

 <画像処理方法>
 次に、画像処理方法を説明する。図18は、実施形態2に係る画像処理装置2を用いた画像処理方法を例示したフローチャート図である。
<Image processing method>
Next, an image processing method will be described with reference to Fig. 18. Fig. 18 is a flow chart illustrating an image processing method using the image processing device 2 according to the second embodiment.

 図18のステップS31に示すように、対象物を支持する。例えば、制御部35は、対象物を回転軸C1の回りで回転可能に支持部40に支持させる。 As shown in step S31 of FIG. 18, the object is supported. For example, the control unit 35 causes the support unit 40 to support the object so that it can rotate around the rotation axis C1.

 次に、ステップS32に示すように、反射光R1を対物レンズ21で集光する。具体的には、例えば、制御部35は、駆動部22を駆動させて、対象物で反射した反射光R1を対物レンズ21で集光させるように、対物レンズ21の位置を制御させる。 Next, as shown in step S32, the reflected light R1 is focused by the objective lens 21. Specifically, for example, the control unit 35 drives the drive unit 22 to control the position of the objective lens 21 so that the reflected light R1 reflected by the object is focused by the objective lens 21.

 次に、ステップS33に示すように、Z位置を駆動部22等で変更させる。具体的には、例えば、制御部35は、駆動部22を駆動させて、対物レンズ21の光軸方向における、対象物と対物レンズ21の集光位置との相対位置であるZ位置を駆動部22に変更させる。駆動部22等は、回転軸C1に直交した面に対する対物レンズ21の光軸角度Φを変更させてもよい。 Next, as shown in step S33, the Z position is changed by the driver 22 or the like. Specifically, for example, the control unit 35 drives the driver 22 to cause the driver 22 to change the Z position, which is the relative position between the object and the focusing position of the objective lens 21 in the optical axis direction of the objective lens 21. The driver 22 or the like may also change the optical axis angle Φ of the objective lens 21 with respect to a plane perpendicular to the rotation axis C1.

 次に、ステップS34に示すように、撮像画像を取得部31に取得させる。例えば、制御部35は、取得部31が撮像画像を取得するように制御する。これにより、制御部35は、対物レンズ21を介して対象物の撮像画像を取得部31に取得させる。制御部35は、取得部31に複数の撮像画像を取得させてもよい。 Next, as shown in step S34, the acquisition unit 31 is caused to acquire the captured image. For example, the control unit 35 controls the acquisition unit 31 to acquire the captured image. As a result, the control unit 35 causes the acquisition unit 31 to acquire the captured image of the object via the objective lens 21. The control unit 35 may also cause the acquisition unit 31 to acquire multiple captured images.

 制御部35は、支持部40により対象物を回転軸C1の回りで回転させながら、駆動部12及び駆動部22の少なくともいずれかによりZ位置を変更させることで、取得部31に対象物の端部の複数の撮像画像を取得させてもよい。 The control unit 35 may cause the acquisition unit 31 to acquire multiple captured images of the end of the object by rotating the object around the rotation axis C1 using the support unit 40 and changing the Z position using at least one of the drive units 12 and 22.

 制御部35は、支持部40により対象物を回転軸C1の回りで回転させながら、駆動部により前記半径位置を前記回転軸に垂直な第1方向に変更させ、且つ、Z位置を第1位置で一定にさせて、取得部31に対象物の主面の複数の撮像画像を取得させてもよい。 The control unit 35 may rotate the object around the rotation axis C1 using the support unit 40, change the radial position in a first direction perpendicular to the rotation axis using the drive unit, and keep the Z position constant at the first position, causing the acquisition unit 31 to acquire multiple captured images of the main surface of the object.

 制御部35は、支持部40により対象物を回転軸C1の回りで回転させながら、駆動部により前記半径位置を前記回転軸に垂直な第2方向に変更させ、且つ、Z位置を第2位置で一定にさせて、取得部31に対象物の主面の複数の撮像画像を取得させてもよい。 The control unit 35 may rotate the object around the rotation axis C1 using the support unit 40, change the radial position in a second direction perpendicular to the rotation axis using the drive unit, and keep the Z position constant at the second position, causing the acquisition unit 31 to acquire multiple captured images of the main surface of the object.

　次に、ステップS35に示すように、画素位置を特定する。例えば、制御部35は、Z位置がそれぞれ異なり、且つ、対象物の回転角度θがそれぞれ異なる複数の撮像画像において、輝度が所定以上となる画素の位置である画素位置Pを特定部32に特定させる。本実施形態の画像処理方法は、ステップS35とステップS36との間に、特定した画素位置Pを、各撮像画像における半径位置r、Z位置、回転角度θ、光軸角度Φ及びサンプリング時刻とともに高輝度時情報として記憶部33に記憶させるステップを有してもよい。 Next, as shown in step S35, the pixel position is identified. For example, the control unit 35 causes the identification unit 32 to identify pixel position P, which is the position of a pixel where the brightness is equal to or greater than a predetermined value, in multiple captured images each having a different Z position and a different rotation angle θ of the object. The image processing method of this embodiment may include, between steps S35 and S36, a step of storing the identified pixel position P in the memory unit 33 as high-brightness information, together with the radial position r, Z position, rotation angle θ, optical axis angle Φ, and sampling time in each captured image.

 次に、ステップS36に示すように、イメージ情報を生成する。具体的には、制御部35は、輝度が所定以上となる画素に基づいて、対象物のイメージ情報を生成部34に生成させる。生成部34は、回転角度θが特定の回転角度θであってZ位置が互いに異なる複数の撮像画像において、それぞれ輝度が所定以上となる画素に基づいて、特定の回転角度θにおける対象物のイメージ情報を生成する。生成部34は、回転角度θが第1の回転角度θ1であってZ位置が互いに異なる複数の撮像画像において、それぞれ輝度が所定以上となる画素と、回転角度θが第2の回転角度θ2であってZ位置が互いに異なる複数の撮像画像において、それぞれ輝度が所定以上となる画素と、に基づいて、複数の特定の回転角度θにおける対象物のイメージ情報を生成してもよい。なお、制御部35は、記憶した複数の高輝度時情報に基づいて、対象物のイメージ情報を生成部34に生成させてもよい。 Next, as shown in step S36, image information is generated. Specifically, the control unit 35 causes the generation unit 34 to generate image information of the object based on pixels with a predetermined brightness or higher. The generation unit 34 generates image information of the object at a specific rotation angle θ based on pixels with a predetermined brightness or higher in multiple captured images at a specific rotation angle θ and at different Z positions. The generation unit 34 may generate image information of the object at multiple specific rotation angles θ based on pixels with a predetermined brightness or higher in multiple captured images at a first rotation angle θ1 and at different Z positions, and pixels with a predetermined brightness or higher in multiple captured images at a second rotation angle θ2 and at different Z positions. Note that the control unit 35 may cause the generation unit 34 to generate image information of the object based on multiple stored high-brightness information.

　以上、本発明の実施形態を説明したが、本発明はその目的と利点を損なうことのない適宜の変形を含み、更に、上記の実施形態による限定は受けない。 The above describes an embodiment of the present invention, but the present invention includes appropriate modifications that do not impair its objects and advantages, and is not limited to the above embodiment.

 (付記1)
 回転軸を有するステージで対象物を前記回転軸の周りで回転させながら、前記対象物の端部で反射した反射光を集光する対物レンズの光軸方向における前記対物レンズの位置であるZ位置を移動させることにより、前記対物レンズで集光された前記反射光から複数の前記Z位置における前記対象物の前記端部の検査画像を取得する取得部と、
 前記Z位置がそれぞれ異なり、且つ、前記対象物の回転角度がそれぞれ異なる複数の前記検査画像において、輝度が所定以上となる画素の位置である画素位置を特定する特定部と、
 特定した前記画素位置を、各検査画像における前記Z位置及び前記回転角度とともに高輝度時情報として記憶する記憶部と、
 記憶した複数の前記高輝度時情報に基づいて、前記対象物の前記端部のイメージ情報を生成する生成部と、
 を備えた画像処理装置。
 (付記2)
 前記記憶部は、前記高輝度時情報として、前記回転軸に直交した面に対する前記対物レンズの光軸の角度である光軸角度を前記画素位置とともに記憶する、
 付記1に記載の画像処理装置。
 (付記3)
 前記対象物は、板状物体を含み、
 前記記憶部は、複数の前記光軸角度における前記高輝度時情報を記憶し、
 前記生成部は、前記対象物に対する複数の前記光軸角度における前記高輝度時情報に基づいて、前記イメージ情報を生成する、
 付記2に記載の画像処理装置。
 (付記4)
 前記生成部は、前記高輝度時情報に基づいて、前記対象物の前記端部における全周の前記回転角度についての前記イメージ情報を生成する、
 付記1に記載の画像処理装置。
 (付記5)
 前記取得部は、
 前記反射光における第1の色に対応した第1波長の光を受光する第1受光部から第1検査画像を取得し、
 前記反射光における第2の色に対応した第2波長の光を受光する第2受光部から第2検査画像を取得し、
 前記特定部は、
 前記Z位置がそれぞれ異なる複数の前記第1検査画像において前記輝度が所定以上となる前記画素位置である第1波長画素位置を特定し、
 前記Z位置がそれぞれ異なる複数の前記第2検査画像において前記輝度が所定以上となる前記画素位置である第2波長画素位置を特定し、
 前記記憶部は、
 前記第1波長画素位置を、各第1検査画像における第1波長高輝度時情報として、前記Z位置及び前記回転角度とともに記憶し、
 前記第2波長画素位置を、各第2検査画像における第2波長高輝度時情報として、前記Z位置及び前記回転角度とともに記憶し、
 前記生成部は、
 前記第1波長高輝度時情報及び前記第2波長高輝度時情報に基づいて、複数の色で表された前記対象物の前記端部のカラーイメージを生成する、
 付記1に記載の画像処理装置。
 (付記6)
 前記取得部は、
 前記反射光における第3の色に対応した第3波長の光を受光する第3受光部から第3検査画像をさらに取得し、
 前記特定部は、
 前記Z位置がそれぞれ異なる複数の前記第3検査画像において前記輝度が所定以上となる前記画素位置である第3波長画素位置をさらに特定し、
 前記記憶部は、
 前記第3波長画素位置を、各第3検査画像における第3波長高輝度時情報として、前記Z位置及び前記回転角度とともにさらに記憶し、
 前記生成部は、
 前記第1波長高輝度時情報、前記第2波長高輝度時情報及び前記第3波長高輝度時情報に基づいて、複数の色で表された前記対象物の前記端部のカラーイメージを生成する、
 付記5に記載の画像処理装置。
 (付記7)
 前記対象物は、ウェーハを含み、
 前記端部は、前記ウェーハのエッジを含む、
 付記1に記載の画像処理装置。
 (付記8)
 前記イメージ情報に基づいて、前記対象物の検査を行う制御部を備える、
 付記1~7のいずれか1項に記載の画像処理装置。
 (付記9)
 回転軸を有するステージで対象物を前記回転軸の周りで回転させながら、前記対象物の端部で反射した反射光を集光する対物レンズの光軸方向における前記対物レンズの位置であるZ位置を移動させることにより、前記対物レンズで集光された前記反射光から複数の前記Z位置における前記対象物の前記端部の検査画像を取得部に取得させる第1ステップと、
 前記Z位置がそれぞれ異なり、且つ、前記対象物の回転角度がそれぞれ異なる複数の前記検査画像において、輝度が所定以上となる画素の位置である画素位置を特定部に特定させる第2ステップと、
 特定した前記画素位置を、各検査画像における前記Z位置及び前記回転角度とともに高輝度時情報として記憶部に記憶させる第3ステップと、
 記憶した複数の前記高輝度時情報に基づいて、前記対象物の前記端部のイメージ情報を生成部に生成させる第4ステップと、
 を備えた画像処理方法。
 (付記10)
 前記第3ステップにおいて、
 前記高輝度時情報として、前記回転軸に直交した面に対する前記対物レンズの光軸の角度である光軸角度を前記画素位置とともに記憶させる、
 付記9に記載の画像処理方法。
 (付記11)
 前記対象物は、板状物体を含み、
 前記第3ステップにおいて、
 複数の前記光軸角度における前記高輝度時情報を記憶させ、
 前記第4ステップにおいて、
 前記対象物に対する複数の前記光軸角度における前記高輝度時情報に基づいて、前記イメージ情報を生成させる、
 付記10に記載の画像処理方法。
 (付記12)
 前記第4ステップにおいて、
 前記高輝度時情報に基づいて、前記対象物の前記端部における全周の前記回転角度についての前記イメージ情報を生成させる、
 付記9に記載の画像処理方法。
 (付記13)
 前記第1ステップにおいて、
 前記反射光における第1の色に対応した第1波長の光を受光する第1受光部から第1検査画像を取得させ、
 前記反射光における第2の色に対応した第2波長の光を受光する第2受光部から第2検査画像を取得させ、
 前記第2ステップにおいて、
 前記Z位置がそれぞれ異なる複数の前記第1検査画像において前記輝度が所定以上となる前記画素位置である第1波長画素位置を特定させ、
 前記Z位置がそれぞれ異なる複数の前記第2検査画像において前記輝度が所定以上となる前記画素位置である第2波長画素位置を特定させ、
 前記第3ステップにおいて、
 前記第1波長画素位置を、各第1検査画像における第1波長高輝度時情報として、前記Z位置及び前記回転角度とともに記憶させ、
 前記第2波長画素位置を、各第2検査画像における第2波長高輝度時情報として、前記Z位置及び前記回転角度とともに記憶させ、
 前記第4ステップにおいて、
 前記第1波長高輝度時情報及び前記第2波長高輝度時情報に基づいて、複数の色で表された前記対象物の前記端部のカラーイメージを生成させる、
 付記9に記載の画像処理方法。
 (付記14)
 前記第1ステップにおいて、
 前記反射光における第3の色に対応した第3波長の光を受光する第3受光部から第3検査画像をさらに取得させ、
 前記第2ステップにおいて、
 前記Z位置がそれぞれ異なる複数の前記第3検査画像において前記輝度が所定以上となる前記画素位置である第3波長画素位置をさらに特定させ、
 前記第3ステップにおいて、
 前記第3波長画素位置を、各第3検査画像における第3波長高輝度時情報として、前記Z位置及び前記回転角度とともにさらに記憶させ、
 前記第4ステップにおいて、
 前記第1波長高輝度時情報、前記第2波長高輝度時情報及び前記第3波長高輝度時情報に基づいて、複数の色で表された前記対象物の前記端部のカラーイメージを生成させる、
 付記13に記載の画像処理方法。
 (付記15)
 前記対象物は、ウェーハを含み、
 前記端部は、前記ウェーハのエッジを含む、
 付記9に記載の画像処理方法。
 (付記16)
 前記イメージ情報に基づいて、前記対象物の検査を行うステップを備える、
 付記9~15のいずれか1項に記載の画像処理方法。
(Appendix 1)
an acquisition unit that acquires inspection images of the end of the object at a plurality of Z positions from the reflected light collected by the objective lens by rotating the object around the rotation axis on a stage having the rotation axis and moving a Z position, which is a position of the objective lens in the optical axis direction of the objective lens that collects the reflected light reflected by the end of the object;
an identifying unit that identifies pixel positions where the luminance is equal to or greater than a predetermined value in the plurality of inspection images each having a different Z position and a different rotation angle of the object;
a storage unit that stores the identified pixel position together with the Z position and the rotation angle in each inspection image as high-brightness information;
a generating unit that generates image information of the edge of the object based on the stored plurality of pieces of high-brightness information;
An image processing device comprising:
(Appendix 2)
the storage unit stores, as the high-brightness information, an optical axis angle, which is an angle of the optical axis of the objective lens with respect to a plane orthogonal to the rotation axis, together with the pixel position;
The image processing device according to Appendix 1.
(Appendix 3)
the target object includes a plate-like object,
the storage unit stores the high-brightness information at a plurality of the optical axis angles,
the generation unit generates the image information based on the high-brightness information at a plurality of the optical axis angles with respect to the object.
The image processing device according to Appendix 2.
(Appendix 4)
the generation unit generates the image information regarding the rotation angle of the entire circumference at the end of the object based on the high-brightness information.
The image processing device according to Appendix 1.
(Appendix 5)
The acquisition unit
acquiring a first inspection image from a first light receiving unit that receives light of a first wavelength corresponding to a first color in the reflected light;
acquiring a second inspection image from a second light receiving unit that receives light of a second wavelength corresponding to a second color in the reflected light;
The identification unit
identifying first-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the plurality of first inspection images each having a different Z position;
identifying second-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the second inspection images each having a different Z position;
The storage unit
storing the first-wavelength pixel position together with the Z position and the rotation angle as first-wavelength high-brightness information in each first inspection image;
storing the second-wavelength pixel positions together with the Z position and the rotation angle as second-wavelength high-brightness information in each second inspection image;
The generation unit
generating a color image of the edge of the object represented in a plurality of colors based on the first wavelength high brightness information and the second wavelength high brightness information;
The image processing device according to Appendix 1.
(Appendix 6)
The acquisition unit
further acquiring a third inspection image from a third light receiving unit that receives light of a third wavelength corresponding to a third color in the reflected light;
The identification unit
further identifying third-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the third inspection images each having a different Z position;
The storage unit
further storing the third-wavelength pixel position together with the Z position and the rotation angle as third-wavelength high-brightness information in each third inspection image;
The generation unit
generating a color image of the edge of the object represented in a plurality of colors based on the first wavelength high brightness information, the second wavelength high brightness information, and the third wavelength high brightness information;
The image processing device according to Appendix 5.
(Appendix 7)
the object includes a wafer;
the end portion includes an edge of the wafer;
The image processing device according to Appendix 1.
(Appendix 8)
a control unit that inspects the object based on the image information;
The image processing device according to any one of Appendices 1 to 7.
(Appendix 9)
a first step of rotating an object around a rotation axis on a stage having the rotation axis, while moving a Z position, which is a position of the objective lens in the optical axis direction of the objective lens that collects reflected light reflected at an end of the object, and having an acquisition unit acquire inspection images of the end of the object at a plurality of the Z positions from the reflected light collected by the objective lens;
a second step of causing an identifying unit to identify pixel positions where the luminance is equal to or greater than a predetermined value in the plurality of inspection images each having a different Z position and a different rotation angle of the object;
a third step of storing the identified pixel position together with the Z position and the rotation angle in each inspection image in a storage unit as high-brightness information;
a fourth step of causing a generation unit to generate image information of the edge of the object based on the stored plurality of pieces of high-brightness information;
An image processing method comprising:
(Appendix 10)
In the third step,
an optical axis angle, which is the angle of the optical axis of the objective lens with respect to a plane perpendicular to the rotation axis, is stored as the high-brightness information together with the pixel position;
The image processing method according to Appendix 9.
(Appendix 11)
the target object includes a plate-like object,
In the third step,
storing the high-brightness information for a plurality of the optical axis angles;
In the fourth step,
generating the image information based on the high-brightness information at a plurality of the optical axis angles with respect to the object;
The image processing method according to Appendix 10.
(Appendix 12)
In the fourth step,
generating the image information for the rotation angle of the entire circumference of the end portion of the object based on the high-brightness information;
The image processing method according to Appendix 9.
(Appendix 13)
In the first step,
acquiring a first inspection image from a first light receiving unit that receives light of a first wavelength corresponding to a first color in the reflected light;
acquiring a second inspection image from a second light receiving unit that receives light of a second wavelength corresponding to a second color in the reflected light;
In the second step,
identifying first-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the plurality of first inspection images each having a different Z position;
identifying second-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the second inspection images each having a different Z position;
In the third step,
storing the first wavelength pixel position together with the Z position and the rotation angle as first wavelength high brightness information in each first inspection image;
storing the second-wavelength pixel positions together with the Z position and the rotation angle as second-wavelength high-brightness information in each second inspection image;
In the fourth step,
generating a color image of the edge of the object represented in a plurality of colors based on the first wavelength high brightness information and the second wavelength high brightness information;
The image processing method according to Appendix 9.
(Appendix 14)
In the first step,
further acquiring a third inspection image from a third light receiving unit that receives light of a third wavelength corresponding to a third color in the reflected light;
In the second step,
further identifying third-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the third inspection images each having a different Z position;
In the third step,
further storing the third-wavelength pixel position together with the Z position and the rotation angle as third-wavelength high-brightness information in each third inspection image;
In the fourth step,
generating a color image of the edge of the object represented in a plurality of colors based on the first wavelength high brightness information, the second wavelength high brightness information, and the third wavelength high brightness information;
The image processing method according to Appendix 13.
(Appendix 15)
the object includes a wafer;
the end portion includes an edge of the wafer;
The image processing method according to Appendix 9.
(Appendix 16)
inspecting the object based on the image information;
The image processing method according to any one of Appendices 9 to 15.

 この出願は、2024年2月14日に出願された日本出願特願2024-020110を基礎とする優先権を主張し、その開示の全てをここに取り込む。 This application claims priority based on Japanese Patent Application No. 2024-020110, filed February 14, 2024, the disclosure of which is incorporated herein in its entirety.

1 検査装置
2 画像処理装置
10 ステージ
11 ステージ面
12 駆動部
13 センサ
20 撮像部
21 対物レンズ
22 駆動部
23 受光部
23a 第1受光部
23b 第2受光部
23c 第3受光部
24 光源
25 ピンホール
26 ビームスプリッタ
27、28 光学素子
29 センサ
30 画像処理装置
30a 画像処理部
31 取得部
32 特定部
33 記憶部
34 生成部
35 制御部
40 支持部
C1 回転軸
C2 光軸
L1 照明光
R1 反射光
WF ウェーハ
WF1 ウェーハ面
WFE エッジ
θ 回転角度
Φ 光軸角度
1 Inspection device
2 Image processing device
10 Stage
11 Stage surface
12 Drive unit
13 Sensor
20 Imaging unit
21 Objective lens
22 Drive unit
23 Light receiving unit
23a First light receiving unit
23b Second light receiving unit
23c Third light receiving unit
24 Light source
25 Pinhole
26 Beam splitter
27, 28 Optical element
29 Sensor
30 Image processing device
30a Image processing unit
31 Acquisition unit
32 Identification unit
33 Storage unit
34 Generation unit
35 Control unit
40 Support unit
C1 Rotation axis
C2 Optical axis
L1 Illumination light
R1 Reflected light
WF Wafer
WF1 Wafer surface
WFE Edge
θ Rotation angle
Φ Optical axis angle

Claims (16)

 対象物を回転軸の回りで回転可能に支持する支持部と、
 前記対象物で反射した反射光を集光する対物レンズと、
 前記対物レンズの光軸方向における、前記対象物と前記対物レンズの集光位置との相対位置であるZ位置を変更可能な駆動部と、
 前記対物レンズを介して前記対象物の撮像画像を取得する取得部と、
 前記支持部及び前記駆動部を制御し、前記取得部に複数の前記撮像画像を取得させる制御部と、
 前記Z位置がそれぞれ異なり、且つ、前記対象物の回転角度がそれぞれ異なる複数の前記撮像画像において、輝度が所定以上となる画素の位置である画素位置を特定する特定部と、
 特定した前記画素位置を、各撮像画像における前記Z位置及び前記回転角度とともに高輝度時情報として記憶する記憶部と、
 記憶した複数の前記高輝度時情報に基づいて、前記対象物のイメージ情報を生成する生成部と、
 を備えた画像処理装置。
a support portion that supports an object so that the object can rotate around a rotation axis;
an objective lens that collects light reflected by the object;
a drive unit capable of changing a Z position, which is a relative position between the object and a focusing position of the objective lens in an optical axis direction of the objective lens;
an acquisition unit that acquires a captured image of the object through the objective lens;
a control unit that controls the support unit and the drive unit and causes the acquisition unit to acquire the plurality of captured images;
an identifying unit that identifies pixel positions where the luminance is equal to or greater than a predetermined value in the plurality of captured images each having a different Z position and a different rotation angle of the object;
a storage unit that stores the identified pixel position together with the Z position and the rotation angle in each captured image as high-brightness information;
a generating unit that generates image information of the object based on the stored plurality of pieces of high-brightness information;
An image processing device comprising:
 前記制御部は、前記支持部により前記対象物を前記回転軸の回りで回転させながら、前記駆動部により前記Z位置を変更させることで、前記取得部に前記対象物の端部の複数の前記撮像画像を取得させる、
 請求項1に記載の画像処理装置。
the control unit causes the drive unit to change the Z position while rotating the object around the rotation axis using the support unit, thereby causing the acquisition unit to acquire the plurality of captured images of the end portion of the object;
The image processing device according to claim 1.
 前記駆動部は、前記回転軸に垂直な方向に前記対象物と前記対物レンズの集光位置との相対位置である半径位置を変更可能であり、
 前記制御部は、
 前記支持部により前記対象物を前記回転軸の回りで回転させながら、前記駆動部により前記半径位置を前記回転軸に垂直な第1方向に変更させ、且つ、前記Z位置を第1位置で一定にさせて、前記取得部に前記対象物の主面の複数の前記撮像画像を取得させる、
 請求項1に記載の画像処理装置。
the drive unit is capable of changing a radial position, which is a relative position between the object and a focusing position of the objective lens, in a direction perpendicular to the rotation axis,
The control unit
While rotating the object around the rotation axis by the support unit, changing the radial position in a first direction perpendicular to the rotation axis by the drive unit and keeping the Z position constant at the first position, and causing the acquisition unit to acquire the plurality of captured images of the main surface of the object.
The image processing device according to claim 1.
 前記制御部は、前記支持部により前記対象物を前記回転軸の回りで回転させながら、前記駆動部により前記半径位置を前記第1方向と反対方向の第2方向に変更させ、且つ、前記Z位置を第2位置で一定にさせて、前記取得部に前記対象物の前記主面の複数の前記撮像画像を取得させ、
 前記第2位置は、前記第1位置と異なり、
 前記第2位置と前記第1位置との差は、前記対物レンズを含む光学系の焦点深度以内である、
 請求項3に記載の画像処理装置。
the control unit causes the drive unit to change the radial position in a second direction opposite to the first direction and keeps the Z position constant at the second position while rotating the object around the rotation axis using the support unit, and causes the acquisition unit to acquire the plurality of captured images of the main surface of the object;
the second position is different from the first position;
a difference between the second position and the first position is within a focal depth of an optical system including the objective lens;
The image processing device according to claim 3.
 対象物を回転軸の回りで回転可能に支持する支持部と、
 前記対象物で反射した反射光を集光する対物レンズと、
 前記対物レンズの光軸方向における、前記対象物と前記対物レンズの集光位置との相対位置であるZ位置を変更可能な駆動部と、
 前記対物レンズを介して前記対象物の撮像画像を取得する取得部と、
 前記支持部及び前記駆動部を制御し、前記取得部に複数の前記撮像画像を取得させる制御部と、
 前記Z位置がそれぞれ異なり、且つ、前記対象物の回転角度がそれぞれ異なる複数の前記撮像画像において、輝度が所定以上となる画素の位置である画素位置を特定する特定部と、
 前記輝度が所定以上となる画素に基づいて、前記対象物のイメージ情報を生成する生成部と、
 を備え、
 前記生成部は、前記回転角度が特定の回転角度であって前記Z位置が互いに異なる複数の前記撮像画像において、それぞれ輝度が所定以上となる画素に基づいて、前記特定の回転角度における前記対象物のイメージ情報を生成する、
 画像処理装置。
a support portion that supports an object so that the object can rotate around a rotation axis;
an objective lens that collects light reflected by the object;
a drive unit capable of changing a Z position, which is a relative position between the object and a focusing position of the objective lens in an optical axis direction of the objective lens;
an acquisition unit that acquires a captured image of the object through the objective lens;
a control unit that controls the support unit and the drive unit and causes the acquisition unit to acquire the plurality of captured images;
an identifying unit that identifies pixel positions where the luminance is equal to or greater than a predetermined value in the plurality of captured images each having a different Z position and a different rotation angle of the object;
a generation unit that generates image information of the object based on the pixels having a luminance equal to or greater than a predetermined value;
Equipped with
the generation unit generates image information of the object at the specific rotation angle based on pixels having a luminance equal to or greater than a predetermined value in the captured images in which the rotation angle is a specific rotation angle and the Z positions are different from each other.
Image processing device.
 前記生成部は、
 前記回転角度が第1の回転角度であって前記Z位置が互いに異なる複数の前記撮像画像において、それぞれ輝度が所定以上となる画素と、
 前記回転角度が第2の回転角度であって前記Z位置が互いに異なる複数の前記撮像画像において、それぞれ輝度が所定以上となる画素と、
 に基づいて、複数の特定の前記回転角度における前記対象物の前記イメージ情報を生成する、
 請求項5に記載の画像処理装置。
The generation unit
a pixel having a luminance equal to or greater than a predetermined value in the plurality of captured images in which the rotation angle is a first rotation angle and the Z position is different from one another;
a pixel having a luminance equal to or greater than a predetermined value in the plurality of captured images in which the rotation angle is a second rotation angle and the Z position is different from one another;
generating the image information of the object at a plurality of specific rotation angles based on the
The image processing device according to claim 5.
 前記駆動部は、前記回転軸に直交した面に対する前記対物レンズの光軸の角度である光軸角度を変更可能であり、
 前記特定部は、前記Z位置、前記回転角度及び前記光軸角度がそれぞれ異なる複数の前記撮像画像において、輝度が所定以上となる画素の位置である画素位置を特定し、
 前記生成部は、
 前記回転角度が特定の回転角度であり、前記光軸角度が第1の光軸角度であって前記Z位置が互いに異なる複数の前記撮像画像において、それぞれ輝度が所定以上となる画素と、
 前記回転角度が特定の回転角度であり、前記光軸角度が第2の光軸角度であって前記Z位置が互いに異なる複数の前記撮像画像において、それぞれ輝度が所定以上となる画素と、
 に基づいて、前記特定の前記回転角度における前記対象物の端部の前記イメージ情報を生成する、
 請求項5に記載の画像処理装置。
the driving unit is capable of changing an optical axis angle, which is an angle of the optical axis of the objective lens with respect to a plane perpendicular to the rotation axis,
the specifying unit specifies pixel positions, which are positions of pixels having a luminance equal to or greater than a predetermined value, in the plurality of captured images each having a different Z position, a different rotation angle, and a different optical axis angle;
The generation unit
a pixel having a luminance equal to or greater than a predetermined value in the plurality of captured images in which the rotation angle is a specific rotation angle, the optical axis angle is a first optical axis angle, and the Z positions are different from one another;
a pixel having a luminance equal to or greater than a predetermined value in the plurality of captured images in which the rotation angle is a specific rotation angle, the optical axis angle is a second optical axis angle, and the Z positions are different from one another;
generating the image information of the edge of the object at the particular rotation angle based on
The image processing device according to claim 5.
 前記取得部は、
 前記反射光における第1の色に対応した第1波長の光を受光する第1受光部から第1撮像画像を取得し、
 前記反射光における第2の色に対応した第2波長の光を受光する第2受光部から第2撮像画像を取得し、
 前記特定部は、
 前記Z位置がそれぞれ異なる複数の前記第1撮像画像において前記輝度が所定以上となる前記画素位置である第1波長画素位置を特定し、
 前記Z位置がそれぞれ異なる複数の前記第2撮像画像において前記輝度が所定以上となる前記画素位置である第2波長画素位置を特定し、
 前記生成部は、
 前記第1撮像画像及び前記第2撮像画像において、それぞれ輝度が所定以上となる画素に基づいて、複数の色で表された前記対象物のカラーイメージを生成する、
 請求項1または5に記載の画像処理装置。
The acquisition unit
acquiring a first captured image from a first light receiving unit that receives light of a first wavelength corresponding to a first color in the reflected light;
acquiring a second captured image from a second light receiving unit that receives light of a second wavelength corresponding to a second color in the reflected light;
The identification unit
identifying first-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the plurality of first captured images each having a different Z position;
identifying second-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the second captured images each having a different Z position;
The generation unit
generating a color image of the object represented in a plurality of colors based on pixels in the first captured image and the second captured image that have a luminance equal to or greater than a predetermined value;
The image processing device according to claim 1 or 5.
 対象物を回転軸の回りで回転可能に支持部に支持させるステップと、
 前記対象物で反射した反射光を対物レンズで集光させるステップと、
 前記対物レンズの光軸方向における、前記対象物と前記対物レンズの集光位置との相対位置であるZ位置を駆動部に変更させるステップと、
 前記対物レンズを介して前記対象物の撮像画像を取得部に取得させるステップと、
 前記支持部及び前記駆動部を制御する制御部が、前記取得部に複数の前記撮像画像を取得させるステップと、
 前記Z位置がそれぞれ異なり、且つ、前記対象物の回転角度がそれぞれ異なる複数の前記撮像画像において、輝度が所定以上となる画素の位置である画素位置を特定部に特定させるステップと、
 特定した前記画素位置を、各撮像画像における前記Z位置及び前記回転角度とともに高輝度時情報として記憶部に記憶させるステップと、
 記憶した複数の前記高輝度時情報に基づいて、前記対象物のイメージ情報を生成部に生成させるステップと、
 を備えた画像処理方法。
A step of supporting an object on a support portion so as to be rotatable around a rotation axis;
A step of collecting the light reflected by the object with an objective lens;
a step of changing a Z position, which is a relative position between the object and a focusing position of the objective lens in an optical axis direction of the objective lens, with a driving unit;
causing an acquisition unit to acquire a captured image of the object through the objective lens;
a control unit that controls the support unit and the drive unit causing the acquisition unit to acquire a plurality of the captured images;
causing an identifying unit to identify pixel positions where the luminance is equal to or greater than a predetermined value in the plurality of captured images each having a different Z position and a different rotation angle of the object;
storing the identified pixel position together with the Z position and the rotation angle in each captured image as high-brightness information in a storage unit;
generating image information of the object based on the stored plurality of pieces of high-brightness information;
An image processing method comprising:
 前記撮像画像を前記取得部に取得させるステップにおいて、
 前記制御部は、前記支持部により前記対象物を前記回転軸の回りで回転させながら、前記駆動部により前記Z位置を変更させることで、前記取得部に前記対象物の端部の複数の前記撮像画像を取得させる、
 請求項9に記載の画像処理方法。
In the step of causing the acquisition unit to acquire the captured image,
the control unit causes the drive unit to change the Z position while rotating the object around the rotation axis using the support unit, thereby causing the acquisition unit to acquire the plurality of captured images of the end portion of the object;
The image processing method according to claim 9.
 前記支持部に支持させるステップにおいて、
 前記駆動部は、前記回転軸に垂直な方向に前記対象物と前記対物レンズの集光位置との相対位置である半径位置を移動させ、
 前記撮像画像を前記取得部に取得させるステップにおいて、
 前記制御部は、
 前記支持部により前記対象物を前記回転軸の回りで回転させながら、前記駆動部により前記半径位置を前記回転軸に垂直な第1方向に変更させ、且つ、前記Z位置を第1位置で一定にさせて、前記取得部に前記対象物の主面の複数の前記撮像画像を取得させる、
 請求項9に記載の画像処理方法。
In the step of supporting the support portion,
the driving unit moves a radial position, which is a relative position between the object and a focusing position of the objective lens, in a direction perpendicular to the rotation axis;
In the step of causing the acquisition unit to acquire the captured image,
The control unit
While rotating the object around the rotation axis by the support unit, changing the radial position in a first direction perpendicular to the rotation axis by the drive unit and keeping the Z position constant at the first position, and causing the acquisition unit to acquire the plurality of captured images of the main surface of the object.
The image processing method according to claim 9.
 前記撮像画像を前記取得部に取得させるステップにおいて、
 前記制御部は、前記支持部により前記対象物を前記回転軸の回りで回転させながら、前記駆動部により前記半径位置を前記第1方向と反対方向の第2方向に変更させ、且つ、前記Z位置を第2位置で一定にさせて、前記取得部に前記対象物の前記主面の複数の前記撮像画像を取得させ、
 前記第2位置は、前記第1位置と異なり、
 前記第2位置と前記第1位置との差は、前記対物レンズを含む光学系の焦点深度以内である、
 請求項11に記載の画像処理方法。
In the step of causing the acquisition unit to acquire the captured image,
the control unit causes the drive unit to change the radial position in a second direction opposite to the first direction and keeps the Z position constant at the second position while rotating the object around the rotation axis using the support unit, and causes the acquisition unit to acquire the plurality of captured images of the main surface of the object;
the second position is different from the first position;
a difference between the second position and the first position is within a focal depth of an optical system including the objective lens;
The image processing method according to claim 11.
 対象物を回転軸の回りで回転可能に支持部に支持させるステップと、
 前記対象物で反射した反射光を対物レンズで集光させるステップと、
 前記対物レンズの光軸方向における、前記対象物と前記対物レンズの集光位置との相対位置であるZ位置を駆動部に変更させるステップと、
 前記対物レンズを介して前記対象物の撮像画像を取得部に取得させるステップと、
 前記支持部及び前記駆動部を制御する制御部が、前記取得部に複数の前記撮像画像を取得させるステップと、
 前記Z位置がそれぞれ異なり、且つ、前記対象物の回転角度がそれぞれ異なる複数の前記撮像画像において、輝度が所定以上となる画素の位置である画素位置を特定部に特定させるステップと、
 前記輝度が所定以上となる画素に基づいて、前記対象物のイメージ情報を生成部に生成させるステップと、
 を備え、
 前記生成部に生成させるステップにおいて、
 前記生成部は、前記回転角度が特定の回転角度であって前記Z位置が互いに異なる複数の前記撮像画像において、それぞれ輝度が所定以上となる画素に基づいて、前記特定の回転角度における前記対象物のイメージ情報を生成する、
 画像処理方法。
A step of supporting an object on a support portion so as to be rotatable around a rotation axis;
A step of collecting the light reflected by the object with an objective lens;
a step of changing a Z position, which is a relative position between the object and a focusing position of the objective lens in an optical axis direction of the objective lens, with a driving unit;
causing an acquisition unit to acquire a captured image of the object through the objective lens;
a control unit that controls the support unit and the drive unit causing the acquisition unit to acquire a plurality of the captured images;
causing an identifying unit to identify pixel positions where the luminance is equal to or greater than a predetermined value in the plurality of captured images each having a different Z position and a different rotation angle of the object;
generating image information of the object based on the pixels having a luminance equal to or greater than a predetermined value;
Equipped with
In the step of causing the generation unit to generate
the generation unit generates image information of the object at the specific rotation angle based on pixels having a luminance equal to or higher than a predetermined value in the captured images in which the rotation angle is a specific rotation angle and the Z positions are different from each other.
Image processing methods.
 前記生成部に生成させるステップにおいて、
 前記生成部は、
 前記回転角度が第1の回転角度であって前記Z位置が互いに異なる複数の前記撮像画像において、それぞれ輝度が所定以上となる画素と、
 前記回転角度が第2の回転角度であって前記Z位置が互いに異なる複数の前記撮像画像において、それぞれ輝度が所定以上となる画素と、
 に基づいて、複数の特定の前記回転角度における前記対象物の前記イメージ情報を生成する、
 請求項13に記載の画像処理方法。
In the step of causing the generation unit to generate
The generation unit
a pixel having a luminance equal to or greater than a predetermined value in the plurality of captured images in which the rotation angle is a first rotation angle and the Z position is different from one another;
a pixel having a luminance equal to or greater than a predetermined value in the plurality of captured images in which the rotation angle is a second rotation angle and the Z position is different from one another;
generating the image information of the object at a plurality of specific rotation angles based on the
The image processing method according to claim 13.
 前記Z位置を前記駆動部に変更させるステップにおいて、
 前記駆動部は、前記回転軸に直交した面に対する前記対物レンズの光軸の角度である光軸角度を変更させ、
 前記画素位置を前記特定部に特定させるステップにおいて、
 前記特定部は、前記Z位置、前記回転角度及び前記光軸角度がそれぞれ異なる複数の前記撮像画像において、輝度が所定以上となる画素の位置である画素位置を特定し、
 前記生成部に生成させるステップにおいて、
 前記生成部は、
 前記回転角度が特定の回転角度であり、前記光軸角度が第1の光軸角度であって前記Z位置が互いに異なる複数の前記撮像画像において、それぞれ輝度が所定以上となる画素と、
 前記回転角度が特定の回転角度であり、前記光軸角度が第2の光軸角度であって前記Z位置が互いに異なる複数の前記撮像画像において、それぞれ輝度が所定以上となる画素と、
 に基づいて、前記特定の前記回転角度における前記対象物の端部の前記イメージ情報を生成する、
 請求項13に記載の画像処理方法。
In the step of changing the Z position to the drive unit,
the driving unit changes an optical axis angle, which is an angle of the optical axis of the objective lens with respect to a plane orthogonal to the rotation axis,
In the step of causing the specifying unit to specify the pixel position,
the specifying unit specifies pixel positions, which are positions of pixels having a luminance equal to or greater than a predetermined value, in the plurality of captured images each having a different Z position, a different rotation angle, and a different optical axis angle;
In the step of causing the generation unit to generate
The generation unit
a pixel having a luminance equal to or greater than a predetermined value in the plurality of captured images in which the rotation angle is a specific rotation angle, the optical axis angle is a first optical axis angle, and the Z positions are different from one another;
a pixel having a luminance equal to or greater than a predetermined value in the plurality of captured images in which the rotation angle is a specific rotation angle, the optical axis angle is a second optical axis angle, and the Z positions are different from one another;
generating the image information of the edge of the object at the particular rotation angle based on
The image processing method according to claim 13.
 前記撮像画像を取得させるステップにおいて、
 前記制御部は、前記取得部に、前記反射光における第1の色に対応した第1波長の光を受光する第1受光部から第1撮像画像を取得させ、
 前記反射光における第2の色に対応した第2波長の光を受光する第2受光部から第2撮像画像を取得させ、
 前記画素位置を前記特定部に特定させるステップにおいて、
 前記特定部は、前記Z位置がそれぞれ異なる複数の前記第1撮像画像において前記輝度が所定以上となる前記画素位置である第1波長画素位置を特定し、
 前記Z位置がそれぞれ異なる複数の前記第2撮像画像において前記輝度が所定以上となる前記画素位置である第2波長画素位置を特定し、
 前記生成部に生成させるステップにおいて、
 前記生成部は、前記第1撮像画像及び前記第2撮像画像において、それぞれ輝度が所定以上となる画素に基づいて、複数の色で表された前記対象物のカラーイメージを生成する、
 請求項9または13に記載の画像処理方法。
In the step of acquiring the captured image,
the control unit causes the acquisition unit to acquire a first captured image from a first light receiving unit that receives light of a first wavelength corresponding to a first color in the reflected light;
acquiring a second captured image from a second light receiving unit that receives light of a second wavelength corresponding to a second color in the reflected light;
In the step of causing the specifying unit to specify the pixel position,
the specifying unit specifies first-wavelength pixel positions, which are the pixel positions where the luminance is equal to or greater than the predetermined value, in the plurality of first captured images each having a different Z position; and
identifies second-wavelength pixel positions, which are the pixel positions where the luminance is equal to or greater than the predetermined value, in the plurality of second captured images each having a different Z position;
In the step of causing the generation unit to generate the image information,
the generation unit generates a color image of the object expressed in a plurality of colors based on pixels in the first captured images and the second captured images, each of which has a luminance equal to or greater than the predetermined value.
The image processing method according to claim 9 or 13.
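The two-wavelength color-image step can be sketched in the same style. Again this is only an illustration of the idea, not the claimed implementation: the max-over-Z projection per channel and the channel-stacking layout are assumptions made here, and a real device could combine the wavelength channels differently.

```python
import numpy as np

def color_image(stack_w1, stack_w2, threshold):
    # stack_w1, stack_w2: (num_z, H, W) Z-stacks from the first and
    # second light receiving units (one per wavelength/color).
    def channel(stack):
        # Per channel: maximum luminance over Z, zeroed where no
        # Z slice meets the threshold.
        proj = stack.max(axis=0)
        return np.where((stack >= threshold).any(axis=0), proj, 0)
    # Stack the two wavelength channels into an (H, W, 2) color image.
    return np.stack([channel(stack_w1), channel(stack_w2)], axis=-1)
```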
PCT/JP2025/002226 2024-02-14 2025-01-24 Image processing device and image processing method Pending WO2025173500A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2025506014A JP7748600B1 (en) 2024-02-14 2025-01-24 Image processing device and image processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2024020110 2024-02-14
JP2024-020110 2024-02-14

Publications (1)

Publication Number Publication Date
WO2025173500A1 true WO2025173500A1 (en) 2025-08-21

Family

ID=96772923

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2025/002226 Pending WO2025173500A1 (en) 2024-02-14 2025-01-24 Image processing device and image processing method

Country Status (3)

Country Link
JP (1) JP7748600B1 (en)
TW (1) TW202542483A (en)
WO (1) WO2025173500A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007251143A (en) * 2006-02-15 2007-09-27 Olympus Corp Visual inspection system
WO2009072483A1 (en) * 2007-12-05 2009-06-11 Nikon Corporation Monitoring apparatus and monitoring method
JP2010060532A (en) * 2008-09-08 2010-03-18 Raytex Corp Surface inspection device
JP2010519772A * 2007-02-23 2010-06-03 Rudolph Technologies, Inc. Wafer manufacturing monitoring system and method including an edge bead removal process
JP2013160687A * 2012-02-07 2013-08-19 Ohkura Industry Co Ltd Inspection device
JP2021110551A * 2020-01-06 2021-08-02 Toray Engineering Co., Ltd. Substrate edge inspection apparatus

Also Published As

Publication number Publication date
TW202542483A (en) 2025-11-01
JPWO2025173500A1 (en) 2025-08-21
JP7748600B1 (en) 2025-10-02

Similar Documents

Publication Publication Date Title
TWI551855B (en) System and method for detecting wafers and program storage device read by the system
EP1785714B1 (en) Lens evaluation device
TWI575625B (en) System and method for detecting wafer
JP5672240B2 (en) System and method for inspecting a wafer
US10142621B2 (en) Mass production MTF testing machine
JP2013011856A (en) Imaging system and control method thereof
US8810799B2 (en) Height-measuring method and height-measuring device
JP2009283633A (en) Surface inspection device, and surface inspection method
US10551174B2 (en) Calibration method of image measuring device
JP2008175818A (en) Surface inspection apparatus and method
JPWO2009125839A1 (en) Inspection device
KR102874281B1 (en) Display Pixel Automatic Inspection System and method
WO2014208193A1 (en) Wafer appearance inspection device
JP2006242821A (en) Optical panel imaging method, optical panel inspection method, optical panel imaging device, optical panel inspection device
JP2011145160A (en) Device and method for multi-focus inspection
JP7748600B1 (en) Image processing device and image processing method
JP4136635B2 (en) Analysis equipment
US20070133969A1 (en) System and method for measuring and setting the focus of a camera assembly
JP2010243212A (en) Inclination detection method and inclination detection apparatus
JP5191265B2 (en) Optical microscope apparatus and data processing apparatus for optical microscope
JP2005274156A (en) Flaw inspection device
NL2028376B1 (en) Method of and arrangement for verifying an alignment of an infinity-corrected objective.
JP5217093B2 (en) Inspection apparatus and inspection method
JP5073346B2 (en) Lens inspection method and lens inspection apparatus for image reading apparatus
JP2022047893A (en) Workpiece imaging device and workpiece imaging method

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2025506014

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2025506014

Country of ref document: JP

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25754827

Country of ref document: EP

Kind code of ref document: A1