
WO2025173500A1 - Image processing device and image processing method - Google Patents

Image processing device and image processing method

Info

Publication number
WO2025173500A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
image
rotation angle
objective lens
captured images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2025/002226
Other languages
English (en)
Japanese (ja)
Inventor
Yoshiharu Watanabe
Masafumi Shinoda
Daichi Yoshimoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lasertec Corp
Original Assignee
Lasertec Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lasertec Corp filed Critical Lasertec Corp
Priority to JP2025506014A priority Critical patent/JP7748600B1/ja
Publication of WO2025173500A1 publication Critical patent/WO2025173500A1/fr
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N21/956 Inspecting patterns on the surface of objects
    • H10P74/00

Definitions

  • This disclosure relates to an image processing device and an image processing method.
  • Patent Document 1 proposes a method for measuring the peripheral shape of a wafer edge.
  • the image processing device disclosed herein includes a support unit that supports an object rotatably around a rotation axis; an objective lens that focuses light reflected by the object; a drive unit that can change the Z position, which is the relative position between the object and the focusing position of the objective lens in the optical axis direction of the objective lens; an acquisition unit that acquires captured images of the object via the objective lens; a control unit that controls the support unit and the drive unit and causes the acquisition unit to acquire multiple captured images; an identification unit that identifies pixel positions where the brightness is equal to or greater than a predetermined value in multiple captured images that have different Z positions and different rotation angles of the object; a storage unit that stores the identified pixel positions together with the Z position and the rotation angle in each captured image as high-brightness information; and a generation unit that generates image information of the object based on the stored multiple pieces of high-brightness information.
  • the control unit causes the support unit to rotate the object around the rotation axis while causing the drive unit to change the radial position in a second direction opposite to the first direction and to keep the Z position constant at the second position, causing the acquisition unit to acquire multiple captured images of the main surface of the object; the second position is different from the first position, and the difference between the second position and the first position may be within the focal depth of an optical system including the objective lens.
  • the image processing device includes a support unit that supports an object rotatably around a rotation axis; an objective lens that focuses light reflected by the object; a drive unit that can change the Z position, which is the relative position between the object and the focusing position of the objective lens in the optical axis direction of the objective lens; an acquisition unit that acquires captured images of the object via the objective lens; a control unit that controls the support unit and the drive unit and causes the acquisition unit to acquire multiple captured images; an identification unit that identifies pixel positions where the brightness is equal to or greater than a predetermined value in multiple captured images that have different Z positions and different rotation angles of the object; and a generation unit that generates image information of the object based on the pixels where the brightness is equal to or greater than the predetermined value, and the generation unit may generate image information of the object at the specific rotation angle based on pixels where the brightness is equal to or greater than the predetermined value in multiple captured images that have different Z positions at the specific rotation angle.
  • the drive unit can change the optical axis angle, which is the angle of the optical axis of the objective lens with respect to a plane perpendicular to the rotation axis, and the identification unit identifies pixel positions where the brightness is greater than or equal to a predetermined value in the plurality of captured images where the Z position, rotation angle, and optical axis angle are different, and the generation unit may generate the image information of the edge of the object at the specific rotation angle based on the pixels where the brightness is greater than or equal to the predetermined value in the plurality of captured images where the rotation angle is the specific rotation angle, the optical axis angle is a first optical axis angle, and the Z positions are different, and the pixels where the brightness is greater than or equal to the predetermined value in the plurality of captured images where the rotation angle is the specific rotation angle, the optical axis angle is a second optical axis angle, and the Z positions are different.
  • the optical axis angle, which is the angle of the optical axis of the objective lens with respect to a plane perpendicular to the rotation axis
  • the image processing method disclosed herein comprises the steps of: supporting an object on a support unit so that it can rotate around a rotation axis; focusing light reflected by the object with an objective lens; changing the Z position, which is the relative position between the object and the focusing position of the objective lens in the optical axis direction of the objective lens, with a drive unit; causing an acquisition unit to acquire an image of the object via the objective lens; causing a control unit that controls the support unit and the drive unit to make the acquisition unit acquire a plurality of the captured images; causing an identification unit to identify pixel positions, which are positions of pixels where the brightness is equal to or greater than a predetermined value, in a plurality of captured images that have different Z positions and different rotation angles of the object; storing the identified pixel positions, along with the Z positions and rotation angles in each captured image, in a storage unit as high-brightness information; and causing a generation unit to generate image information of the object based on the stored plurality of pieces of high-brightness information.
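The sequence of method steps above can be sketched as a minimal pipeline. This is an illustrative reading, not the patented implementation; the array shapes, the threshold value, and the mean-reduction used for the generation step are assumptions:

```python
import numpy as np

def process(stack, z_positions, rotation_angles, threshold=128):
    """Toy sketch of the claimed method: threshold each captured image,
    store (pixel, Z, rotation angle) records, then reduce them to a profile.

    stack: brightness arrays of shape (n_images, height, width), one image
           per (Z position, rotation angle) pair.
    """
    records = []  # the "high-brightness information" kept by the storage unit
    for img, z, theta in zip(stack, z_positions, rotation_angles):
        rows, cols = np.nonzero(img >= threshold)     # identification step
        records.extend((r, c, z, theta) for r, c in zip(rows, cols))
    # generation step: for each rotation angle, average the Z of bright pixels
    profile = {}
    for _, _, z, theta in records:
        profile.setdefault(theta, []).append(z)
    return {theta: float(np.mean(zs)) for theta, zs in profile.items()}
```

Each record here mimics one piece of "high-brightness information"; the actual generation unit may apply other statistics or interpolation.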
  • the drive unit moves the object in a direction perpendicular to the rotation axis
  • the control unit may cause the support unit to rotate the object around the rotation axis while causing the drive unit to change the radial position in a first direction perpendicular to the rotation axis and to keep the Z position constant at the first position, causing the acquisition unit to acquire multiple captured images of the main surface of the object.
  • the image processing method disclosed herein includes the steps of supporting an object on a support unit so that it can rotate around a rotation axis; focusing light reflected by the object with an objective lens; changing the Z position, which is the relative position between the object and the focusing position of the objective lens in the optical axis direction of the objective lens, with a drive unit; causing an acquisition unit to acquire an image of the object via the objective lens; a control unit that controls the support unit and the drive unit causing the acquisition unit to acquire multiple captured images; causing an identification unit to identify pixel positions where the brightness is equal to or greater than a predetermined value in multiple captured images that have different Z positions and different rotation angles of the object; and causing a generation unit to generate image information of the object based on the pixels where the brightness is equal to or greater than the predetermined value.
  • in the step of causing the generation unit to generate image information, the generation unit may generate image information of the object at the specific rotation angle based on pixels where the brightness is equal to or greater than the predetermined value in multiple captured images that have different Z positions at the specific rotation angle.
  • the generation unit may generate the image information of the object at a plurality of specific rotation angles based on pixels whose brightness is equal to or greater than a predetermined value in a plurality of captured images whose rotation angle is a first rotation angle and whose Z positions are different from one another, and pixels whose brightness is equal to or greater than a predetermined value in a plurality of captured images whose rotation angle is a second rotation angle and whose Z positions are different from one another.
  • the identifying unit identifies pixel positions, which are positions of pixels where the brightness is equal to or greater than a predetermined value, in a plurality of the captured images, each of which has a different Z position, rotation angle, and optical axis angle, and causes the generating unit to generate the image information
  • the generating unit may generate the image information of the edge of the object at the specific rotation angle based on pixels where the brightness is equal to or greater than a predetermined value in a plurality of the captured images, each of which has a rotation angle that is the specific rotation angle, a first optical axis angle, and a different Z position, and pixels where the brightness is equal to or greater than a predetermined value in a plurality of the captured images, each of which has the specific rotation angle, a second optical axis angle, and a different Z position.
  • in the step of causing the identification unit to identify the pixel positions, the identification unit identifies first-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the plurality of first captured images having different Z positions, and identifies second-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the plurality of second captured images having different Z positions.
  • the generation unit may generate a color image of the edge of the object represented in a plurality of colors based on the first-wavelength high-luminance information and the second-wavelength high-luminance information.
  • This disclosure provides an image processing device and image processing method that can measure information about an object quickly and accurately.
  • FIG. 1 is a configuration diagram illustrating an inspection device according to a first embodiment.
  • FIG. 2 is a plan view illustrating a stage in the inspection device according to the first embodiment.
  • FIG. 3 is a configuration diagram illustrating an imaging unit in the inspection device according to the first embodiment.
  • FIG. 4 is a block diagram illustrating an image processing apparatus according to the first embodiment.
  • FIG. 5 is a diagram illustrating an example of an inspection image acquired by the acquisition unit in the image processing apparatus according to the first embodiment, where the horizontal axis indicates the rotation angle and the vertical axis indicates the pixels capturing an image of the edge of the wafer.
  • FIG. 6 is a diagram illustrating an example of a profile of a wafer edge generated by the generating unit in the image processing apparatus according to the first embodiment.
  • FIG. 10 is a diagram illustrating a profile of a wafer surface of a wafer generated by a generating unit in an image processing apparatus according to a second embodiment.
  • FIG. 10 is a flowchart illustrating an image processing method using the image processing device according to the second embodiment.
  • Fig. 1 is a configuration diagram illustrating an inspection device 1 according to embodiment 1.
  • Fig. 2 is a plan view illustrating a stage 10 in the inspection device 1 according to embodiment 1.
  • Fig. 3 is a configuration diagram illustrating an imaging unit 20 in the inspection device 1 according to embodiment 1.
  • the wafer WF is placed on the stage 10.
  • the stage 10 has a stage surface 11.
  • the wafer WF is placed flat on the stage surface 11.
  • the back surface of the wafer WF contacts the stage surface 11.
  • the stage 10 may have a predetermined set position on the stage surface 11 where the wafer WF is set. For example, when inspecting the wafer WF, the wafer WF may first be fixed at the set position.
  • the wafer WF has a wafer surface WF1.
  • an XYZ Cartesian coordinate system is introduced.
  • the plane parallel to the stage surface 11 is defined as the XY plane.
  • the direction perpendicular to the stage surface 11 is defined as the Z-axis direction.
  • the stage 10 has, for example, a rotation axis C1.
  • the rotation axis C1 extends, for example, in the Z-axis direction.
  • the rotation axis C1 passes through the wafer WF placed on the stage surface 11. Therefore, the stage 10 rotates the wafer WF around the rotation axis C1.
  • the stage 10 may be connected to a drive unit 12 such as a motor.
  • the drive unit 12 rotates the stage 10 around the rotation axis C1.
  • the rotation angle from a predetermined set position is called θ.
  • the imaging unit 20 captures an inspection image of the edge WFE of the wafer WF.
  • the imaging unit 20 includes, for example, an objective lens 21, a drive unit 22, and a light receiving unit 23.
  • the imaging unit 20 may further include a light source 24, a pinhole 25, a beam splitter 26, an optical element 27, an optical element 28, and a sensor 29. Note that, as long as the imaging unit 20 can capture an inspection image of the edge WFE of the wafer WF, any of the above optical elements may be replaced with other optical elements, or other optical elements may be included in addition to the above optical elements.
  • the imaging unit 20 captures an inspection image of the edge WFE from reflected light R1 reflected by the edge WFE of the wafer WF.
  • the drive unit 22 changes the relative position between the object and the focusing position of the objective lens 21 in the optical axis direction of the objective lens 21.
  • the relative position between the object and the focusing position of the objective lens 21 in the optical axis direction of the objective lens 21 is called the Z position.
  • the drive unit 22 moves the position of the objective lens 21 in the optical axis direction.
  • the drive unit 22 may change the relative position between the object and the focusing position of the objective lens 21 in the optical axis direction of the objective lens 21 by changing the shape, position, or attitude of the optical element, or by changing the focusing distance of the objective lens 21.
  • the light receiving unit 23 receives the reflected light R1 focused by the objective lens 21. As a result, the imaging unit 20 captures an inspection image from the reflected light R1 reflected by the edge WFE of the wafer WF.
  • the light receiving unit 23 may include a first light receiving unit 23a and a second light receiving unit 23b. Furthermore, the light receiving unit 23 may include a third light receiving unit 23c in addition to the first light receiving unit 23a and the second light receiving unit 23b.
  • the imaging unit 20 may have a confocal optical system, so it can capture an image in which the edge WFE of the wafer WF is in focus. In this embodiment, however, the imaging unit 20 may move the Z position of the objective lens 21 during imaging without regard to focus. The brightness of the reflected light R1 increases at positions of the edge WFE that are in focus, which makes it possible to identify the in-focus Z position. After imaging, data such as the in-focus Z position, brightness information, rotation angle θ, optical axis angle φ, and sampling time of each image can therefore be referenced, and a profile of the edge WFE of the wafer WF can be formed from that data. In this way, the imaging unit 20 can perform three-dimensional measurement.
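The focus-from-brightness idea above (the in-focus Z position is where reflected brightness peaks) can be sketched as follows; the array layout and the threshold are illustrative assumptions, not details from the patent:

```python
import numpy as np

def in_focus_z(stack, z_positions, threshold):
    """For each pixel, pick the Z position whose brightness is maximal,
    provided that brightness meets the threshold; NaN otherwise.

    stack: array of shape (n_z, height, width) of brightness values.
    z_positions: array of shape (n_z,) giving the Z position of each slice.
    """
    stack = np.asarray(stack, dtype=float)
    z_positions = np.asarray(z_positions, dtype=float)
    best = np.argmax(stack, axis=0)      # index of the brightest slice per pixel
    peak = np.max(stack, axis=0)         # that peak brightness
    z_map = z_positions[best]
    z_map[peak < threshold] = np.nan     # pixel never came into focus: no height
    return z_map
```

The resulting `z_map` is effectively a height map of the object, obtained without any autofocus mechanism.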
  • the imaging unit 20 is connected to the image processing device 30 via a communication line that is wireless, wired, or both, in a state where it can transmit information including captured image data, Z position, rotation angle θ, optical axis angle φ, and sampling time. The imaging unit 20 outputs this information to the image processing device 30.
  • the acquisition unit 31 acquires inspection images of the edge WFE of the wafer WF at multiple Z positions from the reflected light R1 collected by the objective lens 21 by rotating the wafer WF around the rotation axis C1 on the stage 10 and moving the Z position.
  • the inspection image includes pixels of the edge WFE corresponding to the rotation angle θ at each Z position.
  • FIG. 5 is a diagram illustrating an inspection image acquired by the acquisition unit 31 in the image processing device 30 according to embodiment 1, where the horizontal axis indicates the rotation angle θ and the vertical axis indicates the pixels capturing the edge WFE of the wafer WF.
  • the acquisition unit 31 acquires multiple inspection images each with a different Z position.
  • each inspection image associates the rotation angle θ of the edge WFE of the wafer WF with the pixels capturing the edge WFE at that rotation angle.
  • the rotation angle θ in the inspection image may cover a range of 0° to 360°, corresponding to the entire circumference of the edge WFE of the wafer WF, or a predetermined partial range.
  • the acquisition unit 31 may acquire inspection images while the optical axis angle φ is changed.
  • the optical axis angle φ of the inspection image may cover a range from -90° to +90°.
  • the acquisition unit 31 may acquire the inspection image from the first light receiving unit 23a, which receives light of the first wavelength.
  • the inspection image acquired from the first light receiving unit 23a is called the first inspection image. Therefore, in this case, the acquisition unit 31 acquires the first inspection image.
  • the acquisition unit 31 may acquire a second inspection image and a third inspection image similar to those in FIG. 5.
  • the inspection image acquired from the second light receiving unit 23b that receives light of the second wavelength in the reflected light R1 is called the second inspection image
  • the inspection image acquired from the third light receiving unit 23c that receives light of the third wavelength in the reflected light R1 is called the third inspection image.
  • the first inspection image, second inspection image, and third inspection image may each include multiple inspection images with different Z positions.
  • the identification unit 32 identifies the positions of pixels in the inspection image where the brightness is above a predetermined level.
  • the positions of pixels where the brightness is above a predetermined level are called pixel positions P. Note that some reference numerals have been omitted in Figure 5 to avoid cluttering the illustration.
  • the inspection image includes multiple inspection images with different Z positions.
  • the identification unit 32 may also identify pixel positions P where the brightness is above a predetermined level in multiple inspection images with different optical axis angles φ.
  • the above is an example of a process for identifying pixel positions P where the brightness is above a predetermined level in multiple inspection images with different Z positions and different object rotation angles θ.
  • based on multiple inspection images for each of multiple rotation angles θ, each associating Z positions with the pixels capturing the edge at those Z positions, the identifying unit 32 may identify pixel positions P where the luminance is equal to or greater than a predetermined value in multiple inspection images that have different Z positions and different rotation angles θ of the object.
  • the identifying unit 32 may identify pixel positions P where the quality of an evaluation parameter for a pixel is equal to or greater than a predetermined value.
  • a luminance equal to or greater than a predetermined value is an example of the quality of an evaluation parameter for a pixel being equal to or greater than a predetermined value.
  • the evaluation parameters for a pixel may include, in addition to luminance, parameters obtained by normalizing luminance, and the like. Furthermore, identifying pixel positions where the luminance or the quality of a pixel evaluation parameter is equal to or greater than a predetermined value may also include identifying pixel positions where it is less than the predetermined value and excluding those pixels from processing.
  • Whether the quality of the brightness or pixel evaluation parameters is above a predetermined level may be determined by using the brightness or pixel evaluation parameters when an image is formed in a state where the object is focused to a predetermined degree as a threshold.
  • a state where the object is focused to a predetermined degree may mean just focus, or may include a state including an amount of defocus that is acceptable for the design.
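A minimal sketch of this identification rule, assuming a normalized-luminance evaluation parameter and a threshold derived from the brightness of the in-focus state (both `reference_peak` and `fraction` are illustrative names, not from the patent):

```python
import numpy as np

def identify_pixel_positions(image, reference_peak, fraction=0.5):
    """Sketch of the identification step: a pixel qualifies when its
    luminance, normalized by the in-focus peak luminance, reaches the
    threshold; below-threshold pixels are excluded from further processing.
    """
    image = np.asarray(image, dtype=float)
    quality = image / reference_peak   # normalized-luminance evaluation parameter
    keep = quality >= fraction         # threshold tied to the in-focus brightness
    rows, cols = np.nonzero(keep)
    return list(zip(rows.tolist(), cols.tolist()))
```

Using a normalized parameter lets the same `fraction` threshold tolerate a design-acceptable amount of defocus, as the passage above allows.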
  • the identification unit 32 may identify a first-wavelength pixel position P1 in a plurality of first inspection images acquired by the acquisition unit 31, each having a different Z position.
  • the identification unit 32 may identify a second-wavelength pixel position in a plurality of second inspection images acquired by the acquisition unit 31, each having a different Z position, or may identify a third-wavelength pixel position in a plurality of third inspection images, each having a different Z position.
  • a pixel position in the first inspection image where the brightness is greater than or equal to a predetermined value is referred to as a first-wavelength pixel position P1
  • a pixel position in the second inspection image where the brightness is greater than or equal to a predetermined value is referred to as a second-wavelength pixel position P2 (not shown)
  • a pixel position in the third inspection image where the brightness is greater than or equal to a predetermined value is referred to as a third-wavelength pixel position P3 (not shown).
  • the storage unit 33 stores the identified pixel position P together with the Z position, rotation angle θ, and sampling time in each inspection image as high-brightness information.
  • the storage unit 33 may also store the optical axis angle φ together with the pixel position P as high-brightness information.
  • the storage unit 33 may also store high-brightness information for multiple optical axis angles φ.
  • the storage unit 33 may store the first-wavelength pixel position P1 together with the Z position, rotation angle θ, and sampling time as first-wavelength high-brightness information in each first inspection image. Likewise, the storage unit 33 may store the second-wavelength pixel position P2 together with the Z position, rotation angle θ, and sampling time as second-wavelength high-brightness information in each second inspection image, and the third-wavelength pixel position P3 together with the Z position, rotation angle θ, and sampling time as third-wavelength high-brightness information in each third inspection image.
  • Generating image information based on high-brightness information by the generation unit 34 may also include the generation unit 34 obtaining statistical values such as the average and standard deviation based on the brightness at multiple pixel positions P indicated by the multiple pieces of high-brightness information, or obtaining interpolated values based on the brightness at multiple pixel positions P, and outputting this as brightness level information or height information converted into physical height.
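The statistics-based generation described above might look like the following sketch; the record layout, the brightness-weighted mean, and the Z-step-to-micrometre scale are assumptions made for illustration:

```python
import numpy as np

def generate_height(records, scale_um_per_step=1.0):
    """Sketch of the generation step: reduce stored high-brightness records
    to per-pixel statistics and a physical height.

    records: list of (pixel, z, brightness) tuples for one rotation angle.
    """
    by_pixel = {}
    for pixel, z, brightness in records:
        by_pixel.setdefault(pixel, []).append((z, brightness))
    out = {}
    for pixel, samples in by_pixel.items():
        zs = np.array([z for z, _ in samples], dtype=float)
        bs = np.array([b for _, b in samples], dtype=float)
        z_est = float(np.average(zs, weights=bs))    # brightness-weighted mean Z
        out[pixel] = {
            "height_um": z_est * scale_um_per_step,  # convert to physical height
            "mean_brightness": float(bs.mean()),     # brightness level statistics
            "std_brightness": float(bs.std()),
        }
    return out
```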
  • FIG. 6 is a diagram illustrating a profile of the edge WFE of the wafer WF generated by the generation unit 34 in the image processing device 30 according to the first embodiment.
  • FIG. 6 also shows a case where the optical axis angle φ takes multiple values φ1 to φ5. Note that an optical axis angle φ of -90° to +90° is one example; another example is dividing the -90° to +90° range into φ1 to φ5.
  • the generation unit 34 may generate a profile of the cross section of the edge WFE as image information.
  • the generation unit 34 may generate a color image of the edge WFE of the wafer WF expressed in multiple colors based on the first wavelength high brightness information and the second wavelength high brightness information.
  • the generation unit 34 may also generate a color image of the edge WFE of the wafer WF expressed in multiple colors based on the first wavelength high brightness information, the second wavelength high brightness information, and the third wavelength high brightness information.
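Combining the per-wavelength high-brightness maps into a single color image can be sketched as below; the channel order and the normalization are illustrative assumptions:

```python
import numpy as np

def compose_color(first_wl, second_wl, third_wl):
    """Sketch: build an RGB image from per-wavelength in-focus brightness maps,
    each taken at its own best-focus Z so that chromatic aberration does not
    mix defocused channels.

    Each argument: 2-D array of peak brightness per pixel for one wavelength.
    """
    channels = [np.asarray(c, dtype=float) for c in (first_wl, second_wl, third_wl)]
    rgb = np.stack(channels, axis=-1)   # (height, width, 3)
    rgb /= max(rgb.max(), 1e-12)        # normalize to [0, 1]
    return (rgb * 255).astype(np.uint8)
```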
  • the generation unit 34 may generate a reference image of the edge WFE of the wafer WF based on the image information.
  • the reference image is an image of the edge WFE of an ideal wafer WF (a wafer WF that can be said to be defect-free) used to inspect the edge WFE of the wafer WF for defects, etc.
  • the control unit 35 controls the operation of the acquisition unit 31, identification unit 32, storage unit 33, and generation unit 34 in the image processing device 30.
  • the control unit 35 may also control the operation of the drive unit 12 of the stage 10, the drive unit 22 of the imaging unit 20, the light receiving unit 23, and the light source 24 in the inspection device 1.
  • the control unit 35 may inspect the wafer WF for defects, etc. based on the image information.
  • the inspection target wafer WF may be inspected by comparing image information of an ideal wafer WF with image information of the inspection target wafer WF.
  • the inspection target wafer WF may be inspected using a profile, color image, or image of the edge WFE generated based on the image information.
  • the identification unit 32 is caused to identify pixel position P.
  • the control unit 35 causes the identification unit 32 to identify pixel position P where the brightness is equal to or greater than a predetermined value in a plurality of inspection images each having a different Z position.
  • the control unit 35 may cause the identification unit 32 to identify first-wavelength pixel position P1 in a plurality of first inspection images, to identify second-wavelength pixel position P2 in a plurality of second inspection images, and to identify third-wavelength pixel position P3 in a plurality of third inspection images.
  • the control unit 35 may cause the identification unit to identify pixel positions where the brightness is equal to or greater than a predetermined value in a plurality of inspection images each having a different Z position and a different rotation angle θ of the object.
  • the control unit 35 stores the identified pixel position P in the storage unit 33 as high-brightness information, along with the Z position, rotation angle θ, and sampling time in each inspection image.
  • the control unit 35 may store the optical axis angle φ together with the pixel position P in the storage unit 33 as high-brightness information.
  • the control unit 35 may also store high-brightness information for multiple optical axis angles φ in the storage unit 33.
  • the control unit 35 causes the generation unit 34 to generate image information regarding the edge WFE of the wafer WF based on the stored multiple pieces of high-brightness information.
  • the control unit 35 may cause the generation unit 34 to generate image information based on the high-brightness information at multiple optical axis angles φ relative to the wafer WF.
  • the control unit 35 may also cause the generation unit 34 to generate image information over the rotation angle θ of the entire circumference of the edge WFE of the wafer WF based on the high-brightness information.
  • the control unit 35 may cause the generation unit 34 to generate a color image of the edge WFE of the wafer WF displayed in multiple colors based on the first-wavelength high-brightness information, second-wavelength high-brightness information, and third-wavelength high-brightness information.
  • Fig. 8 is a flow chart illustrating an inspection method using the inspection device 1 according to the first embodiment.
  • in step S22, the reflected light R1 reflected by the edge WFE of the wafer WF is focused by the objective lens 21.
  • the control unit 35 drives the drive unit 22 to control the position of the objective lens 21 so that the objective lens 21 focuses the reflected light R1.
  • in step S23, the Z position is moved by the drive unit 22.
  • the control unit 35 drives the drive unit 22 to move the Z position in the optical axis direction.
  • in step S24, the imaging unit 20 captures an inspection image.
  • the control unit 35 causes the light receiving unit 23 to receive reflected light and capture the inspection image.
  • in step S25, the inspection image is subjected to image processing by the image processing device 30.
  • the image processing method using the image processing device 30 is as described above. Note that a step of inspecting the object based on the image information may also be included.
  • the image processing device 30 of this embodiment acquires an inspection image of the edge WFE by changing the Z position of the objective lens 21, which collects reflected light R1 from the edge WFE of the wafer WF, while rotating the wafer WF around the rotation axis C1.
  • the image processing device 30 then identifies pixel positions P from the inspection image where the brightness is above a predetermined level, and generates image information of the edge WFE based on these pixel positions P.
  • This allows the image processing device 30 to obtain profiles of the edge WFE for multiple rotation angles θ. Specifically, it is possible to obtain profile information of the edge WFE around the entire periphery of the wafer WF. Therefore, the image processing device 30 can measure information about objects such as the wafer WF quickly and accurately.
  • the image processing device 30 of this embodiment stores data on multiple inspection images acquired by the acquisition unit 31 in the memory unit 33. Therefore, if there is a singularity in the profile of the edge WFE due to a measurement error or the like, corrections such as interpolation can be made to the singularity based on the data surrounding the singularity stored in the memory unit 33, thereby improving the accuracy of the profile of the edge WFE of the wafer WF.
  • the image processing device 30 of this embodiment scans the objective lens 21 in the optical axis direction regardless of the focal length of the objective lens 21. This makes it possible to obtain an image in which the object is in focus. This makes an autofocus function unnecessary.
  • In related devices that capture color images, light receiving elements corresponding to multiple colors are used to obtain the color image. In this case, each light receiving element must be in focus at the same time. If the light receiving elements are not in focus at the same time, chromatic aberration will cause the colors to be inaccurate when a color image is generated.
  • the image processing device 30 of this embodiment generates a color image based on the focused first wavelength high brightness information, second wavelength high brightness information, and third wavelength high brightness information at each light receiving unit 23 (first light receiving unit 23a, second light receiving unit 23b, and third light receiving unit 23c) while scanning the objective lens 21 in the optical axis direction.
  • the first inspection image, second inspection image, and third inspection image are combined to generate a color image. This makes it possible to generate a color image quickly and with high accuracy.
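  • A minimal sketch of this per-channel combination, assuming each wavelength's Z-stack is available as a numpy array (all names are hypothetical):

```python
import numpy as np

def combine_color(stack_r, stack_g, stack_b):
    """Combine three per-wavelength Z-stacks, each of shape (n_z, h, w),
    into one color image: each channel keeps, per pixel, the brightness
    at its own best-focus Z (the maximum over the stack), so every
    channel is in focus independently."""
    channels = [stack.max(axis=0) for stack in (stack_r, stack_g, stack_b)]
    return np.stack(channels, axis=-1)  # shape (h, w, 3)
```

Because each channel selects its own best-focus sample, chromatic aberration between the channels does not corrupt the combined color.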
  • the image processing device 30 and inspection device 1 of this embodiment use a device equipped with a confocal optical microscope, and can simultaneously generate high-resolution color images and cross-sectional profiles of the wafer WF.
  • the inspection device 1 may be configured to include a confocal optical imaging unit 20, a stage 10 having a rotation axis C1, and multiple light receiving units 23 that detect different wavelengths. While rotating the stage 10 at high speed, scanning is performed by moving the objective lens 21 in the direction of the optical axis C2 relative to the focus position.
  • the image processing device and inspection device 1 of this embodiment can be expected to have the following effects.
  • the first is the cost reduction effect. Because the wafer WF profile information and the inspection image used for defect inspection are captured simultaneously, the number of devices used can be reduced, resulting in reduced labor and costs.
  • the second is to obtain a profile of unevenness that is larger than the focal length.
  • the image processing device 30 of this embodiment can obtain focused images even for wafers WF with large uneven shapes.
  • Recent semiconductor devices tend to be formed three-dimensionally, and many wafers WF have surface unevenness of several tens of μm or more. For this reason, it is difficult to obtain an image in focus over the entire surface with the focal length of a typical confocal optical microscope.
  • the third is the reduction of chromatic aberration in color images.
  • brightness increases where the image is in focus and decreases where it is not. For this reason, if there is a discrepancy in the timing at which multiple cameras achieve focus, the colors will not be displayed correctly.
  • an image in which each light receiving unit 23 is in focus can be obtained by scanning along the optical axis. Therefore, by combining the images of each color later, a color image in which all colors are in focus can be generated.
  • Next, an image processing device according to a second embodiment will be described. In the following, descriptions of configurations that are the same as in the first embodiment may be omitted.
  • the image processing device of this embodiment may be called an inspection device.
  • the inspection device 1 of the first embodiment described above may be called an image processing device.
  • the image processing device 30 will be called an image processing unit 30a.
  • the <Image processing device> and <Image processing unit> will be described first, followed by the <Image processing method>.
  • FIG. 9 is a configuration diagram illustrating an image processing device 2 according to the second embodiment.
  • FIG. 10 is a cross-sectional view illustrating an example of an object in the image processing device 2 according to the second embodiment.
  • FIG. 11 is a plan view illustrating an example of an object in the image processing device 2 according to the second embodiment.
  • the image processing device 2 includes a support unit 40, a drive unit 12, a drive unit 22, an imaging unit 20, and an image processing unit 30a.
  • the image processing device 2 processes, for example, captured images of the object.
  • the image processing device 2 may inspect the object. In that case, the captured images include an inspection image.
  • the image processing device 2 may process captured images of the edge of the object and captured images of the main surface of the object.
  • the object includes, for example, a wafer WF.
  • the edge of the object includes the edge WFE of the wafer WF.
  • the main surface of the object includes the wafer surface WF1 of the wafer WF. Note that the main surface of the object may also include the back surface of the wafer WF.
  • the object may be referred to as a wafer WF.
  • the object is not limited to a wafer WF, and may include plate-like objects such as semiconductor chips and printed circuit boards, as long as it has an edge and a main surface, or may be an object other than a plate-like object.
  • an rθ polar coordinate system centered on the rotation axis C1 is introduced for the XY plane in the XYZ Cartesian coordinate system.
  • the radial position r indicates the distance from the rotation axis C1.
  • the rotation angle θ indicates the angle measured from the -X-axis direction about the rotation axis C1.
  • the support unit 40 supports the object.
  • the support unit 40 may include, for example, a stage 10.
  • the stage 10 supports a wafer WF placed on a stage surface 11.
  • the stage 10 may be connected to a drive unit 12 such as a motor.
  • the drive unit 12 rotates the stage 10 around a rotation axis C1.
  • the stage 10 may have a three-axis adjustment mechanism or the like that adjusts the position of the stage 10 and the gradient of the stage surface 11.
  • the drive unit 12 may move the stage 10 in a direction perpendicular to the rotation axis C1.
  • the drive unit 12 may also move the stage 10 in a direction parallel to the rotation axis C1.
  • the drive unit 12 may be able to change the relative position between the object and the focusing position of the objective lens 21 in a direction parallel to the rotation axis C1.
  • the support unit 40 supports the object so that it can rotate around the rotation axis C1.
  • the support unit 40 also supports the object so that it can move in a direction perpendicular to the rotation axis C1.
  • the support unit 40 supports the object so that it can move in the X-axis direction.
  • the support unit 40 may sense the amount of movement in the X-axis direction using the sensor 13. The amount of movement of the support unit 40 in the X-axis direction corresponds to the radial position r at which the imaging unit 20 images the wafer surface WF1.
  • the support unit 40 supports the object so that it can move in a direction parallel to the rotation axis C1.
  • the support unit 40 supports the object so that it can move in the Z-axis direction.
  • the support unit 40 may sense the Z position from the amount of movement in the Z-axis direction using the sensor 13.
  • the support unit 40 is not limited to the stage 10, as long as it can support the object so that it can rotate around the rotation axis C1 and so that it can move in a direction perpendicular to and parallel to the rotation axis C1.
  • the support unit 40 is connected to the image processing unit 30a via a communication line that can be either wireless or wired.
  • the support unit 40 is connected in a state where it can transmit information including data on the radial position r, rotation angle θ, and Z position to the image processing unit 30a.
  • the support unit 40 outputs this information.
  • the drive unit 12 rotates the stage 10 around the rotation axis C1. Therefore, the drive unit 12 rotates the object around the rotation axis C1.
  • the drive unit 12 may change the position of the stage 10 by driving a three-axis adjustment mechanism or the like.
  • the drive unit 12 can change the relative position (called the radial position r) between the object and the focusing position of the objective lens 21 in a direction perpendicular to the rotation axis C1.
  • the drive unit 12 moves the stage 10 in the X-axis direction perpendicular to the rotation axis C1. Therefore, the drive unit 12 moves the object in the X-axis direction.
  • the drive unit 12 also moves the stage 10 in the Z-axis direction parallel to the rotation axis C1. Therefore, the drive unit 12 moves the object in the Z-axis direction.
  • the drive unit 12 may fix the Z position when changing the radial position r in a direction perpendicular to the rotation axis C1.
  • the drive unit 12 may fix the position of the objective lens 21 in the Z-axis direction when changing the radial position r in a direction perpendicular to the rotation axis C1.
  • the drive unit 12 may fix the Z position to a first position, a second position, etc. when changing the radial position r in a direction perpendicular to the rotation axis C1.
  • the drive unit 12 may fix the position of the objective lens 21 in the Z-axis direction to a first position, a second position, etc. when changing the radial position r in a direction perpendicular to the rotation axis C1. Specifically, as shown by the support unit 40 on the left side of Figure 11, the drive unit 12 fixes the Z position at the first position when moving the support unit 40 in the -X-axis direction while rotating it. In contrast, as shown by the support unit 40 on the right side of Figure 11, when the drive unit 12 rotates the support unit 40 and moves it in the +X-axis direction, it fixes the Z position at the second position.
  • the first and second positions may differ in the Z-axis direction. The difference in the Z-axis direction between the first and second positions is preferably within the focal depth of the optical system including the objective lens 21.
  • the drive unit 12 may set the optical axis angle φ to approximately 90°. That is, when changing the radial position r in a direction perpendicular to the rotation axis C1, the drive unit 12 may set the optical axis angle φ so that the optical axis C2 of the objective lens 21 and the rotation axis C1 are approximately parallel. Note that, in the above description, the drive unit 12 changes the radial position r in a direction perpendicular to the rotation axis C1, fixes the Z position at the first position, the second position, etc., or sets the optical axis angle φ; instead of or in addition to this, the drive unit 22 may perform these operations.
  • the imaging unit 20 captures an image of an object.
  • the imaging unit 20 captures an image of the edge WFE and wafer surface WF1 of the wafer WF.
  • the imaging unit 20 includes, for example, an objective lens 21, a drive unit 22, and a light receiving unit 23.
  • the imaging unit 20 may further include a light source 24, a pinhole 25, a beam splitter 26, an optical element 27, an optical element 28, and a sensor 29. Note that, as long as the imaging unit 20 can capture images of the edge WFE and wafer surface WF1 of the wafer WF, any of the above optical components may be replaced with other optical components, or other optical components may be included in addition to the above optical components.
  • the imaging unit 20 captures an image of the edge WFE from the reflected light R1 reflected by the edge WFE of the wafer WF.
  • the imaging unit 20 also captures an image of the wafer surface WF1 from the reflected light R1 reflected by the wafer surface WF1.
  • this may be simply described as the imaging unit 20 capturing images of the edge WFE and wafer surface WF1 from the reflected light R1 reflected by the edge WFE and wafer surface WF1 of the wafer WF.
  • the reflected light R1 reflected by at least one of the edge WFE and wafer surface WF1 of the wafer WF may be simply described as the reflected light R1 reflected by the edge WFE and wafer surface WF1 of the wafer WF.
  • the objective lens 21 focuses the reflected light R1 reflected by the object. Specifically, for example, the objective lens 21 focuses the reflected light R1 reflected by the edge WFE and wafer surface WF1 of the wafer WF.
  • the reflected light R1 may be illumination light L1 emitted from the light source 24, reflected by at least one of the edge WFE and wafer surface WF1 of the wafer WF.
  • the objective lens 21 has an optical axis C2. The direction of the optical axis C2 is called the optical axis direction.
  • At least one of the drive units 12 and 22 can change the Z position, which is the relative position between the object and the focusing position of the objective lens 21 in the optical axis direction of the objective lens 21.
  • the drive unit 22 moves the position of the objective lens 21 in the optical axis direction.
  • At least one of the drive units 12 and 22 can also change the optical axis angle φ, which is the angle of the optical axis C2 of the objective lens 21 with respect to a plane perpendicular to the rotation axis C1 of the stage 10.
  • at least one of the drive units 12 and 22 can change the radial position r, which is the relative position between the object and the focusing position of the objective lens 21 in a direction perpendicular to the rotation axis C1 of the stage 10.
  • the drive unit 12 changes the radial position r of the objective lens 21 by moving the stage 10 in the X-axis direction.
  • FIGS. 12 and 13 are diagrams illustrating positions G1 and G2 at which the imaging unit 20 captures an image when the support unit 40 rotates in the image processing device 2 according to the second embodiment.
  • information about position G1 at which the imaging unit 20 captures an image includes sampling time t1, radial position r1, and rotation angle θ1.
  • rotation of the support unit 40, such as the stage 10, causes the rotation angle θ to change from θ1 to θ2.
  • movement of the support unit 40, such as the stage 10, in the -X-axis direction causes the radial position r to change from r1 to r2.
  • Information about position G2 at which the imaging unit 20 captures an image includes sampling time t2, radial position r2, and rotation angle θ2.
  • the imaging unit 20 may have a confocal optical system. This allows the imaging unit 20 to capture images so that the edge WFE and wafer surface WF1 of the wafer WF are in focus.
  • the imaging unit 20 and the like may move the Z position of the objective lens 21 during imaging without paying attention to focus.
  • the brightness of the reflected light R1 from the focused positions of the edge WFE and wafer surface WF1 increases, so the focused Z position can be identified. Therefore, after imaging, various data such as the focused radial position r, Z position, brightness information, evaluation parameters, rotation angle θ, optical axis angle φ, and sampling time of each image are referenced, and profiles of the edge WFE and wafer surface WF1 of the wafer WF can be formed from the referenced data. In this way, the imaging unit 20 enables three-dimensional measurement.
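  • One way to form such a profile from the referenced data can be sketched as follows; the record layout (rotation angle, radial position, Z position, brightness) and the function name are assumptions for illustration only:

```python
def cross_section(records):
    """records: iterable of (theta, r, z, brightness) samples.
    For each (theta, r) pair, keep the Z of the brightest (best-focus)
    sample, yielding a height profile over rotation angle and radial
    position -- a simple form of three-dimensional measurement."""
    best = {}
    for theta, r, z, brightness in records:
        key = (theta, r)
        if key not in best or brightness > best[key][1]:
            best[key] = (z, brightness)
    # drop the brightness, keep only the best-focus Z per (theta, r)
    return {key: zb[0] for key, zb in best.items()}
```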
  • the imaging unit 20 is connected to the image processing unit 30a via a communication line that includes at least one of wireless and wired connections. Specifically, the imaging unit 20 is connected in a state where it can transmit information including image data, radial position r, Z position, rotation angle θ, optical axis angle φ, sampling time, etc. to the image processing unit 30a. The imaging unit 20 outputs this information to the image processing unit 30a.
  • FIG. 14 is a block diagram illustrating the image processing unit 30a according to the second embodiment.
  • the image processing unit 30a includes an acquisition unit 31, an identification unit 32, a storage unit 33, a generation unit 34, and a control unit 35, similar to the image processing device 30 described above. Note that the image processing unit 30a does not necessarily have to include the storage unit 33.
  • the acquisition unit 31 acquires an image of the object via the objective lens 21.
  • the acquisition unit 31 acquires an image of the end or main surface of the object via the objective lens 21 by changing the Z position while rotating the object around the rotation axis C1 using at least one of the drive units 12 and 22.
  • the acquisition unit 31 acquires multiple images with different Z positions and different rotation angles θ of the object.
  • FIG. 15 is a diagram illustrating an example of an image acquired by the acquisition unit 31 in the image processing unit 30a according to the second embodiment, where the horizontal axis represents the rotation angle θ and the vertical axis represents the pixels capturing an image of an area on the wafer surface WF1 of the wafer WF at a predetermined radial position r.
  • the acquisition unit 31 acquires multiple images with different Z positions and radial positions r. Each image corresponds to the rotation angle θ of the position G on the wafer surface WF1 of the wafer WF captured by the imaging unit 20, and to the pixels capturing the image of the wafer surface WF1 at that rotation angle θ.
  • the rotation angle θ in the captured image may cover a range of 0° to 360° so as to correspond to the entire circumference around the rotation axis C1 of the wafer surface WF1 of the wafer WF, or may cover a predetermined partial range.
  • the acquisition unit 31 may acquire multiple captured images with the radial position r changed. For example, the acquisition unit 31 may acquire each captured image when the radial position r is changed from r1 to r3 or more. Furthermore, the acquisition unit 31 may acquire multiple captured images when the Z position is moved in the Z-axis direction by a predetermined distance within the focal depth of the optical system including the objective lens 21, such as from the first position Z1 to the second position Z2 to the third position Z3.
  • when the Z position is the first position Z1, multiple captured images may be acquired from the radial positions r1 to r3 or more; when the Z position is the second position Z2, multiple captured images may be acquired from the radial positions r1 to r3 or more; and when the Z position is the third position Z3, multiple captured images may be acquired from the radial positions r1 to r3 or more.
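  • The acquisition order described above can be sketched as a pair of nested loops; `capture` is a hypothetical callback standing in for the imaging unit, not an actual interface of the device:

```python
def acquire_scan(z_positions, r_sweep, capture):
    """Sketch of the scan order described above: for each fixed Z
    position, sweep the radial position r (while the stage rotates),
    collecting one captured image per (Z, r) pair."""
    images = {}
    for z in z_positions:      # e.g. Z1, Z2, Z3 within the focal depth
        for r in r_sweep:      # e.g. r1 to r3 or more
            images[(z, r)] = capture(z, r)
    return images
```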
  • the horizontal axis represents the rotation angle θ
  • the vertical axis represents the pixels capturing the area on the wafer surface WF1 of the wafer WF at the predetermined radial position r.
  • FIG. 16 is a diagram illustrating the correspondence between the positions of pixels included in each captured image and the positions on the wafer surface WF1 of the wafer WF when the radial position r increases at each sampling time t in the image processing unit 30a according to the second embodiment.
  • the pixels included in the captured image may correspond to positions on the wafer surface WF1 of the wafer WF at a radial position between r01 and r0E and at a rotation angle between 0° and 360°.
  • radial position r01 may be the radial position at which the measurement target area closest to the center of the wafer surface WF1 will be included in the captured image when the radial position r is increased by a small amount from radial position r01 at a rotation angle of 0°.
  • radial position r0E may be the radial position at which the measurement target area closest to the edge WFE of the wafer WF will be included in the captured image when the radial position is decreased by a small amount from radial position r0E at a rotation angle of 0°. This allows the measurement target area on the wafer surface WF1 to be included in the captured image without any deficiencies.
  • In Figure 16, the pixels included in the captured image are arranged in a line extending in one direction, but this is not a limitation. Also, Figure 16 shows an example where the radial position r increases sequentially; if the radial position r decreases sequentially, the above explanation applies in reverse.
  • the acquisition unit 31 can acquire the captured image shown in FIG. 15 based on captured images of the wafer surface WF1 of the wafer WF at multiple sampling times t, as shown in the example of FIG. 16.
  • the acquisition unit 31 may acquire captured images from the first light receiving unit 23a that receives light of a first wavelength.
  • the captured image acquired from the first light receiving unit 23a is referred to as the first captured image.
  • the acquisition unit 31 may acquire captured images from the second light receiving unit 23b that receives light of a second wavelength.
  • the captured image acquired from the second light receiving unit 23b is referred to as the second captured image.
  • the acquisition unit 31 may acquire captured images from the third light receiving unit 23c that receives light of a third wavelength.
  • the captured image acquired from the third light receiving unit 23c is referred to as the third captured image.
  • the acquisition unit 31 may acquire second and third inspection images similar to those in FIG. 5.
  • the first, second, and third captured images may each include multiple inspection images with different radial positions r and Z positions.
  • the control unit 35 controls the support unit 40 and drive units 12 and 22, and causes the acquisition unit 31 to acquire multiple captured images.
  • the control unit 35 also controls the operation of the acquisition unit 31, identification unit 32, memory unit 33, and generation unit 34.
  • the control unit 35 causes the acquisition unit 31 to acquire multiple captured images of the end of the object by changing the Z position using at least one of the drive units 12 and 22 while rotating the object around the rotation axis C1 using the support unit 40.
  • the control unit 35 rotates the object around the rotation axis C1 using the support unit 40 while causing the drive unit 22 to change the radial position r in a first direction perpendicular to the rotation axis and, while maintaining the Z position at the first position, causes the acquisition unit 31 to acquire multiple captured images of the object's main surface. Furthermore, while rotating the object around the rotation axis C1 using the support unit 40, the control unit 35 changes the radial position r in a second direction perpendicular to the rotation axis using the drive unit 22 and, while maintaining the Z position at the second position, causes the acquisition unit 31 to acquire multiple captured images of the object's main surface.
  • the second direction is the opposite direction to the first direction.
  • the second position is a position different from the first position. Furthermore, the difference between the first position and the second position is within the focal depth of the optical system of the imaging unit 20, including the objective lens 21. Note that in this embodiment, the drive unit 12 may be driven instead of or in addition to the drive unit 22.
  • the identification unit 32 identifies the pixel position P, which is the position of a pixel where the brightness is above a predetermined level, in a plurality of captured images that each have a different Z position and a different rotation angle θ of the object.
  • the identification unit 32 may also identify the pixel position P, which is the position of a pixel where the brightness is above a predetermined level, in a plurality of captured images that each have a different radial position r, Z position, rotation angle θ, and optical axis angle φ. Note that, as described above, brightness above a predetermined level is an example of the quality of the evaluation parameters for a pixel being above a predetermined level.
  • the identification unit 32 may identify a first-wavelength pixel position P1 in a plurality of first captured images acquired by the acquisition unit 31, each having a different radial position r or Z position.
  • the identification unit 32 may identify a second-wavelength pixel position in a plurality of second captured images acquired by the acquisition unit 31, each having a different radial position r or Z position, or may identify a third-wavelength pixel position in a plurality of third captured images, each having a different radial position r or Z position.
  • a pixel position in the first captured image where the brightness is greater than or equal to a predetermined value is referred to as a first-wavelength pixel position P1
  • a pixel position in the second captured image where the brightness is greater than or equal to a predetermined value is referred to as a second-wavelength pixel position P2 (not shown)
  • a pixel position in the third captured image where the brightness is greater than or equal to a predetermined value is referred to as a third-wavelength pixel position P3 (not shown).
  • the storage unit 33 stores the identified pixel position P, together with the Z position and rotation angle θ in each captured image, as high-brightness information.
  • the storage unit 33 may also store the identified pixel position P, together with the radial position r, Z position, rotation angle θ, optical axis angle φ, and sampling time in each captured image, as high-brightness information.
  • the generation unit 34 may generate image information of the object based on multiple pieces of high-brightness information that have been stored.
  • the generation unit 34 may also generate image information of the object based on pixels with a predetermined or higher brightness.
  • the generation unit 34 may generate image information of the object at a specific rotation angle θ based on pixels with a predetermined or higher brightness in multiple captured images that have different Z positions at that specific rotation angle θ.
  • the generation unit 34 may generate image information of the object at multiple specific rotation angles θ based on pixels whose brightness is a predetermined value or higher in multiple captured images whose Z positions differ and whose rotation angle θ is a first rotation angle θ1, and on pixels whose brightness is a predetermined value or higher in multiple captured images whose Z positions differ and whose rotation angle θ is a second rotation angle θ2.
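  • Generating such per-angle image information from Z-stacks can be sketched as follows. This is a simplified illustration; the data layout, names, and threshold handling are assumptions:

```python
import numpy as np

def height_at_angles(stacks_by_angle, z_positions, threshold):
    """stacks_by_angle maps a rotation angle theta to a (n_z, n_pixels)
    brightness stack.  For each angle, take the Z position at which each
    pixel is brightest; pixels that never reach the threshold (no
    in-focus sample) are marked as NaN."""
    z = np.asarray(z_positions, dtype=float)
    heights = {}
    for theta, stack in stacks_by_angle.items():
        h = z[stack.argmax(axis=0)].astype(float)
        h[stack.max(axis=0) < threshold] = np.nan  # never in focus
        heights[theta] = h
    return heights
```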
  • the generation unit 34 may generate the above-mentioned image information for multiple radial positions r. In this way, the generation unit 34 may generate image information of the main surface of the object.
  • the image information may include a profile of the main surface of the object.
  • FIG. 17 is a diagram illustrating a profile of the wafer surface WF1 of the wafer WF generated by the generation unit 34 in the image processing device 2 according to the second embodiment.
  • FIG. 17 also shows the case where the radial position r is multiple values r1 to r5.
  • the generation unit 34 may generate a profile of the cross section of the wafer surface WF1 of the wafer WF as image information.
  • a pixel whose brightness (goodness of the pixel evaluation parameter) is equal to or greater than a predetermined value can be considered to be appropriately focused on the surface of the edge WFE or the wafer surface WF1 at that Z position, e.g., just focused. Therefore, the generation unit 34 can generate a profile of the cross section of the wafer surface WF1 or the edge WFE of the wafer WF based on the Z position at that time.
  • the generation unit 34 may generate image information over the rotation angle θ of the entire circumference of the main surface of the object based on the high-brightness information or on pixels with a brightness equal to or greater than a predetermined value. For example, the generation unit 34 may generate a cross-sectional profile over the rotation angle θ of the entire circumference of the wafer surface WF1 of the wafer WF.
  • the generation unit 34 may generate image information of the end of the object, as in the first embodiment described above. That is, the generation unit 34 generates image information of the end of the object at a specific rotation angle θ based on pixels whose brightness is above a predetermined level in multiple captured images where the rotation angle θ is the specific rotation angle θ, the optical axis angle φ is a first optical axis angle, and the Z positions differ from each other, and on pixels whose brightness is above a predetermined level in multiple captured images where the rotation angle θ is the specific rotation angle θ, the optical axis angle φ is a second optical axis angle, and the Z positions differ from each other.
  • Generating image information based on high-brightness information or on pixels whose brightness is above a predetermined level may include the generation unit 34 outputting brightness level information for those pixels, or height information indicating the physical height based on the Z position at that time. It may also include the generation unit 34 obtaining statistical values, such as the average or standard deviation of the multiple brightnesses of multiple pixels with a predetermined or higher brightness, or obtaining values interpolated from the multiple brightnesses, and outputting these as brightness level information or as height information converted into physical height. The generation unit 34 may also generate a color image of the object represented in multiple colors based on pixels with a predetermined or higher brightness in the first captured image, the second captured image, and so on.
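  • One sketch of such a statistics-based height estimate for a single pixel is shown below; the brightness-weighted average is an illustrative interpolation choice, not the device's specified method:

```python
import numpy as np

def height_estimate(z_values, brightness, threshold):
    """When several Z samples of one pixel exceed the brightness
    threshold, combine them into a single height: here, a
    brightness-weighted average of the qualifying Z positions."""
    z = np.asarray(z_values, dtype=float)
    b = np.asarray(brightness, dtype=float)
    mask = b >= threshold
    if not mask.any():
        return None  # no in-focus sample for this pixel
    return float(np.average(z[mask], weights=b[mask]))
```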
  • the control unit 35 may inspect the wafer WF for defects, etc. based on the image information.
  • the inspection target wafer WF may be inspected by comparing image information of an ideal wafer WF with image information of the inspection target wafer WF.
  • the inspection target wafer WF may be inspected using a profile, color image, or image of the edge WFE generated based on the image information.
  • Fig. 18 is a flow chart illustrating an image processing method using the image processing device 2 according to the second embodiment.
  • the reflected light R1 is focused by the objective lens 21.
  • the control unit 35 drives the drive unit 22 to control the position of the objective lens 21 so that the reflected light R1 reflected by the object is focused by the objective lens 21.
  • In step S34, the acquisition unit 31 is caused to acquire the captured image.
  • the control unit 35 controls the acquisition unit 31 to acquire the captured image.
  • the control unit 35 causes the acquisition unit 31 to acquire the captured image of the object via the objective lens 21.
  • the control unit 35 may also cause the acquisition unit 31 to acquire multiple captured images.
  • the control unit 35 may cause the acquisition unit 31 to acquire multiple captured images of the end of the object by rotating the object around the rotation axis C1 using the support unit 40 and changing the Z position using at least one of the drive units 12 and 22.
  • the control unit 35 may rotate the object around the rotation axis C1 using the support unit 40, change the radial position in a first direction perpendicular to the rotation axis using the drive unit, and keep the Z position constant at the first position, causing the acquisition unit 31 to acquire multiple captured images of the main surface of the object.
  • the control unit 35 may rotate the object around the rotation axis C1 using the support unit 40, change the radial position in a second direction perpendicular to the rotation axis using the drive unit, and keep the Z position constant at the second position, causing the acquisition unit 31 to acquire multiple captured images of the main surface of the object.
  • The control unit 35 causes the identification unit 32 to identify the pixel position P, which is the position of a pixel whose brightness is equal to or greater than a predetermined value, in multiple captured images each having a different Z position and a different rotation angle θ of the object.
  • The image processing method of this embodiment may include, between steps S35 and S36, a step of storing the identified pixel position P in the memory unit 33 as high-brightness information, together with the radial position r, the Z position, the rotation angle θ, the optical axis angle, and the sampling time in each captured image.
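The acquisition, identification, and storage steps above can be sketched as a simple loop. This is a minimal sketch under assumptions: `capture(theta, z)` is a hypothetical callable standing in for the acquisition unit, the threshold of 200 is arbitrary, and the record layout is illustrative only.

```python
import time

def collect_high_brightness(capture, angles, z_positions, radial_pos,
                            axis_angle, threshold=200):
    """Capture an image at each (rotation angle, Z position) pair, then
    record every pixel at or above the brightness threshold together
    with its metadata (radial position, Z, theta, optical axis angle,
    sampling time), mirroring steps S34-S35."""
    records = []
    for theta in angles:
        for z in z_positions:
            image = capture(theta, z)  # 2-D grid of brightness values
            for row, line in enumerate(image):
                for col, brightness in enumerate(line):
                    if brightness >= threshold:
                        records.append({
                            "pixel": (row, col),
                            "radial_pos": radial_pos,
                            "z": z,
                            "theta": theta,
                            "axis_angle": axis_angle,
                            "time": time.time(),  # sampling time
                        })
    return records
```

The stored records correspond to the high-brightness information from which the generation unit later builds image information.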
  • (Appendix 1) An image processing device comprising: an acquisition unit that acquires inspection images of the end of an object at a plurality of Z positions from the reflected light collected by an objective lens, by rotating the object around a rotation axis on a stage having the rotation axis and moving a Z position, which is a position of the objective lens in the optical axis direction of the objective lens that collects the reflected light reflected by the end of the object; an identifying unit that identifies pixel positions where the luminance is equal to or greater than a predetermined value in the plurality of inspection images each having a different Z position and a different rotation angle of the object; a storage unit that stores the identified pixel positions, together with the Z position and the rotation angle in each inspection image, as high-brightness information; and a generating unit that generates image information of the edge of the object based on the stored plurality of pieces of high-brightness information.
  • (Appendix 2) The image processing device of Appendix 1, wherein the storage unit stores, as the high-brightness information, an optical axis angle, which is an angle
  • (Appendix 3) The image processing device according to Appendix 2, wherein the target object includes a plate-like object, the storage unit stores the high-brightness information at a plurality of the optical axis angles, and the generation unit generates the image information based on the high-brightness information at the plurality of optical axis angles with respect to the object.
  • (Appendix 4) The generation unit generates the image information regarding the rotation angle over the entire circumference of the end of the object based on the high-brightness information.
  • (Appendix 5) The image processing device of Appendix 1, wherein the acquisition unit acquires a first inspection image from a first light receiving unit that receives light of a first wavelength corresponding to a first color in the reflected light, and acquires a second inspection image from a second light receiving unit that receives light of a second wavelength corresponding to a second color in the reflected light; the identification unit identifies first-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the plurality of first inspection images each having a different Z position, and identifies second-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the second inspection images each having a different Z position; the storage unit stores the first-wavelength pixel positions, together with the Z position and the rotation angle, as first-wavelength high-brightness information in each first inspection image, and stores the second-wavelength pixel positions, together with the Z position and the rotation angle, as second-wavelength high-brightness information in each second inspection image; and the generation unit
  • (Appendix 6) The image processing device according to Appendix 5, wherein the acquisition unit further acquires a third inspection image from a third light receiving unit that receives light of a third wavelength corresponding to a third color in the reflected light; the identification unit further identifies third-wavelength pixel positions, which are pixel positions where the luminance is equal to or greater than a predetermined value, in the third inspection images each having a different Z position; the storage unit further stores the third-wavelength pixel positions, together with the Z position and the rotation angle, as third-wavelength high-brightness information in each third inspection image; and the generation unit generates a color image of the edge of the object represented in a plurality of colors based on the first-wavelength high-brightness information, the second-wavelength high-brightness information, and the third-wavelength high-brightness information.
  • (Appendix 7) The image processing device of Appendix 1, wherein the object includes a wafer and the end portion includes an edge of the wafer.
  • (Appendix 8) An image processing device according to any one of Appendices 1 to 7, further comprising a control unit that inspects the object based on the image information.
  • (Appendix 9) An image processing method comprising: a first step of rotating an object around a rotation axis on a stage having the rotation axis while moving a Z position, which is a position of an objective lens in the optical axis direction of the objective lens that collects reflected light reflected at an end of the object, and having an acquisition unit acquire inspection images of the end of the object at a plurality of the Z positions from the reflected light collected by the objective lens; a second step of causing an identifying unit to identify pixel positions where the luminance is equal to or greater than a predetermined value in the plurality of inspection images each having a different Z position and a different rotation angle of the object; a third step of storing the identified pixel positions, together with the Z position and the rotation angle in each inspection image, in a storage unit as high-brightness information; and a fourth step of causing a generation unit to generate image information of the edge of the object based on the stored plurality of pieces of high-brightness information.
  • (Appendix 10)
  • (Appendix 11) The image processing method according to Appendix 10, wherein the target object includes a plate-like object, the third step stores the high-brightness information for a plurality of the optical axis angles, and the fourth step generates the image information based on the high-brightness information at the plurality of optical axis angles with respect to the object.
  • (Appendix 12) In the fourth step, the image information is generated for the rotation angle over the entire circumference of the end portion of the object based on the high-brightness information.
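The multi-wavelength appendices above assemble first-, second-, and third-wavelength brightness-level maps into a single color image of the edge. A rough sketch of that composition follows; the function name and the assumption that each channel is already scaled to 0-255 are illustrative, not part of the disclosure.

```python
import numpy as np

def compose_color_image(first, second, third):
    """Stack three per-wavelength brightness-level maps into an RGB
    color image of the object's edge, as suggested by Appendix 6.
    Each input is assumed to be a 2-D array scaled to 0-255."""
    r = np.asarray(first, dtype=np.uint8)
    g = np.asarray(second, dtype=np.uint8)
    b = np.asarray(third, dtype=np.uint8)
    if not (r.shape == g.shape == b.shape):
        raise ValueError("channel maps must share the same shape")
    return np.dstack([r, g, b])  # H x W x 3 color image
```

The resulting array can be displayed or compared against a reference color image of an ideal wafer edge.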

Landscapes

  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Microscopes, Condenser (AREA)
  • Image Input (AREA)
  • Image Processing (AREA)

Abstract

The image processing device (2) according to the present disclosure comprises: a support unit (40) that rotatably supports an object around a rotation axis (C1); an objective lens (21) that collects reflected light (R1) reflected by the object; drive units (12, 22) capable of changing a Z position, which is the relative position of the objective lens (21) with respect to the object, in an optical axis direction (C2) of the objective lens (21); an acquisition unit (31) that acquires a captured image of the object via the objective lens (21); a control unit (35) that causes the acquisition unit (31) to acquire a plurality of captured images; an identification unit (32) that identifies, in a plurality of captured images having different Z positions and different rotation angles (theta) of the object, a pixel position that is the position of a pixel having a luminance equal to or greater than a prescribed luminance; a storage unit (33) that stores the identified pixel position (P) as high-brightness information together with the Z position and the rotation angle (theta) in each captured image; and a generation unit (34) that generates image information of the object based on the high-brightness information.
PCT/JP2025/002226 2024-02-14 2025-01-24 Image processing device and image processing method Pending WO2025173500A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2025506014A JP7748600B1 (ja) 2024-02-14 2025-01-24 Image processing device and image processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2024020110 2024-02-14
JP2024-020110 2024-02-14

Publications (1)

Publication Number Publication Date
WO2025173500A1 true WO2025173500A1 (fr) 2025-08-21

Family

ID=96772923

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2025/002226 Pending WO2025173500A1 (fr) 2024-02-14 2025-01-24 Dispositif de traitement d'images et procédé de traitement d'images

Country Status (3)

Country Link
JP (1) JP7748600B1 (fr)
TW (1) TW202542483A (fr)
WO (1) WO2025173500A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007251143A * 2006-02-15 2007-09-27 Olympus Corp Appearance inspection device
WO2009072483A1 * 2007-12-05 2009-06-11 Nikon Corporation Inspection apparatus and inspection method
JP2010060532A * 2008-09-08 2010-03-18 Raytex Corp Surface inspection device
JP2010519772A * 2007-02-23 2010-06-03 Rudolph Technologies, Inc. Wafer fabrication monitoring system and method including edge bead removal process
JP2013160687A * 2012-02-07 2013-08-19 Ohkura Industry Co Ltd Inspection device
JP2021110551A * 2020-01-06 2021-08-02 Toray Engineering Co., Ltd. Substrate edge inspection device


Also Published As

Publication number Publication date
TW202542483A (zh) 2025-11-01
JPWO2025173500A1 (fr) 2025-08-21
JP7748600B1 (ja) 2025-10-02

Similar Documents

Publication Publication Date Title
TWI551855B (zh) 檢測晶圓之系統與方法以及由該系統讀取的程式儲存裝置
EP1785714B1 (fr) Dispositif d'évaluation de lentilles
TWI575625B (zh) 檢測晶圓之系統及方法
JP5672240B2 (ja) ウェーハを検査するためのシステム及び方法
US10142621B2 (en) Mass production MTF testing machine
JP2013011856A (ja) 撮像システムおよびその制御方法
US8810799B2 (en) Height-measuring method and height-measuring device
JP2009283633A (ja) 表面検査装置及び表面検査方法
US10551174B2 (en) Calibration method of image measuring device
JP2008175818A (ja) 表面検査装置及び方法
JPWO2009125839A1 (ja) 検査装置
KR102874281B1 (ko) 디스플레이 화소 자동 검사 시스템 및 방법
WO2014208193A1 (fr) Dispositif d'inspection d'aspect de tranche
JP2006242821A (ja) 光学パネルの撮像方法、光学パネルの検査方法、光学パネルの撮像装置、光学パネルの検査装置
JP2011145160A (ja) マルチフォーカス検査装置及びマルチフォーカス検査方法
JP7748600B1 (ja) 画像処理装置及び画像処理方法
JP4136635B2 (ja) 分析装置
US20070133969A1 (en) System and method for measuring and setting the focus of a camera assembly
JP2010243212A (ja) 傾斜検出方法、および傾斜検出装置
JP5191265B2 (ja) 光学顕微鏡装置及び光学顕微鏡用データ処理装置
JP2005274156A (ja) 欠陥検査装置
NL2028376B1 (en) Method of and arrangement for verifying an alignment of an infinity-corrected objective.
JP5217093B2 (ja) 検査装置及び検査方法
JP5073346B2 (ja) 画像読取装置用のレンズ検査方法及びレンズ検査装置
JP2022047893A (ja) ワーク撮像装置およびワーク撮像方法

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2025506014

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2025506014

Country of ref document: JP

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25754827

Country of ref document: EP

Kind code of ref document: A1