WO2012101717A1 - Pattern matching apparatus and computer program - Google Patents
Pattern matching apparatus and computer program
- Publication number
- WO2012101717A1 (PCT/JP2011/006834)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- design data
- pattern matching
- pattern
- matching
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N23/00—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00
- G01N23/22—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by measuring secondary emission from the material
- G01N23/225—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by measuring secondary emission from the material using electron or ion
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- H10P74/00—
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10056—Microscopic image
- G06T2207/10061—Microscopic image from scanning electron microscope
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30148—Semiconductor; IC; Wafer
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01L—SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
- H01L22/00—Testing or measuring during manufacture or treatment; Reliability measurements, i.e. testing of parts without further processing to modify the parts as such; Structural arrangements therefor
- H01L22/10—Measuring as part of the manufacturing process
- H01L22/12—Measuring as part of the manufacturing process for structural parameters, e.g. thickness, line width, refractive index, temperature, warp, bond strength, defects, optical inspection, electrical measurement of structural dimensions, metallurgic measurement of diffusions
-
- H10P74/203—
Definitions
- The present invention relates to a pattern matching apparatus, and to a computer program that causes a computer to execute pattern matching, and in particular to an apparatus and a computer program that perform alignment between an image obtained based on design data and an image obtained by imaging.
- a template matching method is known as a method for searching for a specific pattern existing on a target sample.
- Such images are typically acquired with an SEM (Scanning Electron Microscope).
- Patent Document 1 describes a method of creating a template based on semiconductor device design data and performing pattern matching using the template.
- Patent Document 2 discloses a method for determining a line part (convex part) or a space part (concave part) for a pattern in which line patterns called line and space patterns are arranged at substantially the same interval.
- Japanese Patent Application Laid-Open No. 2004-228561 describes a method of performing alignment between design data to which an evaluation region is added and a pattern image, and evaluating the shape of a pattern included in the evaluation region after alignment.
- JP 2007-5818 A (corresponding US Pat. No. 7,235,782); JP 2003-90719 A (corresponding US Pat. No. 6,872,943); JP 2007-121181 A (corresponding US Pat. No. 7,787,687)
- According to Patent Document 3, an evaluation region can be set at a desired evaluation location; however, if there is a divergence between the design data and the shape of the pattern (the actual pattern) on the image acquired by the SEM or the like and the pattern matching fails, the evaluation region is set at an incorrect position.
- Patent Document 1 discloses a technique for performing a smoothing process on both the design data and the actual pattern image to suppress the shape difference between the two. However, there is no disclosure about how to deal with matching failure.
- According to the method of Patent Document 2, the line portions or space portions can be identified with a certain degree of accuracy.
- However, the method disclosed in Patent Document 2 is a method for determining unevenness (lines versus spaces) and is not intended for performing pattern matching on a periodic pattern.
- The following describes a computer program and a pattern matching apparatus for performing matching properly, and for verifying it, even when the matching process fails, including failures on images that contain repetitive patterns.
- Specifically, a computer program for causing a computer to perform pattern matching on an image using a template formed based on design data, and a corresponding pattern matching apparatus, are proposed, in which, after the pattern matching process, it is determined whether the image information contained in a predetermined evaluation region satisfies a predetermined condition.
- the edge information obtained based on the design data is compared with the region of interest (Region Of Interest: ROI) obtained by performing expansion processing on the pattern edge image obtained based on the actual image.
- A flowchart showing a process of generating an ROI.
- A flowchart showing a process of generating an ROI.
- A diagram showing an example of the images used when performing alignment processing on an image of a hole array.
- A diagram showing an example of a display screen that displays the result of the alignment processing.
- A flowchart showing a process of generating an ROI.
- A diagram showing an example of a semiconductor measurement and inspection system.
- A diagram showing an example of a line-segment image, based on the design data, contained in an ROI (Region Of Interest).
- A flowchart showing a process of verifying matching based on the number of ROIs that satisfy a predetermined condition.
- A flowchart showing a process of identifying the correct matching position from among a plurality of matching candidates.
- A diagram showing an example of evaluating ROIs using a plurality of templates.
- A flowchart showing a process of performing ROI evaluation using a plurality of templates.
- Hereinafter, the inspection and measurement process is referred to simply as inspection.
- An inspection apparatus using a general SEM is illustrated in FIG.
- the SEM mainly includes an electron microscope body 101 and a sample stage 105 included in a vacuum chamber.
- The SEM scans a sample mounted on the sample stage 105 with an electron beam based on instructions from a computer 104 built into the control device 103, and the electrons emitted from the scanned region are captured by a detector.
- By arranging the detected electron signals in synchronization with the scanning signal of the electron beam, one-dimensional information (line profiles) and two-dimensional images are formed and displayed on the display device 109.
- FIG. 13 is a diagram showing the measurement and inspection system including the SEM in more detail.
- This system includes an SEM main body 1301, an A / D converter 1304, and a control device (including an image processing device) 1305.
- the SEM main body 1301 irradiates a sample such as a wafer on which an electronic device is manufactured with an electron beam, captures electrons emitted from the sample with a detector 1303, and converts them into a digital signal with an A / D converter 1304.
- the digital signal is input to the control device 1305 and stored in the memory 1307, and image processing according to the purpose is performed by image processing hardware such as a CPU, ASIC, and FPGA built in the image processing unit 1306.
- the image processing unit 1306 also has a function of creating a line profile based on the detection signal and measuring a dimension between peaks of the profile.
- The control device 1305 is connected to an input device 1315 having input means, and provides a GUI (Graphical User Interface) for displaying images, inspection results, and the like to the operator on a display device provided in the input device 1108 or on an external display 1109.
- Part or all of the control and processing in the control device 1305 can be allocated to a CPU, or to an electronic computer equipped with a memory capable of storing images, and processed and controlled there.
- The input device 1315 also functions as an imaging recipe creation device that creates an imaging recipe, including the coordinates of the electronic device required for inspection, a pattern matching template used for positioning, imaging conditions, and so on, either manually or by utilizing the design data stored in the design data storage medium 1314 of the electronic device.
- the input device 1315 includes a template creation unit that cuts out a part of a diagram image formed based on design data and uses it as a template.
- The created template is registered in the memory 1307 as a template for template matching in the matching processing unit 1308 built into the image processing unit 1306.
- Template matching is a method of identifying the location where the captured image to be aligned matches the template, based on a matching determination using normalized correlation or the like, and the matching processing unit 1308 identifies the desired position in the captured image based on this matching determination.
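- As a point of reference for how such a matching determination can work, the following is a minimal sketch of template matching by normalized correlation using NumPy. The array names (`image`, `template`) and the exhaustive sliding-window search are illustrative assumptions, not the implementation of the matching processing unit 1308 itself.

```python
import numpy as np

def normalized_correlation(patch, template):
    """Zero-mean normalized cross-correlation between two equally sized arrays."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def match_template(image, template):
    """Slide the template over the image and return the (row, col) position
    with the highest normalized correlation, together with that score."""
    ih, iw = image.shape
    th, tw = template.shape
    best_score, best_pos = -1.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            score = normalized_correlation(image[r:r + th, c:c + tw], template)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score
```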
- the system illustrated in FIG. 13 includes a simulator 1316.
- the simulator 1316 is a device that estimates the pattern quality based on the design data stored in the design data storage medium 1314.
- The embodiments described below mainly relate to pattern matching between edge information obtained based on design data and captured images taken by an SEM or the like.
- the design data is expressed in, for example, the GDS format or the OASIS format, and is stored in a predetermined format.
- the design data can be of any type as long as the software that displays the design data can display the format and can handle the data as graphic data.
- a control device mounted on the SEM or a control device (input device 1315) connected to the SEM via a communication line or the like will be described as an example.
- the processing described below may be performed by a computer program using a general-purpose arithmetic device that executes image processing.
- the technique described later can be applied to other charged particle beam apparatuses such as a focused ion beam (FIB) apparatus.
- the embodiments described below relate to an apparatus for performing pattern matching, a program for causing a computer to execute pattern matching, and a storage medium for storing the program.
- When the same pattern is repeatedly arranged in an image, alignment based on pattern matching may not be performed properly, because even if the alignment is shifted by one pitch, the degree of coincidence is almost the same as at the accurate position. For such images containing patterns with few distinctive features, or patterns with complicated shapes, alignment may fail with a normal matching method. As a result, the operation of the measurement and inspection system is stopped or delayed, and the operator workload increases. It is also difficult to determine correctly that the alignment has failed.
- One embodiment provides an apparatus or system comprising: means for storing design data of a semiconductor circuit corresponding to positions on a wafer or an exposure mask; means for storing a design data image obtained by imaging the design data; means for generating a design data ROI image obtained by imaging regions of interest derived from the relative density relationship of the shapes included in the design data image; and an alignment unit that aligns an inspection image with the design data image, wherein the design data ROI image is used to identify the position at which the inspection image and the design data image match, or to calculate the degree of matching; or a computer program that causes a computer to execute the above processing.
- By using the design data ROI image generated from the design data of the semiconductor circuit, attention is focused only on characteristic regions, excluding regions with a high density of shapes, so that a clear difference in the degree of coincidence appears between correctly and incorrectly aligned inspection images.
- When the alignment is incorrect, the design data within the ROI and the inspection image usually differ, so the success or failure of the alignment can be determined correctly.
- an operator-free semiconductor inspection device can be provided.
- FIG. 4 is a diagram illustrating an example of the alignment determination process.
- FIG. 2 is a captured image to be subjected to pattern matching.
- FIG. 3 is a template image generated based on design data.
- FIG. 5 is a diagram showing a matching process step between two images.
- The design data is stored in the design data storage medium 1314 illustrated in FIG. 13 and, at the time of template creation, is temporarily stored in the memory 1307 of the control device 1305 as the design data image 403.
- The memory 1307 also stores the captured image to be subjected to pattern matching (inspection image 401), which is used for the matching processing in the matching processing unit 1308.
- the inspection image 412 and the design data image 414 are aligned by the alignment unit 406.
- the design data ROI generation unit 404 creates a design data ROI image 405 based on the design data image 414.
- Alignment result 407 and design data ROI image 405 are input to alignment determination unit 408.
- the determination of the alignment result 407 is performed by the alignment determination unit 408 calculating the degree of coincidence 409 while paying attention to the ROI of the design data ROI image 405.
- the alignment determination unit 408 is the same as the alignment determination unit 1310 in FIG.
- the alignment matching degree 409 obtained based on this alignment determination is displayed on the display device 109. Note that a generally known correlation value or the like can be used as the alignment matching degree 409.
- the alignment matching degree 409 is sent to the display device 109 and displayed together with the inspection image 412, the design data image 414, and the design data ROI image 405.
- If the alignment is judged to have failed, an imaging re-execution instruction 411 is issued so that the inspection image 401 is acquired again.
- The alignment is then performed again using the inspection image 412 obtained by the re-imaging.
- The design data ROI image 405 is generated based on the design data image 502, and the success or failure of the alignment between the inspection image 501 and the design data image 502 is judged using the design data ROI image 405.
- For the inspection image 501 and the design data image 502 in FIG. 5, it is sufficient to determine at least whether the alignment between the feature region 503 in the inspection image and the feature region 506 in the design data image is at an appropriate position. In the present embodiment, an example is described in which the feature regions 507 and 508 in the design data image 502 and the feature regions 504 and 505 in the inspection image 501 are also evaluated.
- the feature region 503 is a portion in which the periodicity is interrupted in the design data including the periodic pattern.
- This part is a part where the degree of coincidence at the time of a matching error is relatively low (or the degree of disagreement is high) compared to a case where the entire inspection image 501 and the entire design data image 502 are compared. In other words, this is a region where the difference in the degree of coincidence between the success and failure of the matching appears remarkably, and if this portion is selectively evaluated, the success or failure of the matching can be accurately verified.
- the design data image 502 is input to the design data ROI generator 404 to generate the design data ROI image 405.
- ROIs 509, 510, and 511 corresponding to the feature regions 506, 507, and 508 are generated.
- the design data ROI generation unit 404 corresponds to the ROI generation unit 1312 of FIG. 13 and generates an ROI based on an expansion process or the like described later.
- the alignment result 407 is an image obtained by combining the inspection image 501 and the design data image 502 based on the alignment result.
- The alignment determination unit 408 computes the logical AND of the alignment result 407 and the design data ROI image 405 to obtain the alignment determination result 409.
- The alignment determination unit 408 then judges whether the alignment is correct or has failed based on the ratio between the edge information of the inspection image 501 and the design information of the design data image 502 contained in this AND result.
- FIG. 14 shows an evaluation area (ROI) 1401 obtained based on the design data, a captured image area 1402 that overlaps the evaluation area 1401 when matching fails, and a captured image area that overlaps the evaluation area 1401 when matching succeeds.
- By comparing the image information inside the evaluation area in these two cases, the success or failure of matching can be determined.
- When determining success or failure based on the pixel ratio as described above, the proportion of pixels of a predetermined luminance within the evaluation area 1401, relative to the total number of pixels in the evaluation area 1401, is computed, and matching is judged successful or unsuccessful according to whether this ratio satisfies a predetermined condition (for example, whether it falls inside, or outside, a predetermined ratio range).
- Alternatively, success or failure may be determined by comparing the absolute number of such pixels with a threshold value, rather than using a relative value.
- The success or failure of the alignment may also be determined by comparing, with a threshold value, the degree of coincidence between the design data within the evaluation area, which is smaller than the entire template, and the captured image.
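- A minimal sketch of the ratio-based check described above follows, assuming the captured image has already been aligned and binarized so that edge pixels are 1, and that `roi_mask` marks the evaluation area 1401; the threshold values are illustrative assumptions.

```python
import numpy as np

def roi_matching_ok(binary_edges, roi_mask, min_ratio=0.05, max_ratio=0.60):
    """Judge matching success from the fraction of edge pixels inside the ROI.

    binary_edges: 2-D array with 1 at edge pixels of the aligned captured image.
    roi_mask:     2-D boolean array, True inside the evaluation area.
    """
    total = int(roi_mask.sum())
    if total == 0:
        return False
    edge_count = int(binary_edges[roi_mask].sum())
    ratio = edge_count / total
    # Success when the ratio falls inside the predetermined range.
    return min_ratio <= ratio <= max_ratio
```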
- Further, a profile waveform may be formed within an evaluation window 1405 provided in the evaluation region 1404.
- By doing so, the difference in the number of pixels of the predetermined luminance, which depends on the presence or absence of an edge within the evaluation window, can be made clearer.
- the operator can know whether the alignment is correct or unsuccessful.
- the design data image 502 is stored in the memory 1307 as the (N-1) th time (step 801).
- Next, the dilation processing unit 1311 performs dilation (expansion) processing to thicken the line segments indicating the contours of the pattern, and increments the processing count N by 1. The dilation processing is performed, for example, by setting a target pixel to the same gradation as an edge when at least one pixel adjacent to the target pixel corresponds to an edge.
- In step 807, the portions that have not been filled in are extracted.
- In step 808, correction processing, such as dilating the extracted image, is performed so that each unfilled portion has an appropriate position and area as a region of interest (ROI), and the result is output.
- FIG. 12 is a diagram illustrating a process of extracting the ROI based on the expansion process.
- the ROI generation unit 1312 extracts the ROI using the image subjected to the expansion processing by the expansion processing unit 1311. It is assumed that a feature point 1202 and feature points 1203 and 1204 exist as characteristic portions in the input design data image 502. When the design data image 502 is expanded, white portions increase like an image 1206. Further expansion processing results in an image 1208, and another expansion processing results in an image 1210.
- a region of interest (ROI) image 1220 can be obtained by extracting the shape from the image 1208 of the (N-1) th time in processing 1212 and performing processing such as expansion.
- the white region 1221 included in the region of interest (ROI) image 1220 corresponds to the feature point 1202
- the white region 1222 corresponds to the feature point 1203
- the white region 1223 corresponds to the feature point 1204.
- In this way, the region of interest (ROI) can be obtained from the relative density relationship of the shapes, without having to interpret the individual shapes contained in the design data.
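- The dilation-and-extraction flow of FIGS. 8 and 12 could be sketched as follows with SciPy's binary morphology. The stopping rule (dilate until the image would be fully filled, keep the last image that still has unfilled regions, and treat those regions as the ROI) follows the description above, while the default structuring element and the final correction dilation count are assumptions.

```python
import numpy as np
from scipy import ndimage

def extract_roi(design_edges, max_iter=200):
    """Return a boolean ROI mask: the regions that are last to be filled
    when the design-data edge image is repeatedly dilated."""
    current = design_edges.astype(bool)
    for _ in range(max_iter):
        nxt = ndimage.binary_dilation(current)
        if nxt.all():
            break          # 'current' is the (N-1)-th image, the last with unfilled parts
        current = nxt
    unfilled = ~current     # portions never filled in by the dilations (step 807)
    # Step 808: adjust position/area of each unfilled portion by a small dilation.
    roi = ndimage.binary_dilation(unfilled, iterations=2)
    return roi
```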
- FIG. 11 is a diagram showing a display example of the display device 109 for displaying the alignment result, and the display data is created by the output data creation unit 1313 illustrated in FIG.
- a menu 1101 is an ROI method switch for designating whether to obtain an ROI.
- An Auto switch 1102 designates that an area of interest (ROI) is automatically obtained using the processing flow of FIG.
- a window 1105 displays an inspection image.
- a window 1107 displays a design data image.
- a window 1109 is an ROI display unit that displays the design data ROI image 405 generated by the design data ROI generation unit 404.
- the window 1111 is a result display unit that displays the alignment result, and displays the alignment determination result 409. Further, the composite images 1105, 1107, and 1109 may be displayed. By viewing the window 1111, it is possible to know which area in the design data image displayed in 1107 is being determined.
- the measurement target position is positioned in the electron beam irradiation area (step 1601), and an image of the target pattern is acquired based on the electron beam scanning (step 1602).
- pattern matching is executed using a template such as the design data image 502 (step 1603).
- Next, the ROI is evaluated. If it is determined that the target region (for example, the portion where the periodic pattern is interrupted) is included in the ROI, the next processing (beam scanning for measurement, etc.) is executed (step 1605). If the target region is not included in the ROI, that is, if it is determined that the alignment has failed, correct alignment is executed by a matching process using the ROI (step 1606). Even when the alignment has failed, if it can be determined that the correct position lies nearby, matching is performed over a narrow range using, as illustrated in FIG. 17, a template consisting of a selection area 1701 containing three ROIs, or a single ROI 1702. In this case, for example, by limiting the movement range of the template to three pitches of the line and space pattern (that is, by searching a narrower range than the template movement range of the first template matching), the processing speed and efficiency can be improved.
- FIG. 6 is a block diagram illustrating a process of performing matching by comparing ROI regions, which are obtained by applying dilation processing both to the image data generated based on the design data and to the image data generated based on the captured image.
- FIG. 7 is a diagram illustrating an example of an image from which both image data and ROI are extracted.
- Both the inspection image 701 and the design data image 702 are stored in the memory 1307, and ROIs are generated by the inspection image ROI generation unit 601 and the design data ROI generation unit 404 (both corresponding to the ROI generation unit 1312 in the configuration of FIG. 13).
- the inspection image ROI image 602 and the design data ROI image 605 are input to the alignment unit 406 and alignment is performed by a generally known method such as normalized correlation.
- the contour of ROI is extracted, and matching processing can be performed not by comparing pattern shapes on a sample but by comparing ROI shapes.
- The output data creation unit 1313 can generate position error information 704 between the inspection image ROI image 602 and the design data ROI image 605 as the alignment result 409 and display it on the display device 109.
- the output data creation unit 1313 creates output data for displaying matching success / failure, position error information, and the like on the display device.
- The display device 109 displays these together with additional information such as the recipe name, the image acquisition time, and sample coordinate information.
- the alignment result 704 after the position correction may be displayed on the display device 109 based on the position error information.
- FIG. 9 is a processing flow of the inspection image ROI generation unit 601.
- the difference from the flowchart of FIG. 8 is that the information to be input to the processing flow of FIG. 8 is created from the inspection image.
- image processing such as noise removal and edge enhancement is performed on the inspection image 701 to improve image quality.
- image processing is performed by the image processing unit 1306.
- the inspection image with improved image quality is converted into a binarized image.
- a region of interest (ROI) is obtained from the inspection image in the same procedure as in FIG. 8, and the inspection image ROI image 602 is generated.
- FIG. 10 is a diagram showing an outline of matching processing when a hole array which is an aspect of a repetitive pattern is used as a target image.
- Since the hole array is a repetitive pattern, the matching result may be shifted by one row or one column of the hole array.
- The degree of coincidence is expressed by the simple formula of Equation 1, where B is the total number of holes in the inspection image 1001 and A is the number of holes at which the holes in the design data image and in the inspection image do not overlap. In the failure image 1005 of FIG. 10, A is 5 and B is 30.
- Degree of coincidence = (1 − A/B) × 100 [%]   (Equation 1)
- the degree of coincidence may be obtained only for the inside of the ROI.
- a design data ROI image 1007 is generated by cutting out a design data image based on the alignment result.
- a matching degree calculation image 1010 is obtained by superimposing the ROI 1008 included in the design data ROI image 1007 on the alignment result 1006.
- When Equation 1 is applied to the coincidence calculation area 1011, A is 5 and B is 9, so the degree of coincidence is 44%. This shows that, by using the design data ROI, the success or failure of the alignment can be determined accurately.
- the pattern matching method and the pattern matching verification method described above can be applied particularly to a pattern inspection apparatus, a length measuring apparatus, and a semiconductor measurement and inspection system for inspecting a circuit pattern of a semiconductor device wafer or an exposure mask. .
- FIG. 18 is an example of a design data image in which a pattern based on design data is composed of vertical and horizontal line segments at equal intervals.
- The vertical pattern components 1802 are selectively dilated left and right until they overlap the adjacent patterns on either side.
- A characteristic region 1807 is obtained from the pattern density relationship at the boundary between the horizontal patterns 1803 and the dilated regions 1804 of the vertical patterns 1802.
- Likewise, the horizontal pattern components 1803 are selectively dilated up and down until they overlap the adjacent patterns above and below.
- A characteristic region 1807 is likewise obtained from the density relationship at the boundary between the vertical patterns 1802 and the dilated regions 1806 of the horizontal patterns 1803.
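- A sketch of the direction-selective dilation described for FIG. 18 follows, assuming the vertical and horizontal pattern components have already been separated into binary masks. The iteration count stands in for "until the neighboring pattern is reached" and would normally be derived from the pattern pitch, and the overlap of the two dilation results is used here only as a simple proxy for the characteristic boundary regions.

```python
import numpy as np
from scipy import ndimage

# 1x3 and 3x1 structuring elements dilate only left/right or only up/down.
HORIZONTAL_STRUCT = np.array([[1, 1, 1]], dtype=bool)
VERTICAL_STRUCT = np.array([[1], [1], [1]], dtype=bool)

def directional_dilation(vertical_patterns, horizontal_patterns, half_pitch):
    """Dilate vertical patterns sideways and horizontal patterns up/down,
    then return where the two dilation results meet (a proxy for the
    characteristic regions at the boundaries)."""
    dilated_vertical = ndimage.binary_dilation(
        vertical_patterns, structure=HORIZONTAL_STRUCT, iterations=half_pitch)
    dilated_horizontal = ndimage.binary_dilation(
        horizontal_patterns, structure=VERTICAL_STRUCT, iterations=half_pitch)
    return dilated_vertical & dilated_horizontal
```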
- FIG. 19 shows an example of performing weighted matching using the design data ROI in gray scale.
- the image indicating the design data ROI can be grayscale design data ROI 1901 instead of a black and white binary image.
- In the grayscale design data ROI 1901, the grayscale pixel value represents the weight given to a candidate matching position. The correct match is expected at the center of the grayscale design data ROI 1901, so the iso-value regions 1911, 1912, 1913, and 1914 are assigned pixel values that increase from the periphery of the ROI toward its center.
- the matching position is shifted upward by the width of the horizontal pattern 1905.
- In the correct matching result image 1906, obtained by matching the design data image 1902 with the captured SEM image 1903, the design data pattern and the edges of the SEM image overlap.
- A pixel where the design data pattern and an edge of the SEM image overlap is counted as a plus point, and a pixel where they deviate is counted as a minus point.
- These points are multiplied by the pixel values of the regions 1911, 1912, 1913, and 1914 in which the pixels are located. Matching failures can be reduced by taking, as the correct matching position, the candidate with the highest score obtained by weighting the degree of overlap between the design data pattern and the SEM image edges with the pixel values of the grayscale design data ROI 1901.
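- A minimal sketch of the weighted scoring described above: `design_edges` and `sem_edges` are assumed to be binary edge images of one candidate alignment, and `gray_roi` is the grayscale design data ROI 1901 with larger values toward its center; the plus/minus bookkeeping follows the text, while everything else is an assumption.

```python
import numpy as np

def weighted_matching_score(design_edges, sem_edges, gray_roi):
    """Score one candidate alignment: overlapping pixels count positively,
    deviated pixels negatively, both weighted by the grayscale ROI."""
    design = design_edges.astype(bool)
    sem = sem_edges.astype(bool)
    plus = design & sem        # design pattern overlaps an SEM edge
    minus = design & ~sem      # design pattern misses the SEM edge
    return float(gray_roi[plus].sum() - gray_roi[minus].sum())
```

- The candidate position with the highest such score would then be taken as the matching position.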
- FIG. 21 is a diagram for explaining a method in which the region of the design data ROI is selected as a selection region for evaluating the similarity between the inspection image and the design data in the method of pattern matching between the inspection image and the design data.
- a region in which the similarity evaluation between the inspection image and the design data is not performed is referred to as a mask processing region
- a matching method in which the similarity evaluation is not performed in the mask processing region is referred to as pattern matching with mask processing.
- FIG. 21A shows a processing flow of this method.
- the pattern matching with mask processing 2107 receives the inspection image 2101, the design data 2102, and the mask data 2106, and outputs a matching position 2108.
- the mask data 2106 is generated by the following process. First, the ROI image 2104 is generated by the design data ROI generation processing 2103 described in FIG. 8 based on the design data, and then the mask data is generated by the mask area generation processing 2105 based on the generated ROI image 2104.
- FIG. 21B is a diagram illustrating an image generated by pattern matching with mask processing.
- In pattern matching, an area having the same size as the design data used as the matching template is cut out of the inspection image 2120, and the similarity between the cut-out area and the design data is evaluated. The position of the cut-out area is moved over the inspection image (2122), the similarity at each position is calculated, and the position with the maximum similarity is taken as the matching position.
- the matching correct answer position is a rectangular broken line portion 2121.
- a round broken line portion in the inspection image 2120 is a characteristic region in the main image (other regions are repetitive patterns of line and space patterns).
- a characteristic area in the design data 2130 is a circular broken line portion in the design data 2130, and this area can be extracted by the ROI generation method described in FIG.
- the extracted ROI is an area indicated by a white circle area in the ROI image 2131 (design data in the area is drawn for reference only to indicate the position of the area, and is not actually present).
- In the matching process with mask processing, a flag indicating that similarity evaluation is to be performed (for example, 1) is attached to the ROI of the ROI image, and a flag indicating that similarity evaluation is not to be performed (for example, 0) is attached to the remaining area (the mask area).
- the matching process receives the inspection image 2120, the design data 2130, and the mask data 2140 as inputs. By using the mask data 2140, it is possible to perform evaluation using only characteristic areas in the design data for similarity evaluation with the inspection image.
- the flag of the mask area is not limited to binary, but may be gray scale (multi-value) as described in FIG.
- If the similarity evaluation of the pattern matching is a correlation calculation, a grayscale (multi-valued) weight can be applied to each pixel as its contribution rate to the correlation value.
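- The following is a sketch of similarity evaluation with mask processing, assuming `mask` holds 0 for masked pixels and a positive (possibly multi-valued) weight elsewhere; the weighted zero-mean correlation form is one reasonable choice among several, not the specific formula of the embodiment.

```python
import numpy as np

def masked_correlation(patch, template, mask):
    """Weighted zero-mean correlation; pixels with mask == 0 do not contribute."""
    w = mask.astype(float)
    wsum = w.sum()
    if wsum == 0:
        return 0.0
    p = patch - (w * patch).sum() / wsum       # weighted means
    t = template - (w * template).sum() / wsum
    num = (w * p * t).sum()
    denom = np.sqrt((w * p * p).sum() * (w * t * t).sum())
    return float(num / denom) if denom > 0 else 0.0
```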
- FIG. 22 is a diagram for explaining a method of generating mask data by a different method in the method of performing pattern matching with mask processing using the design data ROI described in FIG. 21 (Example 6).
- the process until the ROI is generated from the design data 2211 by the ROI generation method described in FIG. 8 is the same as that in FIG. 21 (Example 6).
- the extracted ROI (white circle area of the ROI image 2212)
- the other areas are set as mask areas.
- the vicinity of the broken line that is the design data in the ROI is set as the similarity evaluation region (white portion). This area is generated, for example, by expanding the area of the design data section.
- the white area is an area for similarity evaluation
- the other black area is a mask area.
- the excessive pattern 2242 can be excluded from the similarity evaluation by mask processing.
- pattern matching with high robustness that is not adversely affected by the excessive pattern 2242 can be performed.
- FIG. 23 is a diagram for explaining a method for automatically selecting a template region from design data using an ROI generated from the design data. With this method, a template that enables more robust matching can be automatically selected from design data.
- FIG. 23A is a diagram for explaining the processing flow for template selection.
- the ROI image 2303 is generated from the design data 2301 by the ROI generation processing 2302 described in FIG.
- a template 2305 is selected from the ROI image 2303 by template selection processing 2304.
- FIG. 23B is a diagram for explaining an example of template selection.
- a region 2302 indicated by a broken line is an ROI.
- a template is selected so that many ROI regions are included in the template.
- the region 2303 includes the most ROI, so the region 2303 is selected as a template.
- In pattern matching, if the template contains many characteristic areas, matching tends to be robust. Therefore, using as the template an area that contains many ROIs, which are characteristic areas, as in this method, makes the matching robust. If the template size is fixed, a template position that includes more ROIs within the template region may be selected.
- the template size and the template position may be selected so that more ROIs are included in the template within the allowable range of the template size.
- The criterion for how many ROIs are included is not limited to the total ROI area; the template may instead be selected so that the ratio of the ROI area to the template area is maximized.
- Alternatively, a weighted sum using the weighted (grayscale) ROI described with reference to FIG. 19 may be used.
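- A sketch of the automatic template selection of FIG. 23: slide a window of the given template size over the ROI image and keep the position whose window contains the largest ROI area, here approximated with a box-filter sum; the fixed window size and the center-to-corner conversion are assumptions.

```python
import numpy as np
from scipy import ndimage

def select_template_position(roi_mask, template_h, template_w):
    """Return an approximate top-left (row, col) of the template window
    containing the largest ROI area, using a uniform (box) filter."""
    roi = roi_mask.astype(float)
    # Sum of ROI pixels inside each template-sized window (centered filter).
    window_sum = ndimage.uniform_filter(roi, size=(template_h, template_w)) \
        * (template_h * template_w)
    center = np.unravel_index(np.argmax(window_sum), window_sum.shape)
    top = max(0, center[0] - template_h // 2)
    left = max(0, center[1] - template_w // 2)
    return top, left
```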
- FIG. 24 is a diagram for explaining a method of generating an ROI by a method different from that in FIG.
- FIG. 24A is a diagram for explaining the processing flow.
- a corner point extraction process 2402 is performed on the design data 2401, and a corner point peripheral area extraction process 2403 is performed on the result, thereby generating an ROI image 2404.
- FIG. 24B illustrates the contents of this process.
- a corner point 2406 (circled portion) is extracted as a characteristic region from the design data 2401.
- Corner extraction methods based on image processing include the Harris corner detection method and the SUSAN corner detection method, but the method is not limited to these; any corner detection method may be used.
- a corner point peripheral area 2407 (circled portion) is generated based on the corner point 2406. For example, this area can be generated by expanding the corner points.
- the generation method is not limited to the expansion process, and it is sufficient that a corner peripheral region can be generated.
- a corner peripheral area 2407 is an ROI (white area) of the ROI image 2404.
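- A sketch of the corner-point approach of FIG. 24 follows, using a Harris-style response computed with NumPy gradients. The smoothing sigma, the response threshold, and the dilation radius are assumptions, and any other corner detector (SUSAN, etc.) could be substituted, as the text notes.

```python
import numpy as np
from scipy import ndimage

def corner_roi(design_image, sigma=1.5, k=0.04, rel_threshold=0.01, dilate_iter=5):
    """Extract corner points with a Harris-style response and dilate them
    into corner-peripheral regions that serve as the ROI."""
    img = design_image.astype(float)
    gy, gx = np.gradient(img)
    # Smoothed structure-tensor components.
    sxx = ndimage.gaussian_filter(gx * gx, sigma)
    syy = ndimage.gaussian_filter(gy * gy, sigma)
    sxy = ndimage.gaussian_filter(gx * gy, sigma)
    det = sxx * syy - sxy * sxy
    trace = sxx + syy
    response = det - k * trace * trace
    corners = response > rel_threshold * response.max()
    # Dilate corner points into their peripheral regions (the ROI).
    return ndimage.binary_dilation(corners, iterations=dilate_iter)
```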
- FIG. 20 is a flowchart showing a process of extracting the ROI based on the distance conversion image.
- an edge image or a simulation image of design data is read from the design data storage medium 1314 (step 2001), and a distance conversion image is formed (step 2002).
- The distance-transformed image used in this example is formed by assigning to each pixel a signal corresponding to its distance from an edge, with the edge positions as the reference.
- the distance conversion image forming method itself is a known one, and various methods can be applied.
- The position farthest from any edge corresponds to a portion where the periodic pattern is interrupted, so by forming the distance-transformed image, one or more ROI candidate positions farthest from the edges can be identified.
- a position farthest from the edge, or a portion separated by a predetermined value or more from the edge is selected as the ROI center (step 2003).
- An advantage of using the distance-transformed image is that it can be applied even when the gap between the edges at the discontinuity (in the y direction) is smaller than the spacing between the lines (in the x direction).
- With the dilation-based approach, such a narrow discontinuity is filled in first, which makes it difficult to set the ROI at the discontinuity.
- Based on the distance image, portions at a specific distance from the edges can be selectively extracted; for example, by selectively extracting portions whose distance information differs from that of the portions closest to an edge, or from a predetermined distance condition (for example, the spacing between lines), an appropriate ROI center can be selected.
- a predetermined size is set as an ROI from this ROI center position (step 2004), and this measurement condition is registered in the recipe as an automatic measurement condition (step 2005).
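- A sketch of the distance-transform approach of FIG. 20 using SciPy's Euclidean distance transform; `edge_image` is assumed to be nonzero at edge pixels, and the ROI half-size is an illustrative parameter rather than the registered measurement condition.

```python
import numpy as np
from scipy import ndimage

def roi_from_distance_transform(edge_image, roi_half_size=32):
    """Pick the pixel farthest from any edge as the ROI center and return
    that (row, col) center plus a slice covering a fixed-size ROI around it."""
    edges = edge_image.astype(bool)
    # Distance of every pixel to the nearest edge pixel.
    distance = ndimage.distance_transform_edt(~edges)
    center = np.unravel_index(np.argmax(distance), distance.shape)
    r, c = center
    roi_slice = (slice(max(0, r - roi_half_size), r + roi_half_size),
                 slice(max(0, c - roi_half_size), c + roi_half_size))
    return center, roi_slice
```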
- FIG. 25 is a flowchart showing the process.
- a plurality of ROI evaluations are performed (step 2502).
- the number of ROIs whose evaluation values (matching degree etc.) satisfy a predetermined condition is determined (step 2503), and it is determined whether or not the number satisfies a predetermined condition (step 2504). For example, it is determined whether there are a predetermined number or more of ROIs with a matching degree equal to or greater than a predetermined value, or whether the number of ROIs with a mismatching degree equal to or greater than a predetermined value is equal to or less than a predetermined value.
- If the number of ROIs satisfies the predetermined condition, it is determined that the matching has succeeded (step 2505); if not, it is determined that the matching has failed (step 2506). If the matching is determined to have failed, the process returns to step 2501 to select the next candidate, generate an error message, or the like.
- Because matching is verified not simply from the evaluation value within a single ROI but from the number of ROIs, among a plurality of ROIs, that satisfy the condition, matching can be verified properly even when, for example, the pattern is partially deformed and the evaluation value of one ROI shows an abnormal value.
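- A sketch of the multi-ROI verification of FIG. 25; `roi_scores` is assumed to be the list of per-ROI matching degrees already computed, and both thresholds are illustrative.

```python
def verify_matching(roi_scores, score_threshold=0.7, min_good_rois=3):
    """Declare matching successful when enough ROIs individually score well."""
    good = sum(1 for s in roi_scores if s >= score_threshold)
    return good >= min_good_rois
```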
- FIG. 26 is a flowchart showing the process.
- A plurality of matching candidate positions are selected, for example, positions whose matching degree is equal to or higher than a predetermined value, or the top n matching positions.
- One matching position is then identified from the plurality of matching candidate positions based on the evaluation within the ROIs (step 2603). In this case, the candidate that best combines the first matching score with a statistic of the ROI evaluation values may be selected, or the determination may be based only on the evaluation values within the ROIs.
- ROI evaluation can be used not only for verification of matching but also for selecting an appropriate one from a plurality of matching candidates.
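- A sketch of candidate selection (FIG. 26), assuming each candidate carries its first-stage matching score and a list of per-ROI scores; combining the two with a simple weighted sum is an assumption, since the text allows either a combined criterion or the ROI scores alone.

```python
def select_candidate(candidates, weight=0.5):
    """candidates: list of (position, matching_score, roi_scores).
    Return the position whose combined score is highest."""
    def combined(candidate):
        _, matching_score, roi_scores = candidate
        roi_mean = sum(roi_scores) / len(roi_scores) if roi_scores else 0.0
        return weight * matching_score + (1.0 - weight) * roi_mean
    best = max(candidates, key=combined)
    return best[0]
```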
- FIG. 27 is a diagram showing an ROI 2701 including two line ends 2702 and 2703.
- Due to the pattern formation conditions and to effects such as OPC (Optical Proximity Correction), the relative position of the two line ends can change.
- pattern matching is executed after preparing templates 2704 and 2705 for evaluating ROI.
- normal pattern matching is executed (step 2801), and then pattern matching using the templates 2704 and 2705 is executed in the ROI 2701 (step 2802).
- Next, it is determined whether the matching result obtained using the plurality of (or three or more) templates satisfies a predetermined condition. If it does, the matching performed in step 2801 is determined to have succeeded (step 2805), and the process proceeds to the next step, such as pattern measurement. If it does not, the pattern matching in step 2801 is determined to have failed (step 2806), and processing such as selecting the next candidate or generating an error message is executed.
- Whether the matching result satisfies the predetermined condition is determined, for example, by whether locations with a matching degree equal to or higher than a predetermined value can be identified by the matching using the two templates. With such a determination, the presence or absence of the two patterns can be judged even if their relative positions are displaced, so matching verification can be performed stably regardless of the pattern formation conditions. Since one of the patterns may be formed outside the ROI, matching may be judged successful when only one pattern is detected, or, when only one pattern is detected, the verification may be performed with a lowered threshold for the matching degree of the other.
- Because the area searched for pattern matching is limited to the ROI, the variety of pattern shapes within the search target is limited, and matching verification can therefore be performed using templates of simple shapes.
- a plurality of verification templates having different relative positions between patterns may be prepared in advance, and when a matching degree equal to or higher than a predetermined value is obtained in any of them, it may be determined that matching is successful.
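- A sketch of the two-template verification of FIGS. 27 and 28, reusing the `match_template` helper sketched earlier in this document; the relaxed threshold used when only one line end is found inside the ROI follows the text, while the numeric values are assumptions.

```python
def verify_with_line_end_templates(roi_image, template_a, template_b,
                                   threshold=0.7, relaxed_threshold=0.5):
    """Run both small line-end templates inside the ROI and check the scores.
    match_template is the normalized-correlation sketch shown earlier."""
    _, score_a = match_template(roi_image, template_a)
    _, score_b = match_template(roi_image, template_b)
    if score_a >= threshold and score_b >= threshold:
        return True            # both line ends found: matching verified
    # One line end may fall outside the ROI; accept one strong match together
    # with a relaxed threshold on the other.
    if max(score_a, score_b) >= threshold and min(score_a, score_b) >= relaxed_threshold:
        return True
    return False
```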
- Reference signs: Electron microscope; 103 Control device; 104 Computer; 109 Display device; 201 Inspection image; 301 Design data image; 402 Design data; 404 Design data ROI generation unit; 405 Design data ROI image; 408 Alignment determination unit; 601 Inspection image ROI generation unit; 602 Inspection image ROI image
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Quality & Reliability (AREA)
- Chemical & Material Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- General Health & Medical Sciences (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Image Analysis (AREA)
- Testing Or Measuring Of Semiconductors Or The Like (AREA)
- Image Processing (AREA)
- Length-Measuring Devices Using Wave Or Particle Radiation (AREA)
Abstract
Description
In this case, the degree of coincidence need only be computed for the inside of the ROI. As concrete steps, first, the design data image is cut out based on the alignment result to generate the design data ROI image 1007. The coincidence calculation image 1010 is obtained by superimposing the ROI 1008 contained in the design data ROI image 1007 on the alignment result 1006. Applying Equation 1 to the coincidence calculation area 1011, A is 5 and B is 9, so the degree of coincidence is 44%. This shows that, by using the design data ROI, the success or failure of the alignment can be determined accurately.
103 Control device
104 Computer
109 Display device
201 Inspection image
301 Design data image
402 Design data
404 Design data ROI generation unit
405 Design data ROI image
408 Alignment determination unit
601 Inspection image ROI generation unit
602 Inspection image ROI image
Claims (25)
1. A pattern matching apparatus comprising an image processing unit that executes pattern matching on an image using a template formed based on design data, wherein the image processing unit, after executing the pattern matching, determines whether the image contained in a predetermined evaluation region within the template satisfies a predetermined condition.
2. The pattern matching apparatus according to claim 1, wherein the template formed based on the design data is formed by applying dilation processing to an image obtained based on the design data.
3. The pattern matching apparatus according to claim 2, wherein the evaluation region is obtained by extracting the portions that were not filled in by the dilation processing.
4. The pattern matching apparatus according to claim 3, wherein the evaluation region is obtained by dilating the unfilled portions.
5. The pattern matching apparatus according to claim 1, wherein the image processing unit performs the determination according to whether a predetermined pattern is contained in the evaluation region.
6. The pattern matching apparatus according to claim 1, wherein the template contains a repetitive pattern.
7. The pattern matching apparatus according to claim 6, wherein the evaluation region contains a portion where the periodicity of the repetitive pattern is interrupted.
8. The pattern matching apparatus according to claim 1, wherein the evaluation region is small compared with the template.
9. The pattern matching apparatus according to claim 1, wherein the template is formed by selectively applying dilation processing in the vertical or horizontal direction to an image obtained based on the design data.
10. The pattern matching apparatus according to claim 1, wherein the image processing unit generates the evaluation region such that the pixel values corresponding to weights differ between its periphery and its center.
11. The pattern matching apparatus according to claim 10, wherein the image processing unit determines whether a predetermined pattern is contained in the evaluation region using a value computed with the pixel values of the evaluation region.
12. The pattern matching apparatus according to claim 1, wherein the evaluation region is extracted by applying dilation processing to corner points extracted from the design data.
13. A pattern matching apparatus comprising an image processing unit that executes pattern matching on a captured image using a template formed based on design data, wherein the image processing unit executes the pattern matching by applying dilation processing to both an image obtained based on the design data and the captured image, and comparing, as matching targets, the regions not filled in by the dilation processing.
14. The pattern matching apparatus according to claim 13, wherein the matching targets are obtained by dilating the unfilled portions.
15. The pattern matching apparatus according to claim 13, wherein the positions of the matching targets correspond to positions where the periodicity of a repetitive pattern is interrupted.
16. A computer program that causes a computer to execute pattern matching on an image using a template formed based on design data, the program causing the computer, after executing the pattern matching, to determine whether the image contained in a predetermined evaluation region within the template satisfies a predetermined condition.
17. The computer program according to claim 16, wherein the program causes the computer to generate the template by applying dilation processing to an image obtained based on the design data.
18. The computer program according to claim 17, wherein the program causes the computer to use, as the evaluation region, the portions extracted as not having been filled in by the dilation processing.
19. The computer program according to claim 18, wherein the program causes the computer to generate the evaluation region by dilating the unfilled portions.
20. A computer program that causes a computer to execute pattern matching on an image using a template formed based on design data, the program causing the computer to apply dilation processing to both an image obtained based on the design data and the captured image, and to execute the pattern matching by comparing, as matching targets, the regions not filled in by the dilation processing.
21. The computer program according to claim 20, wherein the matching targets are obtained by dilating the unfilled portions.
22. The computer program according to claim 21, wherein the positions of the matching targets correspond to positions where the periodicity of a repetitive pattern is interrupted.
23. A pattern matching apparatus comprising an image processing unit that executes pattern matching on a captured image using a template formed based on design data, wherein the image processing unit applies dilation processing to an image obtained based on the design data and executes the pattern matching between the image obtained based on the design data and the captured image selectively for the regions not filled in by the dilation processing.
24. The pattern matching apparatus according to claim 22, wherein, within the regions not filled in by the dilation processing, the pattern matching between the image obtained based on the design data and the captured image is executed selectively around regions where a pattern design exists in the design data.
25. A pattern matching apparatus comprising an image processing unit that executes pattern matching on a captured image using a template formed based on design data, wherein the image processing unit applies dilation processing to an image obtained based on the design data and executes the pattern matching between the image obtained based on the design data and the captured image using a template in which the proportion of regions not filled in by the dilation processing is large.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/981,963 US10535129B2 (en) | 2011-01-26 | 2011-12-07 | Pattern matching apparatus and computer program |
| JP2012554506A JP5707423B2 (ja) | 2011-12-07 | Pattern matching apparatus and computer program |
| KR1020137019857A KR101522804B1 (ko) | 2011-12-07 | Pattern matching apparatus and recording medium |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2011-013562 | 2011-01-26 | ||
| JP2011013562 | 2011-01-26 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2012101717A1 true WO2012101717A1 (ja) | 2012-08-02 |
Family
ID=46580330
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2011/006834 Ceased WO2012101717A1 (ja) | 2011-12-07 | Pattern matching apparatus and computer program |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US10535129B2 (ja) |
| JP (1) | JP5707423B2 (ja) |
| KR (1) | KR101522804B1 (ja) |
| WO (1) | WO2012101717A1 (ja) |
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2015213749A (ja) * | 2014-04-21 | 2015-12-03 | Toshiba Corp | X-ray computed tomography apparatus and imaging condition setting support apparatus |
| JP2016014994A (ja) * | 2014-07-01 | 2016-01-28 | Seiko Epson Corp | Robot system |
| JP2016033694A (ja) * | 2014-07-30 | 2016-03-10 | Toshiba Tec Corp | Object recognition apparatus and object recognition program |
| JP2017033365A (ja) * | 2015-08-04 | 2017-02-09 | Fujitsu Semiconductor Ltd | Detection method, detection apparatus, and detection program |
| WO2020195304A1 (ja) * | 2019-03-22 | 2020-10-01 | Tasmit Inc | Pattern matching method |
| JP2022013853A (ja) * | 2020-06-30 | 2022-01-18 | Sick IVP AB | Generation of a second object model based on a first object model for use in object matching |
| JP2023032759A (ja) * | 2021-08-27 | 2023-03-09 | SCREEN Holdings Co Ltd | Drawing system, drawing method, and program |
| JP2023094661A (ja) * | 2021-12-24 | 2023-07-06 | Toyota Motor Corp | Inspection program, inspection method, inspection apparatus, trained model, machine learning model training method, and machine learning model training program |
| WO2024142323A1 (ja) * | 2022-12-27 | 2024-07-04 | Hitachi High-Tech Corp | Computer system, sample observation method, and program |
Families Citing this family (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10483081B2 (en) * | 2014-10-22 | 2019-11-19 | Kla-Tencor Corp. | Self directed metrology and pattern classification |
| TWI684225B (zh) * | 2015-08-28 | 2020-02-01 | 美商克萊譚克公司 | 自定向計量和圖樣分類 |
| EP3367166A1 (en) * | 2017-02-24 | 2018-08-29 | ASML Netherlands B.V. | Method of measuring variation, inspection system, computer program, and computer system |
| JP7080771B2 (ja) * | 2018-08-29 | 2022-06-06 | Mimaki Engineering Co Ltd | Machining data generation program, machining data generation system, and machining data generation method |
| CN113557140B (zh) * | 2019-03-11 | 2024-09-13 | 王山国际有限公司 | 用于装饰层压板制造中的配准对准精度的系统及方法 |
| US11023770B2 (en) * | 2019-09-23 | 2021-06-01 | Hong Kong Applied Science And Technology Research Institute Co., Ltd. | Systems and methods for obtaining templates for tessellated images |
| US20230117237A1 (en) * | 2020-02-13 | 2023-04-20 | Asml Netherlands B.V. | Contour extraction method from inspection image in multiple charged-particle beam inspection |
| WO2021166142A1 (ja) * | 2020-02-20 | 2021-08-26 | Hitachi High-Tech Corp | Pattern matching apparatus, pattern measurement system, and non-transitory computer-readable medium |
| CN115380207A (zh) * | 2020-04-17 | 2022-11-22 | Tasmit Inc | Pattern matching method |
| WO2022009357A1 (ja) * | 2020-07-09 | 2022-01-13 | Hitachi High-Tech Corp | Pattern matching apparatus, pattern measurement system, and pattern matching program |
| CN114202578A (zh) * | 2020-09-18 | 2022-03-18 | ChangXin Memory Technologies Inc | Wafer alignment method and apparatus |
| US12062164B2 (en) * | 2020-12-30 | 2024-08-13 | Samsung Electronics Co., Ltd. | Pattern analysis system and method of manufacturing semiconductor device using the same |
| JP7729146B2 (ja) * | 2021-09-28 | 2025-08-26 | Fujifilm Business Innovation Corp | Information processing apparatus, information processing system, and program |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH06215139A (ja) * | 1993-01-13 | 1994-08-05 | Kobe Steel Ltd | Graphic recognition method |
| JP2002267441A (ja) * | 2001-03-07 | 2002-09-18 | Fuji Heavy Ind Ltd | Stereo matching method and monitoring apparatus |
| JP2007121181A (ja) * | 2005-10-31 | 2007-05-17 | Toshiba Corp | Pattern shape evaluation method and pattern shape evaluation program |
| JP2010009437A (ja) * | 2008-06-30 | 2010-01-14 | Hitachi High-Technologies Corp | Matching method that copes with pattern disappearance and inspection apparatus using the same |
| WO2010114117A1 (ja) * | 2009-04-03 | 2010-10-07 | Hitachi High-Technologies Corp | Composite image creation method and apparatus |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6987873B1 (en) * | 1998-07-08 | 2006-01-17 | Applied Materials, Inc. | Automatic defect classification with invariant core classes |
| JP4199939B2 (ja) | 2001-04-27 | 2008-12-24 | Hitachi Ltd | Semiconductor inspection system |
| JP4215454B2 (ja) | 2001-07-12 | 2009-01-28 | Hitachi Ltd | Sample unevenness determination method and charged particle beam apparatus |
| JP4564728B2 (ja) * | 2003-07-25 | 2010-10-20 | Hitachi High-Technologies Corp | Circuit pattern inspection apparatus |
| JP4926116B2 (ja) * | 2008-04-16 | 2012-05-09 | Hitachi High-Technologies Corp | Image inspection apparatus |
| JP5357725B2 (ja) * | 2009-12-03 | 2013-12-04 | Hitachi High-Technologies Corp | Defect inspection method and defect inspection apparatus |
-
2011
- 2011-12-07 WO PCT/JP2011/006834 patent/WO2012101717A1/ja not_active Ceased
- 2011-12-07 US US13/981,963 patent/US10535129B2/en active Active
- 2011-12-07 KR KR1020137019857A patent/KR101522804B1/ko active Active
- 2011-12-07 JP JP2012554506A patent/JP5707423B2/ja active Active
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH06215139A (ja) * | 1993-01-13 | 1994-08-05 | Kobe Steel Ltd | Graphic recognition method |
| JP2002267441A (ja) * | 2001-03-07 | 2002-09-18 | Fuji Heavy Ind Ltd | Stereo matching method and monitoring apparatus |
| JP2007121181A (ja) * | 2005-10-31 | 2007-05-17 | Toshiba Corp | Pattern shape evaluation method and pattern shape evaluation program |
| JP2010009437A (ja) * | 2008-06-30 | 2010-01-14 | Hitachi High-Technologies Corp | Matching method that copes with pattern disappearance and inspection apparatus using the same |
| WO2010114117A1 (ja) * | 2009-04-03 | 2010-10-07 | Hitachi High-Technologies Corp | Composite image creation method and apparatus |
Cited By (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2015213749A (ja) * | 2014-04-21 | 2015-12-03 | Toshiba Corp | X-ray computed tomography apparatus and imaging condition setting support apparatus |
| JP2016014994A (ja) * | 2014-07-01 | 2016-01-28 | Seiko Epson Corp | Robot system |
| JP2016033694A (ja) * | 2014-07-30 | 2016-03-10 | Toshiba Tec Corp | Object recognition apparatus and object recognition program |
| JP2017033365A (ja) * | 2015-08-04 | 2017-02-09 | Fujitsu Semiconductor Ltd | Detection method, detection apparatus, and detection program |
| WO2020195304A1 (ja) * | 2019-03-22 | 2020-10-01 | Tasmit Inc | Pattern matching method |
| JP7201751B2 (ja) | 2020-06-30 | 2023-01-10 | Sick IVP AB | Generation of a second object model based on a first object model for use in object matching |
| JP2022013853A (ja) * | 2020-06-30 | 2022-01-18 | Sick IVP AB | Generation of a second object model based on a first object model for use in object matching |
| US11928184B2 (en) | 2020-06-30 | 2024-03-12 | Sick Ivp Ab | Generation of a second object model based on a first object model for use in object matching |
| JP2023032759A (ja) * | 2021-08-27 | 2023-03-09 | SCREEN Holdings Co Ltd | Drawing system, drawing method, and program |
| JP7701216B2 (ja) | 2021-08-27 | 2025-07-01 | SCREEN Holdings Co Ltd | Drawing system, drawing method, and program |
| JP2023094661A (ja) * | 2021-12-24 | 2023-07-06 | Toyota Motor Corp | Inspection program, inspection method, inspection apparatus, trained model, machine learning model training method, and machine learning model training program |
| JP7687201B2 (ja) | 2021-12-24 | 2025-06-03 | Toyota Motor Corp | Inspection program, inspection method, and inspection apparatus |
| WO2024142323A1 (ja) * | 2022-12-27 | 2024-07-04 | Hitachi High-Tech Corp | Computer system, sample observation method, and program |
| TWI888975B (zh) * | 2022-12-27 | 2025-07-01 | Hitachi High-Tech Corp | Computer system, sample observation method, and program |
Also Published As
| Publication number | Publication date |
|---|---|
| KR20130102116A (ko) | 2013-09-16 |
| JP5707423B2 (ja) | 2015-04-30 |
| US20140023265A1 (en) | 2014-01-23 |
| JPWO2012101717A1 (ja) | 2014-06-30 |
| KR101522804B1 (ko) | 2015-05-26 |
| US10535129B2 (en) | 2020-01-14 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5707423B2 (ja) | Pattern matching apparatus and computer program | |
| US10937146B2 (en) | Image evaluation method and image evaluation device | |
| JP5313939B2 (ja) | Pattern inspection method, pattern inspection program, and electronic device inspection system | |
| JP5543872B2 (ja) | Pattern inspection method and pattern inspection apparatus | |
| JP5639925B2 (ja) | Pattern matching apparatus and computer program | |
| US9141879B2 (en) | Pattern matching method, image processing device, and computer program | |
| TWI475187B (zh) | Image processing devices and computer programs | |
| CN105074896B (zh) | 图案测定装置以及半导体测量系统 | |
| US8577125B2 (en) | Method and apparatus for image generation | |
| JP5948138B2 (ja) | Defect analysis support apparatus, program executed by the defect analysis support apparatus, and defect analysis system | |
| US20180005363A1 (en) | Pattern Matching Device and Computer Program for Pattern Matching | |
| US20130117723A1 (en) | Pattern shape evaluation method, pattern shape evaluation device, pattern shape evaluating data generation device and semiconductor shape evaluation system using the same | |
| JP2020004443A (ja) | Technique for measuring overlay between layers of a multilayer structure | |
| KR20120068128A (ko) | 패턴의 결함 검출 방법 및 이를 수행하기 위한 결함 검출 장치 | |
| JP2009198338A (ja) | Electron microscope system and pattern dimension measurement method using the same | |
| WO2014129018A1 (ja) | Character recognition apparatus, character recognition method, and recording medium | |
| JP5144415B2 (ja) | Defect review apparatus, defect review method, and defect review execution program | |
| JP2013140468A (ja) | Pattern matching apparatus, inspection system, and computer program | |
| US8045807B2 (en) | Pattern edge detecting method and pattern evaluating method | |
| JP2013200319A (ja) | Electron microscope system and pattern dimension measurement method using the same | |
| KR20220123467A (ko) | Pattern matching apparatus, pattern measurement system, and non-transitory computer-readable medium | |
| JP2006226676A (ja) | Mark recognition apparatus and mark recognition method | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11857053 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2012554506 Country of ref document: JP Kind code of ref document: A |
|
| ENP | Entry into the national phase |
Ref document number: 20137019857 Country of ref document: KR Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 13981963 Country of ref document: US |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 11857053 Country of ref document: EP Kind code of ref document: A1 |