WO2010114117A1 - Composite image creation method and apparatus - Google Patents
Composite image creation method and apparatus
- Publication number
- WO2010114117A1 (PCT/JP2010/056062)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- pattern
- imaging
- images
- deformation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/80—Geometric correction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01J—ELECTRIC DISCHARGE TUBES OR DISCHARGE LAMPS
- H01J2237/00—Discharge tubes exposing object to beam, e.g. for analysis treatment, etching, imaging
- H01J2237/22—Treatment of data
- H01J2237/221—Image processing
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01J—ELECTRIC DISCHARGE TUBES OR DISCHARGE LAMPS
- H01J2237/00—Discharge tubes exposing object to beam, e.g. for analysis treatment, etching, imaging
- H01J2237/26—Electron or ion microscopes
- H01J2237/28—Scanning microscopes
- H01J2237/2813—Scanning microscopes characterised by the application
- H01J2237/2817—Pattern inspection
Definitions
- the present invention relates to a technique for inspecting a pattern with an electron microscope or the like, and more particularly, to a technique for performing panoramic synthesis that generates a single image by combining a plurality of images.
- CD-SEM: critical-dimension scanning electron microscope
- OPC: optical proximity correction
- the captured image fed back to the OPC simulation needs to cover a high-magnification field of view of 2 microns × 2 microns to 8 microns × 8 microns. At a resolution of 1 nm per pixel, this corresponds to several thousand pixels on a side.
- the edges of the images are joined so as to overlap with the edges of adjacent images. Therefore, the electron beam is irradiated a plurality of times in the area on the imaging target corresponding to the bonded area of the image.
- the resist shrinks due to overlapping irradiation of electron beams, and the pattern is deformed. Even if such image data is fed back to the OPC simulation, the accuracy of the OPC simulation cannot be improved.
- Such pattern deformation differs depending on the electron dose, resist material, pattern shape, etc., and is difficult to predict.
- An object of the present invention is to provide a composite image creation method and apparatus capable of avoiding the deformation of a pattern caused by overlapping irradiation of electron beams when panoramic composition is performed on images captured in a plurality of divided shots.
- a single image is generated by joining a plurality of images
- a single image is generated by superimposing the joining regions at the edges of two adjacent images.
- the joining area of the image with the earlier imaging order is left, and the joining area of the image with the later imaging order is deleted.
- in the joining region of an image with an earlier imaging order, the number of electron beam irradiations is smaller than in the joining region of an image with a later imaging order, so the pattern deformation due to the electron beam irradiation is smaller.
- the deformation of the pattern due to the electron beam irradiation is corrected in the joining region of the image with the later imaging order.
- the relationship between the number of electron beam irradiations and the amount of pattern deformation is determined in advance. Based on the pattern deformation information, the pattern in the bonding area is corrected.
- Another object is to provide an imaging apparatus and an imaging method capable of avoiding pattern deformation in regions subjected to overlapping irradiation of electron beams when panoramic synthesis is performed on images captured in a plurality of divided shots. By accounting for the pattern deformation caused by imaging, a stitched image of high accuracy can be obtained.
- FIG. is a diagram showing the concept of the calculation.
- Reference signs: binarization processing unit; 253 … expansion processing unit; 254 … matching processing unit; 255 … template pattern storage unit; 256 … expansion processing unit; 261 … smoothing processing unit; 262 … binarization processing unit; 263 … area extraction unit; 264 … area copy deformation unit; 265 … image pasting processing unit; 266 … connected component extraction unit; 267 … closed figure filling unit; 268 … expansion processing unit; 2671 … connected component selection unit; 2672 … closed figure generation unit; 2673 … filling unit; 2641 … image selection unit; 2642 … bilinear interpolation unit; 2643 … storage unit; s0 … position information; s1 … image data; s2 … pattern part; s3 … deformation measurement position
- the configuration of the first example of the composite image creation apparatus of the present invention will be described with reference to FIG.
- the composite image creation device of this example includes an imaging device 11, an imaging control device 12, and an image processing unit 13.
- the image processing unit 13 includes an image memory 1, an image composition unit 2, an imaging order information storage unit 3, a deformation information storage unit 4, an imaging position information storage unit 5, and a design data storage unit 6.
- the imaging device 11 may be a scanning electron microscope (SEM) or a length measurement scanning electron microscope (CD-SEM).
- SEM scanning electron microscope
- CD-SEM length measurement scanning electron microscope
- the scanning electron microscope irradiates a mask or wafer to be imaged with an electron beam, detects secondary electrons emitted therefrom, and acquires image data.
- the pattern to be imaged is divided and imaged in a plurality of shots, and the resulting images are combined to generate one image. Therefore, a plurality of divided images are obtained as image data. When one pattern is imaged in nine divisions, data of nine divided images is obtained.
- the imaging control device 12 sets the imaging position and imaging order of a plurality of divided images.
- the image memory 1 stores image data acquired by the imaging device 11. For example, when one pattern is imaged by dividing it into nine, nine divided images are stored in the image memory 1.
- the imaging position information storage unit 5 stores the imaging positions of a plurality of divided images provided from the imaging control device 12.
- the imaging order information storage unit 3 stores the imaging order of a plurality of divided images provided from the imaging control device 12.
- An example of information stored in the imaging position information storage unit 5 and the imaging order information storage unit 3 will be described with reference to FIG.
- the deformation information storage unit 4 stores resist deformation information caused by electron beam irradiation during imaging.
- An example of deformation information stored in the deformation information storage unit 4 will be described later with reference to FIG.
- the design data storage unit 6 stores design data that is a basis of a wiring pattern. That is, a wide range of pattern information including the wiring pattern to be imaged is stored.
- based on the information stored in the imaging position information storage unit 5, the imaging order information storage unit 3, the deformation information storage unit 4, and the design data storage unit 6, the image composition unit 2 combines the plurality of image data stored in the image memory 1 to generate a single image. Furthermore, when compositing the images, the image composition unit 2 corrects the wiring pattern in the joint areas of the divided images. The wiring pattern information corrected in this way is stored in the deformation information storage unit 4. Details of the image composition unit 2 will be described with reference to FIG.
- the image processing unit 13 of the present invention may be configured by a computer or an arithmetic device. Further, the processing in the image processing unit 13 may be executed using software. That is, the software may be executed by a computer, or may be incorporated into an LSI and performed by hardware processing.
- FIG. 2 shows the configuration of a second example of the composite image creation apparatus of the present invention.
- the imaging position information storage unit 5 is not provided. Instead, imaging position information is stored in the imaging order information storage unit 3.
- the configuration of the third example of the composite image creating apparatus of the present invention will be described with reference to FIG.
- the composite image creation device of this example includes an imaging device 11, an imaging control device 12, and an image processing unit 13.
- the image processing unit 13 includes an image memory 1, an imaging order information storage unit 3, a deformation information storage unit 4, a design data storage unit 6, and a deformation information generation unit 7.
- the deformation information generation unit 7 inputs the imaging order information from the imaging order information storage unit 3, receives the image data from the image memory 1, and corrects the wiring pattern in the joint regions of the divided images.
- the wiring pattern information corrected in this way is stored in the deformation information storage unit 4.
- the processing in the deformation information generation unit 7 is the same as the processing for correcting the wiring pattern in the joint areas of the divided images in the image composition unit 2 shown in FIGS.
- the image composition process in the image composition unit 2 may be the same as the process in the correction unit illustrated in FIG. 13, for example.
- the imaging order information storage unit 3 stores an imaging order information table 301.
- the imaging order information table 301 can be realized by a memory table that stores image file names using the order of the imaging order as an address.
- the imaging position information storage unit 5 stores an imaging position information table 302.
- the imaging position information table 302 can be realized by a memory table that stores imaging positions (x, y coordinate values) at addresses associated with image file names.
- the imaging order information and imaging position information table 303 has an imaging order, an image file name, and an imaging position. Since the imaging order corresponds to the image file name on a one-to-one basis, the imaging order information and the imaging position information can be stored in one table 303.
- the imaging order information and imaging position information table 303 of this example may be stored in the imaging order information storage unit 3 of the second example of the composite image creation apparatus of the present invention shown in FIG.
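As a sketch of how the combined table 303 might be held in software (the file names, coordinates, and field names below are illustrative assumptions, not from the patent), a single list indexed by imaging order can store both the file name and the imaging position, exploiting the one-to-one correspondence described above:

```python
# Sketch of the combined imaging-order / imaging-position table 303.
# Entries are ordered by imaging order, so (order - 1) indexes the list.
table_303 = [
    {"order": 1, "file": "img_a.bmp", "pos": (0, 0)},
    {"order": 2, "file": "img_b.bmp", "pos": (480, 0)},
    {"order": 3, "file": "img_c.bmp", "pos": (960, 0)},
]

def lookup_by_order(table, n):
    """Return the image file name and imaging position for imaging order n."""
    entry = table[n - 1]
    return entry["file"], entry["pos"]

print(lookup_by_order(table_303, 2))  # ('img_b.bmp', (480, 0))
```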
- FIG. 4 shows an example of pattern deformation information stored in the deformation information storage unit 4.
- the deformation information storage unit 4 stores a deformation information table.
- the deformation information table includes, for example, the position of the bonding region, the number of electron beam irradiations, the pattern portion, the deformation amount measurement position, and the pattern deformation amount.
- the position of the bonding area indicates the position of the bonding area in the image, and is, for example, an upper part, a lower part, a right part, or a left part.
- the number of times of electron beam irradiation represents the number of times of electron beam irradiation with respect to the bonding region.
- the pattern part represents a portion of the pattern, and is, for example, an end point (end of a line), a corner, a straight section, a rectangular section, a hatched section, or the like.
- the deformation amount measurement position represents the position at which the deformation amount of the pattern is measured. For example, in the case of a line end point, it is the line width; in the case of a rectangular pattern, it is the width on each of the upper, lower, left, and right sides.
- the measurement position may be a distance from the reference point to a specific position of the pattern. A method for measuring the deformation will be described with reference to FIGS. 12A and 12B.
- the pattern deformation amount indicates the deformation dimension of the pattern, and is usually a reduction amount. When image composition is performed, the deformation information table is used for joining, that is, pasting.
- for example, the joining-area position is encoded in 3 bits, the number of times of electron beam irradiation in 2 bits, the pattern part in 3 bits, and the position within the pattern in 3 bits, for a total of 11 bits.
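The bit-widths above suggest a packed table key. A minimal sketch of such packing follows; the field order and the specific numeric encodings are assumptions for illustration, not taken from the patent:

```python
# Pack the deformation-table key into 11 bits:
# 3 bits joining-area position, 2 bits irradiation count,
# 3 bits pattern part, 3 bits measurement position within the pattern.
# (Field order and encodings are illustrative assumptions.)

def pack_key(region_pos, n_irrad, pattern_part, meas_pos):
    assert region_pos < 8 and n_irrad < 4 and pattern_part < 8 and meas_pos < 8
    return (region_pos << 8) | (n_irrad << 6) | (pattern_part << 3) | meas_pos

key = pack_key(region_pos=1, n_irrad=2, pattern_part=3, meas_pos=4)
print(key)          # 412
print(key < 2**11)  # True: fits in 11 bits
```

Such a key lets the deformation amount be looked up from a flat 2048-entry table in one memory access, which fits the hardware (LSI) implementation mentioned earlier.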
- the test pattern is imaged a plurality of times, and each part of the test pattern image is measured for each imaging.
- reference points such as the center line of each pattern are used.
- a difference value of the length measurement value is obtained from the results of the previous and current imaging. This is the amount of deformation.
- the deformation of the pattern is basically reduction.
- FIG. 12A shows the case of a pattern shape.
- a contour line indicated by a broken line indicates an image before electron beam irradiation
- a contour line indicated by a solid line indicates an image after electron beam irradiation.
- the difference in distance from the reference point c before and after electron beam irradiation is obtained.
- the upper difference D1 is obtained at the upper left corner
- the lower difference D2 and the left difference D3 are obtained at the lower left corner.
- the right side difference D4 is obtained.
- FIG. 12B shows the case of the end point of the line.
- a line a1 indicated by a broken line indicates an image before electron beam irradiation
- a line a2 indicated by a solid line indicates an image after electron beam irradiation.
- a difference in distance from the reference point c before and after the electron beam irradiation to the contour of the line is obtained.
- the line end point difference D1 may be obtained, and the line width differences D3 and D4 may also be obtained. If it is unclear whether the contour a2 represents a rectangular pattern or the end point of a line, this is determined by measuring the line width L.
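The measurement described for FIGS. 12A and 12B amounts to differencing the distance from the reference point c to the contour before and after irradiation. A minimal one-axis sketch (the coordinates below are illustrative, not from the patent):

```python
# Deformation amount along one axis: distance from reference point c to the
# contour before irradiation minus the distance after irradiation.
# Shrink (the usual case) yields a positive value.

def deformation(ref, before_pt, after_pt):
    return abs(before_pt - ref) - abs(after_pt - ref)

c = 50.0                                             # e.g. pattern center line
d1 = deformation(c, before_pt=80.0, after_pt=77.5)   # upper edge contour
print(d1)  # 2.5 (shrink toward the center)
```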
- With reference to FIGS. 7A and 7B, a case where a single panoramic image is generated by joining a plurality of images in the composite image creating apparatus according to the present invention will be described.
- the image capturing order and the image joining order will be described.
- Two areas 701 and 702 arranged vertically are set in the area 700 on the imaging target.
- the region 701 includes a region a and a region b
- the region 702 includes a region b and a region c.
- the two areas 701 and 702 overlap in the area b.
- images are taken in the order of areas 701 and 702.
- one image is generated by one electron beam irradiation.
- each of the areas a and b is irradiated with the electron beam once; when the area 702 is imaged next, the area b is irradiated with the electron beam a second time.
- the region c is irradiated with the electron beam once. In the area b, the pattern may therefore be deformed.
- Reference numeral 703 indicates images 711 and 712 obtained by imaging the regions 701 and 702, respectively.
- the image 711 includes a non-bonded area 711a and a bonded area 711b
- the image 712 includes a non-bonded area 712a and a bonded area 712b.
- the bonding region 712b of the image 712 is an image portion corresponding to the region b, and is thus an image obtained by two electron beam irradiations. Accordingly, there is a possibility that an image of a deformed pattern is obtained in the joint region 712b of the image 712.
- the two images 711 and 712 are joined to generate panoramic images 704 and 705.
- the panoramic image 704 is synthesized by joining the image 712 captured later on the image 711 captured earlier. In this case, in the bonding area of the two images, the bonding area 711b is erased and the bonding area 712b remains. Accordingly, the panoramic image 704 includes a joint region 712b of the image 712 captured later. Therefore, when joining, it is necessary to correct the pattern in the joining region 712b of the image 712.
- the panoramic image 705 is synthesized by joining the previously captured image 711 so as to overlap the image 712 captured later. In this case, in the overlapping area of the two images, the bonding area 711b remains and the bonding area 712b is erased. Accordingly, the panoramic image 705 includes a joint region 711b of the image 711 captured earlier.
- the panoramic image 705 is composed of images obtained by one electron beam irradiation. Therefore, when joining, it is not necessary to correct the pattern in the joining region of the two images.
- two areas 721 and 722 arranged side by side are set in the area 720 on the imaging target.
- the region 721 includes a region a and a region b
- the region 722 includes a region b and a region c.
- the two areas 721 and 722 overlap in the area b.
- Reference numeral 723 indicates images 731 and 732 obtained by imaging the regions 721 and 722, respectively.
- the two images 731 and 732 are joined to generate panoramic images 724 and 725.
- the panoramic image 724 is synthesized by joining the previously captured image 731 so as to overlap the image 732 captured later. In this case, in the overlapping area of the two images, the bonding area 731b remains and the bonding area 732b is erased. Accordingly, the panoramic image 724 includes the joint region 731b of the image 731 captured previously.
- the panoramic image 724 is composed of images obtained by one electron beam irradiation. Therefore, when joining, it is not necessary to correct the pattern in the joining region of the two images.
- the panoramic image 725 is synthesized by joining the image 732 picked up later on the image 731 picked up earlier.
- the junction region 731b is erased and the junction region 732b remains in the overlapping region of the two images. Accordingly, the panoramic image 725 includes a joint region 732b of the image 732 captured later. Therefore, when joining, it is necessary to correct the pattern in the joining region 732b of the image 732.
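The effect of the paste order in FIGS. 7A and 7B can be sketched with toy images (sizes and pixel values below are illustrative): pasting the later image first and the earlier image on top leaves the earlier image's joining region, which received only one irradiation, in the final panorama, so no correction is needed.

```python
# Two horizontally adjacent fields with an overlap of OV columns.
H, W, OV = 4, 6, 2
early = [[1] * W for _ in range(H)]   # image 731 (imaged first, value 1)
late  = [[2] * W for _ in range(H)]   # image 732 (imaged second, value 2)

canvas = [[0] * (2 * W - OV) for _ in range(H)]
for y in range(H):
    for x in range(W):
        canvas[y][W - OV + x] = late[y][x]   # paste the later image first
for y in range(H):
    for x in range(W):
        canvas[y][x] = early[y][x]           # earlier image on top: its overlap wins

print(canvas[0])  # columns W-OV..W-1 (the joint region) come from the earlier image
```

Reversing the two paste loops would keep the later image's joint region instead (panoramic images 704 / 725), which is the case where the deformation correction described above becomes necessary.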
- With reference to FIGS. 8A and 8B, the relationship between the imaging order and the number of times of electron beam irradiation in the joining areas will be described.
- Nine areas a to i are set on the imaging target 800.
- the dimensions of all the regions a to i are the same, the horizontal dimension is Lx, and the vertical dimension is Ly.
- Two adjacent areas each have an overlapping area. That is, each area includes a non-overlapping area and an overlapping area.
- the width of the overlapping region between two horizontally adjacent regions is ⁇ x
- the width of the overlapping region between two vertically adjacent regions is ⁇ y.
- the horizontal dimension of the imaging target is 3Lx-2 ⁇ x
- the vertical dimension is 3Ly-2 ⁇ y.
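As a quick check of the geometry above, with illustrative numbers for Lx, Ly, Δx, and Δy (a 3 × 3 grid of fields, each interior overlap counted once per axis):

```python
Lx, Ly = 2000, 2000     # field dimensions in pixels (illustrative)
dx, dy = 200, 200       # overlap widths Δx, Δy (illustrative)

width  = 3 * Lx - 2 * dx    # horizontal dimension of the imaging target
height = 3 * Ly - 2 * dy    # vertical dimension of the imaging target
print(width, height)        # 5600 5600
```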
- one image is generated by one electron beam irradiation.
- Nine areas a to i are imaged in alphabetical order to obtain nine images A to I.
- the non-overlapping regions 11, 12, 13, 21, 22, 23, 31, 32, and 33 are irradiated with an electron beam once.
- in an overlapping region shared by two adjacent areas, the electron beam is irradiated twice.
- in an overlapping region where four areas meet, the electron beam is irradiated four times.
- the overlapping area 32 extending in the horizontal direction has a length of Mx
- the overlapping area 23 extending in the vertical direction has a length of My.
- the length of the overlapping regions 34 and 54 extending in the horizontal direction is Nx
- the length of the overlapping regions 43 and 45 extending in the vertical direction is Ny.
- the four overlapping regions 66, 70, 106, 110 have a horizontal dimension of ⁇ x and a vertical dimension of ⁇ y.
- Table 801 in FIG. 8B shows the number of times of electron beam irradiation in the overlapping areas of the areas a to i when nine images A to I are generated in alphabetical order.
- This table 801 shows the relationship between captured images, overlapping regions, and the number of electron beam irradiations.
- an image A is generated by first irradiating the region a with an electron beam.
- the number of times of electron beam irradiation in the overlapping regions 23, 32, and 66 is one.
- an image B is generated by irradiating the region b with an electron beam.
- the number of times of electron beam irradiation in the overlapping regions 23 and 66 is two.
- the number of times of electron beam irradiation in the overlapping regions 34, 70, and 25 is one.
- With reference to FIG. 9, the process for calculating the number of times of electron beam irradiation in each overlapping region in the composite image creating apparatus according to the present invention will be described. That is, the table showing the number of electron beam irradiations in each overlapping region shown in FIG. 8B is created.
- the imaging order information table 303 shown in FIG. 3 is given in advance. That is, the imaging order and imaging position are set in advance for all images.
- As in FIG. 8, it is assumed that nine images are obtained by imaging the areas a to i in alphabetical order.
- the positions of the areas a to i are given in advance. For example, the areas a to i are arranged in ascending order of X coordinate and Y coordinate.
- the images A to I are sequentially assigned.
- k = 0
- a memory area for storing the number of times of electron beam irradiation is provided for each overlapping area and cleared. Then, the counter value n corresponding to the imaging order is set to 1.
- from the imaging order information table 303 shown in FIG. 3, it is possible to determine which of the nine images A to I is first in the imaging order. In this example, since the images are captured in alphabetical order, the first image in the imaging order is the image A.
- in step S13, 1 is added to the number of times of electron beam irradiation in the memory areas corresponding to all the overlapping areas included in the area corresponding to the imaging order n in the imaging order information table.
- 1 is added as the number of times of electron beam irradiation to the memory areas corresponding to the overlapping areas 23, 32, and 66 included in the area a.
- in step S14, 1 is stored in the column of the number of times of electron beam irradiation corresponding to the overlapping areas 23, 32, and 66 included in the area a of the table of FIG.
- in step S16, it is determined whether the imaging order n is larger than the final value of the imaging order. In this example, since nine images are generated, the final value of the imaging order is nine. If the imaging order n is larger than the final value, the process ends; otherwise, the process returns to step S12, and the processes of steps S12 to S15 are repeated. When this loop ends, the table shown in FIG. 8B has been generated.
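Steps S11 to S16 can be sketched as follows. The overlap-region names and the 3 × 3 layout below are illustrative stand-ins for the region numbering of FIG. 8 (only one four-way corner region, "x1", is modeled):

```python
# Which overlap regions each field a..i touches (illustrative layout).
# "ab" is the pairwise overlap of fields a and b; "x1" is a four-way
# corner region shared by fields a, b, d, and e.
overlaps_per_field = {
    "a": ["ab", "ad", "x1"],
    "b": ["ab", "bc", "be", "x1"],
    "c": ["bc", "cf"],
    "d": ["ad", "de", "dg", "x1"],
    "e": ["be", "de", "ef", "eh", "x1"],
    "f": ["cf", "ef", "fi"],
    "g": ["dg", "gh"],
    "h": ["eh", "gh", "hi"],
    "i": ["fi", "hi"],
}

counts = {}                                   # S11: counters provided and cleared
for field in "abcdefghi":                     # S12/S15/S16: imaging order n = 1..9
    for region in overlaps_per_field[field]:  # S13/S14: +1 per overlap touched
        counts[region] = counts.get(region, 0) + 1

print(counts["ab"], counts["x1"])  # 2 4: pairwise overlaps end at 2, corners at 4
```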
- the image composition unit 2 of this example includes a corrected image selection unit 21, a deformation correction unit 22, and an image pasting unit 23.
- the corrected image selection unit 21 inputs an imaging order from the imaging order information storage unit 3 and reads two images from the image memory 1. Of the two images, the corrected image selection unit 21 outputs the image with the later imaging order to the deformation correction unit 22 and outputs the image with the earlier imaging order to the image pasting unit 23. In this example, the image with the later imaging order is corrected, and the image with the earlier imaging order is not corrected.
- the corrected image selection unit 21 may be configured by selectors that switch based on the imaging order. That is, two selectors are provided: one selector selects the image with the later imaging order and outputs it to the deformation correction unit 22, and the other selector selects the image with the earlier imaging order and outputs it to the image pasting unit 23.
- the deformation correction unit 22 inputs the imaging order from the imaging order information storage unit 3, inputs the resist deformation information resulting from electron beam irradiation from the deformation information storage unit 4, and inputs the design data from the design data storage unit 6.
- the deformation correction unit 22 uses these pieces of information to correct the wiring pattern in the junction region in the image with the later imaging order from the correction image selection unit 21. As described with reference to FIG. 7, when the adjacent areas on the imaging target are sequentially imaged, the image captured later includes an image portion irradiated with overlapping electron beams. Therefore, the deformation correction unit 22 of this example corrects the image of the joint region for the image with the later imaging order.
- the deformation correction unit 22 outputs the correction image to the image pasting unit 23 and simultaneously feeds it back to the deformation information storage unit 4 as a template image.
- the image pasting unit 23 inputs the image with the earlier imaging order from the corrected image selection unit 21, inputs the corrected image with the later imaging order from the deformation correction unit 22, and further inputs the imaging order from the imaging order information storage unit 3.
- the image pasting unit 23 performs matching processing of the images of the joining regions for the two images, detects the joining position, and synthesizes the images. Details of the processing in the image pasting unit 23 will be described later with reference to FIG.
- the deformation correction unit 22 of this example may correct the image with the later imaging order so that the number of times of electron beam irradiation becomes effectively the same in the joint region of the two images. For this purpose, the difference in the number of times of electron beam irradiation in the joint region between the image with the earlier imaging order and the image with the later imaging order is calculated. The image of the joint region may then be corrected based on the deformation amount corresponding to this difference.
- the deformation correction unit 22 in this example corrects the image with the later imaging order, but may also correct the image 21a with the earlier imaging order.
- in that case, the pattern correction is performed so that the number of times of electron beam irradiation becomes the same in the joint regions of both the earlier and later images.
- alternatively, the patterns may be corrected so that the number of times of electron beam irradiation in the joint region of the two images corresponds to a single irradiation.
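The difference-based correction can be sketched as follows. The deformation-table value and the linear shrink-per-irradiation model below are illustrative assumptions; the patent only states that the deformation amount corresponding to the irradiation-count difference is applied:

```python
# Illustrative deformation table: shrink in nm per extra irradiation,
# keyed by pattern part (values are assumptions, not from the patent).
shrink_per_extra_shot = {"line_width": 1.5}

def correct(measured, part, n_early, n_late):
    """Undo the extra shrink of the later image's joining region so that it
    matches the irradiation count of the earlier image's joining region."""
    extra = n_late - n_early
    return measured + extra * shrink_per_extra_shot[part]

# The later image's overlap was irradiated twice, the earlier one's once:
print(correct(45.0, "line_width", n_early=1, n_late=2))  # 46.5
```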
- the deformation correction unit 22 includes a joint region detection unit 24, a pattern part detection unit 25, a correction unit 26, an electron beam irradiation frequency calculation unit 27, and an image storage unit 28.
- the joint area detection unit 24 inputs an image from the image memory 1 and inputs an imaging order s0 from the imaging order information storage unit 3, and detects a joint area in the image.
- the joint region is the image portion of an area that may have been irradiated with the electron beam a plurality of times.
- the bonding area detection unit 24 outputs the image data s1 of the bonding area to the pattern site detection unit 25, the correction unit 26, and the electron beam irradiation frequency calculation unit 27.
- the pattern part detection unit 25 inputs the image data s1 of the bonding region from the bonding region detection unit 24 and the design data from the design data storage unit 6.
- the pattern part detection unit 25 detects the pattern part s2 and the deformation measurement position s3 from the image data s1 of the joining area, and outputs them to the correction unit 26.
- the pattern part s2 and the deformation amount measurement position s3 have been described with reference to FIG. That is, the pattern part s2 is an end point, a corner, a straight line, a rectangle, a diagonal line, or the like.
- the deformation amount measurement position s3 differs depending on the type of the pattern part s2, and for example, when the pattern part is an end point, it is a width, a distance from the reference position to the upper end, and the like. Details of the pattern part detection unit 25 will be described with reference to FIGS.
- the correction unit 26 inputs the image data s1 of the joint region from the joint region detection unit 24, the pattern part s2 and the deformation amount measurement position s3 from the pattern part detection unit 25, the deformation amount from the deformation information storage unit 4, and the design data from the design data storage unit 6; it corrects the image data of the joint region and stores the result in the image storage unit 28. Details of the correction processing in the correction unit 26 will be described with reference to FIG.
- correction processing may be repeated for each pattern.
- the image data after the correction processing may be read from the image storage unit 28 and only the corrected pattern part overwritten, or the corrected pattern part may be pasted onto the existing image data.
- the electron beam irradiation frequency calculation unit 27 inputs the image data s1 of the joint region from the joint region detection unit 24, inputs the imaging order from the imaging order information storage unit 3, and calculates the number of times of electron beam irradiation of each joint region.
- the electron beam irradiation frequency calculation unit 27 stores the electron beam irradiation frequency of each bonding region in the table of the deformation information storage unit 4.
- the pattern part detection unit 25 of this example includes a smoothing processing unit 251, a binarization processing unit 252, two expansion processing units 253 and 256, a matching processing unit 254, and a template pattern generation unit 255.
- the smoothing processing unit 251 receives the image data s1 of the joining area from the joining area detection unit 24 and smoothes it.
- the binarization processing unit 252 binarizes the joint region image data s1 from the smoothing processing unit 251 and outputs the binarized image data to the expansion processing unit 253.
- the expansion processing unit 253 expands the binarized data from the binarization processing unit 252 by expansion processing.
- the template pattern generation unit 255 reads design data corresponding to the bonding region from the design data storage unit 6 and creates a template image from the pattern in the bonding region.
- the template pattern generation unit 255 outputs the template image to the deformation information storage unit 4 and the expansion processing unit 256.
- the expansion processing unit 256 expands the template image by expansion processing.
- the matching processing unit 254 matches the binarized / expanded data obtained from the image data s1 of the joining area with the expanded data of the template image, and detects a pattern part.
- the matching processing unit 254 outputs the pattern part s2 and the deformation amount measurement position s3 to the correction unit 26.
- the matching processing unit 254 may perform matching using normalized correlation processing. However, since the matching processing unit 254 of this example matches binarized images, it may instead simply count the number of matching black and white pixels and compare the count with a predetermined threshold to determine whether the pattern is identical to the pattern of the template image.
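This pixel-agreement test can be sketched as follows. This is an illustrative reimplementation, not the patent's code; the function name `binary_match` and the use of an agreement *ratio* (rather than a raw pixel count) as the threshold convention are assumptions.

```python
import numpy as np

def binary_match(image: np.ndarray, template: np.ndarray, threshold: float) -> bool:
    """Compare two binary patches by counting pixels that agree (both black
    or both white) and testing the agreement ratio against a threshold,
    instead of computing a normalized correlation."""
    assert image.shape == template.shape
    agree = np.count_nonzero(image == template)
    return agree / image.size >= threshold

patch = np.array([[1, 1, 0], [0, 1, 0], [0, 0, 0]])
print(binary_match(patch, patch, 0.9))      # identical patches match
print(binary_match(patch, 1 - patch, 0.5))  # inverted patch disagrees everywhere
```

For binary data this costs one comparison and one count per patch, which is why it can replace the more expensive normalized correlation here.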
- the smoothing processing unit 251 of this example may smooth the input data using a Gaussian filter.
- the binarization processing unit 252 may binarize the input data by a general binarization process. That is, a pixel value larger than the threshold is set to 1, and a pixel value smaller than the threshold is set to 0.
- the expansion processing units 253 and 256 may expand the input data by a general expansion process: for example, for each black pixel, all eight adjacent pixels around it are made black. By repeating such processing, the pattern is expanded.
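A minimal sketch of this 8-neighbour expansion rule follows (illustrative only; a production system would typically call a library morphology routine such as OpenCV's dilate). Pattern pixels are represented as 1.

```python
import numpy as np

def dilate(binary: np.ndarray, iterations: int = 1) -> np.ndarray:
    """One pass turns all 8 neighbours of every pattern pixel (=1) into
    pattern pixels; repeated passes grow the pattern outward, providing
    the safety margin the expansion units add around pattern edges."""
    out = binary.copy()
    for _ in range(iterations):
        padded = np.pad(out, 1)          # zero border so shifts stay in bounds
        grown = np.zeros_like(out)
        # OR together the nine 3x3-neighbourhood shifts
        for dy in (0, 1, 2):
            for dx in (0, 1, 2):
                grown |= padded[dy:dy + out.shape[0], dx:dx + out.shape[1]]
        out = grown
    return out

img = np.zeros((5, 5), dtype=np.uint8)
img[2, 2] = 1
print(dilate(img))  # the single pixel grows into a 3x3 block
```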
- an example of a template pattern generation method in the template pattern generation unit 255 will be described with reference to FIGS. 11A, 11B, and 11C. If the imaging position of the captured image and the position of the joint region at the top, bottom, left, and right of the captured image are known, the coordinate range on the design data of the joint region of the captured image can be found. The design data including the coordinate range is converted into binary image data, and corners are detected using a corner detection filter.
- FIG. 11A shows an example of a pattern of design data corresponding to the bonding area.
- the pattern of this example includes a band-shaped protrusion having a width L.
- the pattern of this example includes two end points (line ends) P1 and P2, two corner portions P3 and P4, and a straight line portion between adjacent end points.
- FIG. 11B shows an example of a corner detection filter.
- using these filters, the end points P1 and P2 and the corner portions P3 and P4 can be detected.
- when the contour of the pattern portion including the end point P1 of the design data matches the filter F1, it is determined that there is a corner having a shape corresponding to the filter F1.
- the detected contour line of the pattern portion is used as a template image.
- since the template pattern is generated from the design data, it includes right-angle shapes as shown in FIG. 11A. However, the captured images at the end points and corner portions are not actually right-angled. Therefore, when an end point or a corner is detected, the template pattern may be replaced with a pattern interpolated into a contour shape close to the actual pattern, as shown in FIG. 11C.
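The shapes of the corner detection filters F1… are not reproduced in the text. As a stand-in sketch (all names and the window rule are assumptions), corners of an axis-aligned binary design pattern can be found with a 2×2 window: exactly one pattern pixel in the window marks a convex corner, exactly three marks a concave corner.

```python
import numpy as np

def find_corners(binary: np.ndarray):
    """Slide a 2x2 window over an axis-aligned binary pattern; a window sum
    of 1 marks a convex corner, a sum of 3 a concave corner (a simple
    stand-in for dedicated corner-detection filters)."""
    h, w = binary.shape
    corners = []
    for y in range(h - 1):
        for x in range(w - 1):
            s = int(binary[y:y + 2, x:x + 2].sum())
            if s in (1, 3):
                corners.append((y, x))
    return corners

img = np.zeros((6, 6), dtype=np.uint8)
img[1:4, 1:4] = 1          # a square pattern has exactly four convex corners
print(find_corners(img))
```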
- the correction unit 26 includes a smoothing processing unit 261, a binarization processing unit 262, an area extraction unit 263, an area copy deformation unit 264, and an image pasting processing unit 265.
- the smoothing processing unit 261 receives the image data s1 of the joining area from the joining area detection unit 24, and smoothes it.
- the binarization processing unit 262 binarizes the joint region image data s1 from the smoothing processing unit 261 and outputs the binarized image data to the region extraction unit 263.
- the region extraction unit 263 receives the binarized data from the binarization processing unit 262, the design data from the design data storage unit 6, and the pattern part s2 and the deformation amount measurement position s3 from the pattern part detection unit 25.
- the area extraction unit 263 extracts a pattern area and outputs the image data of the pattern area to the area copy deformation unit 264. Details of the region extraction unit 263 will be described with reference to FIG.
- the area copy deformation unit 264 receives the pattern region image data from the region extraction unit 263, the resist deformation information resulting from electron beam irradiation from the deformation information storage unit 4, and the joint region image data s1 from the joint region detection unit 24, and copies and corrects the pattern image. Details of the area copy deformation unit 264 will be described with reference to FIG.
- the area copy deforming unit 264 outputs the corrected pattern image to the image pasting processing unit 265.
- the image pasting processing unit 265 inputs the joint region image data s1 from the joint region detection unit 24 and the correction pattern image from the region copy deformation unit 264, and pastes the correction pattern image to the joint region image data s1.
- the area extraction unit 263 includes a connected component extraction unit 266, a closed figure filling unit 267, and an expansion processing unit 268.
- the connected component extraction unit 266 receives the binarized data of the joint region image data s1 from the binarization processing unit 262, and extracts the connected components of the black pixels.
- the connected component extraction unit 266 may extract the connected components using the generally known 8-connected method.
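The "generally known 8-connected method" can be sketched as a breadth-first flood fill (illustrative only; in practice `scipy.ndimage.label` with an all-ones 3×3 structure performs the same 8-connected labeling):

```python
from collections import deque
import numpy as np

def connected_components_8(binary: np.ndarray):
    """Label the 8-connected components of pattern pixels (=1) with a BFS
    flood fill; returns the label image and the number of components."""
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=int)
    current = 0
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and not labels[sy, sx]:
                current += 1                     # start a new component
                labels[sy, sx] = current
                queue = deque([(sy, sx)])
                while queue:
                    y, x = queue.popleft()
                    for dy in (-1, 0, 1):        # visit all 8 neighbours
                        for dx in (-1, 0, 1):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and binary[ny, nx] and not labels[ny, nx]):
                                labels[ny, nx] = current
                                queue.append((ny, nx))
    return labels, current
```

Note that diagonally touching pixels belong to one component under 8-connectivity, which is why corner-touching pattern fragments are not split.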
- the closed figure filling unit 267 receives the black-pixel connected components from the connected component extraction unit 266, the design data from the design data storage unit 6, and the pattern part s2 and the deformation amount measurement position s3 from the pattern part detection unit 25; it creates a closed figure and fills the inside of the closed figure.
- the expansion processing unit 268 expands the filled closed figure.
- the expansion processing in the expansion processing unit 268 may be the same as the expansion processing in the expansion processing units 253 and 256 of the pattern part detection unit 25 described with reference to FIG.
- the edge portion of the pattern obtained by binarization varies depending on the threshold value. For this reason, the edge of the original pattern may not completely enter the closed figure. Therefore, a margin is provided so that the edge of the pattern surely enters the closed figure by expanding the closed figure.
- the closed graphic filling unit 267 includes a connected component selection unit 2671, a closed graphic generation unit 2672, and a painting unit 2673.
- the connected component selection unit 2671 receives the black-pixel connected components of the binarized joint region image data s1 from the connected component extraction unit 266, and the pattern part s2 and the deformation amount measurement position s3 from the pattern part detection unit 25.
- the connected component selection unit 2671 selects a connected component including the correction target pattern 1601 among the connected components input from the connected component extraction unit 266 as follows, for example.
- the connected component selection unit 2671 first obtains the distance between each pixel of the connected component 1603 and the pixel position where the correction target pattern 1601 exists. Based on this distance, a connected component 1604 including a pixel closest to the pixel position where the correction target pattern 1601 exists is selected from the connected components 1603.
- the closed figure generation unit 2672 generates a closed figure composed of connected components including the correction target pattern among the connected components selected by the connected component selecting unit 2671.
- FIG. 16A shows a case where the connected component 1603 input from the connected component extraction unit 266 is a corner pattern. As in the example shown in the figure, the connected component 1603 can be divided into two regions, a portion in the correction target pattern 1601 and a portion outside the correction target pattern 1601. Design data 1602 of the connected component 1603 is obtained from the design data storage unit 6.
- the closed figure generation unit 2672 uses the design data 1602 corresponding to the connected component 1603 to select a portion in the correction target pattern 1601 from the two areas. A portion in the correction target pattern 1601 selected in this way is set as one closed figure.
- the painting part 2673 paints the closed figure black. In this way, as shown in FIG. 16B, a black closed figure 1604 is obtained.
- the area copy deformation unit 264 includes an image selection unit 2641, a bilinear interpolation unit 2642, and a storage unit 2643.
- the image selection unit 2641 receives the pattern region image data from the region extraction unit 263 and the joint region image data s1 from the joint region detection unit 24, and selects the joint region image data s1 corresponding to the pattern region. And stored in the storage unit 2643.
- the image data of the pattern area input from the area extraction unit 263 is a closed figure painted black as shown in FIG.
- the image selection unit 2641 sets a pixel of a closed figure painted black as 1 and a pixel of a portion not painted black as 0.
- the value obtained by multiplying this mask value by the pixel value at the same coordinate may be stored in the storage unit 2643.
- the bilinear interpolation unit 2642 receives the image data s1 of the joint region corresponding to the pattern region from the storage unit 2643, and inputs the deformation information of the resist from the deformation information storage unit 4.
- the bilinear interpolation unit 2642 corrects, that is, enlarges, the image data s1 of the joint region by bilinear interpolation using the deformation information.
- the pattern 1802 is enlarged by 3 pixels on the left side and enlarged by 2 pixels on the right side.
- the width of the pattern 1804 is 152 pixels from the 50th pixel to the 202nd pixel in the x direction in the m-th line.
- Point 1803 moves to the left by 3 pixels to become point 1805.
- FIG. 18B shows a pattern 1804 enlarged by bilinear interpolation processing.
- the case where the deformation information from the deformation information storage unit 4 is the amount of enlargement in the X direction has been described; the same applies when the deformation information is the amount of enlargement in the Y direction.
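The one-dimensional enlargement described for FIG. 18 can be sketched as a linear resampling of a single scan line (illustrative; the function name and argument conventions are assumptions): the segment holding the pattern is stretched so its left edge moves `grow_l` pixels left and its right edge `grow_r` pixels right, and intensities are resampled by linear interpolation.

```python
import numpy as np

def enlarge_row(row, left, right, grow_l, grow_r):
    """Stretch the pattern spanning columns [left, right] of one scan line
    so the left edge moves grow_l pixels left and the right edge grow_r
    pixels right, resampling intensities by linear interpolation."""
    src = np.asarray(row[left:right + 1], dtype=float)
    new_len = (right + grow_r) - (left - grow_l) + 1
    xs = np.linspace(0, len(src) - 1, new_len)   # sample positions in the source
    lo = np.floor(xs).astype(int)
    hi = np.minimum(lo + 1, len(src) - 1)
    frac = xs - lo
    stretched = (1 - frac) * src[lo] + frac * src[hi]
    out = np.asarray(row, dtype=float).copy()
    out[left - grow_l:right + grow_r + 1] = stretched
    return out
```

Repeating this per line, with per-line growth amounts taken from the deformation information, reproduces the kind of enlargement shown for pattern 1802/1804.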
- FIG. 19 shows an example of the image pasting unit 23 of the image compositing unit 2 according to the present invention.
- the image pasting unit 23 includes a matching processing unit 231 and a combining unit 232.
- the matching processing unit 231 receives the image with the earlier imaging order from the corrected image selection unit 21, and receives the deformed version of the image with the later imaging order from the deformation correction unit 22.
- the image composition unit 2 basically deforms the image of the joint region of the image with the later imaging order, but does not deform the image with the earlier imaging order.
- the matching processing unit 231 of this example detects the position of the image in the junction region in the image with the earlier imaging order using the image with the later imaging order, that is, the corrected and deformed image as a template.
- since positioning is performed using an image whose deformation has been corrected by the deformation correction unit 22 as a template, highly accurate matching is possible.
- the synthesizing unit 232 receives the position information from the matching processing unit 231, the image with the earlier imaging order from the corrected image selection unit 21, the image with the later imaging order from the deformation correction unit 22, and the imaging order from the imaging order information storage unit 3.
- the combining unit 232 joins and combines the two images based on the position information detected by the matching processing unit 231.
- the method of the joining process in the image pasting unit 23 of the image composition unit 2 according to the present invention will now be described.
- Four regions a to d are set for the imaging target 2201. It is assumed that the dimensions of all the regions a to d are the same. As shown in the figure, adjacent regions are provided with overlapping regions indicated by broken lines.
- the imaging order will be described. When imaging in alphabetical order, the region b is imaged after the region a. Accordingly, in the overlapping region of region a and region b, the second electron beam irradiation is performed a short time after the first.
- the four images 2202 are obtained by imaging in the order of region a, region d, region b, and region c.
- the number added to the alphabet represents the imaging order.
- an image with one electron beam irradiation is obtained in all regions.
- an image with two electron beam irradiations is obtained in the square junction region indicated by a broken line, but an image with one electron beam irradiation is obtained in other regions.
- in image B, an image with two electron beam irradiations is obtained in the elongated bonding region, an image with three electron beam irradiations is obtained in the square bonding region, and the number of electron beam irradiations is one in the other regions.
- an image with an earlier imaging order has a smaller number of electron beam irradiations.
- in the overlapped joining regions, the lower joining region is erased and the upper joining region remains.
- therefore, when joining, it suffices to place the image with the later imaging order on the lower side and the image with the earlier imaging order on the upper side of the joining region.
- the panoramic image 2203 is obtained by superimposing and joining the joining areas in the order of the image A, the image B, the image D, and the image C.
- the numbers added to the alphabet represent the order of superposition.
- the panoramic image 2204 represents the number of electron beam irradiations in each joint area of the panoramic image 2203. The elongated bonding areas hold images with two electron beam irradiations, the square bonding area holds an image with four electron beam irradiations, and the other areas hold images with a single electron beam irradiation.
- the panoramic image 2205 is a composite image obtained by superimposing the joining regions in the order of the image C, the image D, the image B, and the image A.
- the numbers added to the alphabet represent the order of superposition.
- the panorama image 2206 represents the number of electron beam irradiations in each joint area of the panorama image 2205. The number of electron beam irradiations is two in the elongated joint region between image B and image D, and one in the other regions.
- the panoramic image 2207 is obtained by superimposing and joining the joining regions in the order of the image C, the image B, the image D, and the image A.
- the numbers added to the alphabet represent the order of superposition.
- the panoramic image 2208 represents the number of times of electron beam irradiation in each joint area of the panoramic image 2207. In all the areas, the number of times of electron beam irradiation is one.
- in FIG. 20, a total of 16 areas 1 to 16 (4 horizontal by 4 vertical) are set on the imaging target, and the images 1 to 16 obtained by imaging them are joined to generate one panoramic image.
- adjacent areas are not continuously imaged.
- otherwise, the overlapping area may become charged, and the image may be distorted or the pattern may become invisible.
- the coordinates of all joining areas are first obtained based on the position coordinates of the area.
- all the images are pasted together based on the imaging order.
- in step S21, the joint position coordinates of each image are initialized.
- in step S22, the first image of the 16 images 1 to 16 corresponding to the regions 1 to 16 is read.
- next, the second image 2 is read, and in step S24 the position of the joint area between the first image 1 and the second image 2 is obtained and its coordinates are stored.
- in step S25, it is determined whether or not the current image is the final image. If it is not, the image whose number is the current number plus one is read in step S26. Steps S25 and S26 are repeated in this way, and the position of the junction area of the final image is obtained.
- in step S27, the joint coordinates of the images 1 to 16 are mapped to the composite image area.
- each iteration obtains the position of the joint area between two images; by repeating this, the coordinate values of each joint area when all the images are arranged in the composite image area can be obtained.
- the joining positions of the images 1 to 16 in the composite image area are expressed by the coordinates in the x direction from the origin to the left end of each image.
- the joining position of image 1 is 0, the joining position of image 2 is 80, and the joining position of image 3 is 150.
- the joining positions of the images 1 to 16 are thus obtained as mapped coordinates. Next, the images are joined using the imaging order.
- in step S31, the confirmation flags for the entire composite image area are cleared to 0, and the imaging order is set to 1.
- in step S32, the image corresponding to the value set as the imaging order is read. Here, the image of imaging order 1 is read.
- in step S33, the joint position in the composite image area corresponding to the read image is read out.
- in step S34, the image is written only to the pixels whose confirmation flag is 0, starting from the joint position. When the image corresponding to imaging order 1 is written, all confirmation flags are still 0, so all the pixels in the image area corresponding to imaging order 1 are written.
- in step S35, 1 is written to the confirmation flag of every pixel position written in step S34. This prevents overwriting. Since the entire image area corresponding to imaging order 1 was written, 1 is written to the confirmation flags of that entire area.
- once the confirmation flag of a pixel is set to 1, that pixel is not overwritten thereafter.
- the images are written in the order of imaging, and the written images are not overwritten, so that the first written image, that is, the image with the earliest imaging order remains.
- in step S36, the imaging order is incremented by 1, and in step S37 it is determined whether the imaging order exceeds the final value. If it does, the process ends; otherwise, steps S32 to S37 are repeated. When the imaging order reaches the final value, the composite image is complete, and every pixel holds the data of its first electron beam irradiation.
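The flag-guarded writing loop of steps S31 to S37 can be sketched as follows. This is illustrative; the representation of each image as a (top, left, array) placement in imaging order is an assumption.

```python
import numpy as np

def paste_in_imaging_order(canvas_shape, images):
    """Write images into the composite area in imaging order; a confirmation
    flag marks written pixels so later (more-irradiated) images never
    overwrite them, leaving first-exposure data at every composite pixel.
    `images` is a list of (top, left, 2-D array) tuples in imaging order."""
    canvas = np.zeros(canvas_shape)
    flag = np.zeros(canvas_shape, dtype=bool)   # step S31: clear all flags
    for top, left, img in images:               # steps S32-S37 loop
        h, w = img.shape
        region = (slice(top, top + h), slice(left, left + w))
        write = ~flag[region]                   # S34: only flag-0 pixels
        canvas[region][write] = img[write]
        flag[region] |= True                    # S35: prevent overwriting
    return canvas

# two overlapping 2x2 images on a 2x3 canvas; the overlap keeps image 1
imgs = [(0, 0, np.full((2, 2), 1.0)),   # imaging order 1
        (0, 1, np.full((2, 2), 2.0))]   # imaging order 2
print(paste_in_imaging_order((2, 3), imgs))
```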
- in the method described above, each pattern forming the panoramic image is extracted from the imaging region considered to have received the least amount of beam irradiation.
- a method for extracting a panoramic image forming pattern based on other criteria will be described.
- FIG. 25 another example of panoramic image composition will be described.
- four imaging areas (a first imaging area 2501, a second imaging area 2502, a third imaging area 2503, and a fourth imaging area 2504) are set, and a panoramic image is formed.
- the first imaging region 2501 is imaged first, and thereafter, imaging (beam scanning by SEM) is performed in the second, third, and fourth order.
- superimposition areas 2511 to 2514 are set for panoramic image synthesis.
- for the pattern 2505 and the pattern 2506, for example, it is desirable from the viewpoint of pattern deformation to leave the pattern edges included in the first imaging region 2501.
- however, this is not necessarily the case for the pattern 2507, for example.
- most of the pattern 2507 is included in the second imaging region 2502, and the overlapping region 2511 where the first imaging region 2501 and the second imaging region 2502 overlap includes only a part of the pattern 2507. In such a case, the part of the pattern 2507 included in the overlapping region 2511 is also extracted from the second imaging region 2502. As a result, a pattern image with few connected portions can be acquired for the entire pattern 2507.
- pattern extraction within one field of view (imaging region) is also considered. For example, assume that the dimension from the left end to the right end of the pattern 2507 is measured. It is preferable that there be no pattern joint between the one end and the other end that serve as the measurement reference. Therefore, it is preferable to design an algorithm that sets a flag on the pattern 2507 and determines the imaging region so that the number of connections of the pattern 2507 is as small as possible. Conversely, when the gap dimension between the pattern 2506 and the pattern 2510 is measured, it is desirable that the gap portion fall within one imaging region. In this case, the two patterns may both be extracted from the first imaging region 2501.
- the imaging region is selected so that a portion where the dimension value of the pattern is larger than a predetermined value or a portion where the distance between the patterns is smaller than the predetermined value fits in one field of view (imaging region).
- in such cases, an algorithm is needed that determines the field of view (imaging region) from which a pattern is to be extracted based on a judgment criterion different from the imaging order.
- the field of view subjected to pattern extraction may thus be determined based on a judgment criterion different from the imaging order. For example, when the occupied areas of the pattern 2507 in the imaging region 2501 and the imaging region 2502 are compared, most of the pattern 2507 is included in the imaging region 2502. In such a case, pattern extraction may be performed from the imaging region 2502. Accordingly, an image with very few connection portions can be formed for the pattern 2507.
- the imaging area can be set on the layout data of the design data. Thereby, the area of the pattern included in the imaging region or the overlapping region can be calculated. This can be used as a determination criterion different from the imaging order described above.
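The occupied-area criterion can be sketched with axis-aligned rectangles taken from the layout data (an illustrative simplification; real layout geometry is richer, and the `(x0, y0, x1, y1)` rectangle representation is an assumption):

```python
def overlap_area(a, b):
    """Intersection area of two axis-aligned rectangles given as
    (x0, y0, x1, y1) coordinates on the design-data layout."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)

def choose_region(pattern, regions):
    """Pick the imaging region containing the largest share of the pattern,
    i.e. the occupied-area criterion that can override plain imaging order."""
    return max(range(len(regions)), key=lambda i: overlap_area(pattern, regions[i]))

pattern = (8, 0, 20, 10)                       # mostly inside the second region
regions = [(0, 0, 10, 10), (10, 0, 20, 10)]
print(choose_region(pattern, regions))
```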
- the position where measurement is desired may be configured to be set in advance based on the design data. Thereby, the above-described determination can be automatically performed.
- the patterns 2508 and 2509 it is desirable to select a pattern extraction field of view (imaging area) according to the imaging order unless there are other circumstances.
- the pattern 2510 a part of the pattern 2510 is in the overlapping region 2515 across the four imaging regions. In this case, if the deformation of the pattern is disliked as much as possible, the pattern extraction is performed from the imaging region 2501 for the portion belonging to the overlapping region 2511, and the pattern extraction is performed from the imaging region 2502 for the other portions and synthesized. That's fine. If it is desired to extract a pattern from only one imaging area, a pattern may be extracted from the imaging area 2502. Since the conditions to be set vary depending on the type of pattern and the measurement purpose of the user of the scanning electron microscope, it is desirable to allow arbitrary settings.
- FIG. 26 is a flowchart for explaining a process of determining an imaging region from which a pattern is to be extracted and forming a panoramic image.
- following the start of the joining process (S2601), pattern matching (S2602) is performed for each overlapping region.
- next, the pattern included in the overlapping region is recognized (S2603).
- it is determined whether the recognized pattern is a pattern for which a predetermined condition is set (S2604).
- the predetermined condition is a reference condition set in advance for the measurement position, occupied area, and the like described above.
- for example, an imaging region in which the area occupied by the pattern is at least a predetermined value is selected, or the pattern is extracted from a single imaging region.
- for a pattern for which a predetermined condition is set, an imaging region is selected based on that condition and the pattern is extracted (S2606). If the recognized pattern is not one for which a predetermined condition is set, an imaging region with a small number of imagings is selected based on the imaging order (S2605).
- a pattern is extracted from the selected imaging region, and a joining pattern is formed (S2607).
- it is then determined whether a joining pattern has been formed for every pattern (S2608). If there is a pattern for which a joining pattern has not yet been formed, the processes of S2604 to S2607 are performed again. When no such pattern remains, the panoramic image is finally completed (S2609). With the configuration described above, the imaging region subjected to pattern extraction can be determined automatically based on various conditions.
Description
27: electron beam irradiation frequency calculation unit; 28: image storage unit; 231: matching processing unit; 232: combining unit; 251: smoothing processing unit; 252: binarization processing unit; 253: expansion processing unit; 254: matching processing unit; 255: template pattern generation unit; 256: expansion processing unit; 261: smoothing processing unit; 262: binarization processing unit; 263: region extraction unit; 264: region copy deformation unit; 265: image pasting processing unit; 266: connected component extraction unit; 267: closed figure filling unit; 268: expansion processing unit; 2671: connected component selection unit; 2672: closed figure generation unit; 2673: painting unit; 2641: image selection unit; 2642: bilinear interpolation unit; 2643: storage unit; s0: position information; s1: image data; s2: pattern part; s3: deformation amount measurement position
Claims (20)
- 1. In a composite image creation method for generating a single image by joining together images taken with a scanning electron microscope, the method comprises: a step of dividing an imaging target including an electronic device pattern into a plurality of regions, capturing an image for each region, and storing the captured images in an image memory; a step of storing the imaging positions of the images; a step of storing the imaging order of the images; and an image compositing step of joining a plurality of images retrieved from the image memory based on the imaging positions and the imaging order; wherein, in the image compositing step, the images are joined so that the joining regions provided at the edges of two adjacent images overlap each other, and so that, of the two adjacent images, the joining region of the image with the later imaging order disappears while the joining region of the image with the earlier imaging order remains.
- 2. The composite image creation method according to claim 1, wherein the imaging order is set so that, for the plurality of regions on the imaging target, adjacent regions are not imaged consecutively.
- 3. The composite image creation method according to claim 1, wherein the order in which images are joined in the image compositing step is the reverse of the imaging order.
- 4. The composite image creation method according to claim 1, further comprising: a step of storing, in a deformation information storage device, information on the deformation of the electronic device pattern caused by electron beam irradiation; and a pattern correction step of correcting the pattern in the joining region of the image based on the information on the deformation of the pattern.
- 5. The composite image creation method according to claim 4, wherein, in the pattern correction step, the pattern of the image with the later imaging order is corrected so that, of the two images, the pattern deformation in the joining region of the image with the later imaging order becomes equal to the pattern deformation amount in the image with the earlier imaging order.
- 6. The composite image creation method according to claim 4, wherein the information on the deformation of the pattern includes the number of electron beam irradiations and the deformation amount of the pattern.
- 7. The composite image creation method according to claim 4, wherein the information on the deformation of the pattern is created based on difference information between images of a test pattern obtained by imaging it a plurality of times.
- 8. The composite image creation method according to claim 4, wherein the information on the deformation of the pattern is created based on difference information of the joining region of a captured image with a later imaging order, using the joining region of a captured image with an earlier imaging order as a reference.
- 9. The composite image creation method according to claim 4, wherein the pattern correction step comprises: a pattern part detection step of detecting the part of the pattern image in the joining region of the image; a step of reading the deformation amount at the pattern part from the deformation information storage device; and a step of correcting the pattern based on the deformation amount at the pattern part.
- 10. The composite image creation method according to claim 4, wherein the pattern part detection step comprises: a template pattern generation step of generating a template pattern corresponding to the pattern from design data; a binarization step of binarizing the joining region; and a matching step of matching the template obtained in the template pattern generation step against the binarized image data obtained in the binarization step.
- 11. The composite image creation method according to claim 10, wherein the pattern part detection step further comprises: an expansion step of expanding the contour of the template; and an expansion step of expanding the contour of the binarized image data; and the matching step matches the expanded image data.
- 12. In a composite image creation apparatus having an imaging device for acquiring a scanning electron microscope image of an imaging target including an electronic device pattern, an image memory for storing the image data acquired by the imaging device, and an image processing unit for generating a single image by joining together the images stored in the image memory, the apparatus comprises: a storage unit for storing the imaging positions and the imaging order of the images captured by the imaging device; a design data storage unit for storing the design data of the pattern; and an image compositing unit for joining a plurality of images stored in the image memory to generate a single image; wherein the image compositing unit joins the plurality of images retrieved from the image memory based on the imaging positions and the imaging order so that the joining regions provided at the edges of two adjacent images overlap each other, and further joins the images so that, of the two adjacent images, the joining region of the image with the later imaging order disappears while the joining region of the image with the earlier imaging order remains.
- 13. The composite image creation apparatus according to claim 12, wherein the imaging order is set so that, for the plurality of regions on the imaging target, adjacent regions are not imaged consecutively.
- 14. The composite image creation apparatus according to claim 12, wherein the order in which images are joined in the image compositing unit is the reverse of the imaging order.
- 15. The composite image creation apparatus according to claim 12, wherein the storage unit stores information on the deformation of the electronic device pattern caused by electron beam irradiation, and the image compositing unit corrects the pattern in the joining region of the image based on the information on the deformation of the pattern.
- 16. The composite image creation apparatus according to claim 12, wherein the image compositing unit corrects the pattern of the image with the later imaging order so that, of the two images, the pattern deformation in the joining region of the image with the later imaging order becomes equal to the pattern deformation amount in the image with the earlier imaging order.
- 17. The composite image creation apparatus according to claim 15, wherein the information on the deformation of the pattern includes the number of electron beam irradiations and the deformation amount of the pattern.
- 18. The composite image creation apparatus according to claim 15, wherein the information on the deformation of the pattern is created based on difference information between images of a test pattern obtained by imaging it a plurality of times.
- 19. The composite image creation apparatus according to claim 15, wherein the information on the deformation of the pattern is created based on difference information of the joining region of a captured image with a later imaging order, using the joining region of a captured image with an earlier imaging order as a reference.
- 20. In a scanning electron microscope apparatus comprising a scanning electron microscope and a composite image creation device that generates a single image by joining together images taken with the scanning electron microscope, the composite image creation device comprises: an image memory for storing scanning electron microscope images of an imaging target including an electronic device pattern; an image processing unit for generating a single image by joining together the images stored in the image memory; a storage unit for storing the imaging positions and the imaging order of the images captured by the scanning electron microscope; a design data storage unit for storing the design data of the pattern; and an image compositing unit for joining a plurality of images stored in the image memory to generate a single image; wherein the image compositing unit joins the plurality of images retrieved from the image memory based on the imaging positions and the imaging order so that the joining regions provided at the edges of two adjacent images overlap each other, and further joins the images so that, of the two adjacent images, the joining region of the image with the later imaging order disappears while the joining region of the image with the earlier imaging order remains.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/262,734 US20120092482A1 (en) | 2009-04-03 | 2010-04-02 | Method and device for creating composite image |
| JP2011507303A JP5198654B2 (ja) | 2009-04-03 | 2010-04-02 | 合成画像作成方法及び装置 |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2009091390 | 2009-04-03 | ||
| JP2009-091390 | 2009-04-03 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2010114117A1 true WO2010114117A1 (ja) | 2010-10-07 |
Family
ID=42828401
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2010/056062 Ceased WO2010114117A1 (ja) | 2009-04-03 | 2010-04-02 | 合成画像作成方法及び装置 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US20120092482A1 (ja) |
| JP (1) | JP5198654B2 (ja) |
| WO (1) | WO2010114117A1 (ja) |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2012101717A1 (ja) * | 2011-01-26 | 2012-08-02 | 株式会社 日立ハイテクノロジーズ | パターンマッチング装置、及びコンピュータープログラム |
| WO2013179825A1 (ja) * | 2012-05-30 | 2013-12-05 | 株式会社日立ハイテクノロジーズ | パターン評価装置およびパターン評価方法 |
| EP2642749A3 (de) * | 2012-03-23 | 2014-04-16 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Vorrichtung und Verfahren zur Optimierung der Bestimmung von Aufnahmebereichen |
| JP2019124517A (ja) * | 2018-01-15 | 2019-07-25 | 東芝Itコントロールシステム株式会社 | 放射線検査装置 |
| CN110969576A (zh) * | 2019-11-13 | 2020-04-07 | 同济大学 | 一种基于路侧ptz相机的高速公路路面图像拼接方法 |
| WO2020188781A1 (ja) * | 2019-03-20 | 2020-09-24 | 株式会社日立ハイテク | 荷電粒子線装置 |
| TWI738510B (zh) * | 2020-09-15 | 2021-09-01 | 倍利科技股份有限公司 | 半導體元件圖像疊合方法 |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5542478B2 (ja) * | 2010-03-02 | 2014-07-09 | 株式会社日立ハイテクノロジーズ | 荷電粒子線顕微鏡 |
| EP2736248A4 (en) * | 2011-07-22 | 2015-06-24 | Fujifilm Corp | PROCESS FOR GENERATING PANORAMIC IMAGES AND IMAGING APPARATUS |
| JP6572117B2 (ja) * | 2015-12-04 | 2019-09-04 | オリンパス株式会社 | 顕微鏡、画像貼り合わせ方法、プログラム |
| GB2545492B (en) * | 2015-12-18 | 2020-07-01 | Imagination Tech Ltd | Capturing an image of a scene from a captured sequence of scene portions |
| EP3236486A1 (en) * | 2016-04-22 | 2017-10-25 | Carl Zeiss Microscopy GmbH | Method for generating a composite image of an object and particle beam device for carrying out the method |
| KR102771303B1 (ko) * | 2019-07-19 | 2025-02-24 | 삼성전자 주식회사 | 폴더블 전자 장치 및 상기 폴더블 전자 장치에서 복수의 카메라들을 이용한 사진 촬영 방법 |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2004128197A (ja) * | 2002-10-02 | 2004-04-22 | Jeol Ltd | Pattern connection accuracy inspection method |
| JP2004163420A (ja) * | 2002-10-22 | 2004-06-10 | Nano Geometry Kenkyusho:Kk | Pattern inspection apparatus and method |
| JP2005277395A (ja) * | 2004-02-23 | 2005-10-06 | Nano Geometry Kenkyusho:Kk | Pattern inspection apparatus and method |
| JP2009157543A (ja) * | 2007-12-26 | 2009-07-16 | Hitachi High-Technologies Corp | Image generation method and image generation apparatus |
| JP2009283917A (ja) * | 2008-04-23 | 2009-12-03 | Hitachi High-Technologies Corp | Defect observation method and defect observation apparatus |
| JP2010067516A (ja) * | 2008-09-11 | 2010-03-25 | Hitachi High-Technologies Corp | Panoramic image synthesis method and apparatus using a scanning charged particle microscope |
Family Cites Families (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0786466B2 (ja) * | 1990-07-18 | 1995-09-20 | 大日本スクリーン製造株式会社 | Pattern inspection apparatus for printed circuit boards |
| US6587581B1 (en) * | 1997-01-10 | 2003-07-01 | Hitachi, Ltd. | Visual inspection method and apparatus therefor |
| US6868175B1 (en) * | 1999-08-26 | 2005-03-15 | Nanogeometry Research | Pattern inspection apparatus, pattern inspection method, and recording medium |
| US6528787B2 (en) * | 1999-11-30 | 2003-03-04 | Jeol Ltd. | Scanning electron microscope |
| US7864369B2 (en) * | 2001-03-19 | 2011-01-04 | Dmetrix, Inc. | Large-area imaging by concatenation with array microscope |
| JP4065847B2 (ja) * | 2001-11-21 | 2008-03-26 | 株式会社日立ハイテクノロジーズ | Sample image forming method and charged particle beam apparatus |
| US7456377B2 (en) * | 2004-08-31 | 2008-11-25 | Carl Zeiss Microimaging Ais, Inc. | System and method for creating magnified images of a microscope slide |
| JP4338627B2 (ja) * | 2004-12-17 | 2009-10-07 | 株式会社日立ハイテクノロジーズ | Charged particle beam apparatus and charged particle beam microscopy method |
| JP4490864B2 (ja) * | 2005-04-28 | 2010-06-30 | 株式会社日立ハイテクノロジーズ | Image forming method |
| JP4769025B2 (ja) * | 2005-06-15 | 2011-09-07 | 株式会社日立ハイテクノロジーズ | Imaging recipe creation apparatus and method for a scanning electron microscope, and semiconductor pattern shape evaluation apparatus |
| US8098956B2 (en) * | 2007-03-23 | 2012-01-17 | Ventana Medical Systems, Inc. | Digital microscope slide scanning system and methods |
- 2010
- 2010-04-02 WO PCT/JP2010/056062 patent/WO2010114117A1/ja not_active Ceased
- 2010-04-02 US US13/262,734 patent/US20120092482A1/en not_active Abandoned
- 2010-04-02 JP JP2011507303A patent/JP5198654B2/ja not_active Expired - Fee Related
Cited By (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10535129B2 (en) | 2011-01-26 | 2020-01-14 | Hitachi High-Technologies Corporation | Pattern matching apparatus and computer program |
| JP5707423B2 (ja) * | 2011-01-26 | 2015-04-30 | 株式会社日立ハイテクノロジーズ | Pattern matching apparatus and computer program |
| KR101522804B1 (ko) * | 2011-01-26 | 2015-05-26 | 가부시키가이샤 히다치 하이테크놀로지즈 | Pattern matching apparatus and recording medium |
| WO2012101717A1 (ja) * | 2011-01-26 | 2012-08-02 | 株式会社 日立ハイテクノロジーズ | Pattern matching apparatus and computer program |
| EP2642749A3 (de) * | 2012-03-23 | 2014-04-16 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Device and method for optimizing the determination of recording regions |
| US9843718B2 (en) | 2012-03-23 | 2017-12-12 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Device and method for optimizing determining recording regions |
| WO2013179825A1 (ja) * | 2012-05-30 | 2013-12-05 | 株式会社日立ハイテクノロジーズ | Pattern evaluation apparatus and pattern evaluation method |
| JP2013247104A (ja) * | 2012-05-30 | 2013-12-09 | Hitachi High-Technologies Corp | Pattern evaluation apparatus and pattern evaluation method |
| JP7018770B2 (ja) | 2018-01-15 | 2022-02-14 | 東芝Itコントロールシステム株式会社 | Radiation inspection apparatus |
| JP2019124517A (ja) * | 2018-01-15 | 2019-07-25 | 東芝Itコントロールシステム株式会社 | Radiation inspection apparatus |
| JPWO2020188781A1 (ja) * | 2019-03-20 | 2021-10-21 | 株式会社日立ハイテク | Charged particle beam device |
| WO2020188781A1 (ja) * | 2019-03-20 | 2020-09-24 | 株式会社日立ハイテク | Charged particle beam device |
| JP7059439B2 (ja) | 2019-03-20 | 2022-04-25 | 株式会社日立ハイテク | Charged particle beam device |
| US11747292B2 (en) | 2019-03-20 | 2023-09-05 | Hitachi High-Tech Corporation | Charged particle beam apparatus |
| US12169182B2 (en) | 2019-03-20 | 2024-12-17 | Hitachi High-Tech Corporation | Charged particle beam apparatus |
| CN110969576A (zh) * | 2019-11-13 | 2020-04-07 | 同济大学 | Highway pavement image stitching method based on roadside PTZ cameras |
| TWI738510B (zh) * | 2020-09-15 | 2021-09-01 | 倍利科技股份有限公司 | Semiconductor element image superimposition method |
Also Published As
| Publication number | Publication date |
|---|---|
| JP5198654B2 (ja) | 2013-05-15 |
| JPWO2010114117A1 (ja) | 2012-10-11 |
| US20120092482A1 (en) | 2012-04-19 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5198654B2 (ja) | | Composite image creation method and apparatus |
| US7679055B2 | | Pattern displacement measuring method and pattern measuring device |
| JP4951496B2 (ja) | | Image generation method and image generation apparatus |
| JP5030906B2 (ja) | | Panoramic image synthesis method and apparatus using a scanning charged particle microscope |
| JP4659004B2 (ja) | | Circuit pattern inspection method and circuit pattern inspection system |
| KR101522804B1 (ko) | | Pattern matching apparatus and recording medium |
| JP4982544B2 (ja) | | Composite image forming method and image forming apparatus |
| US20190228522A1 | | Image Evaluation Method and Image Evaluation Device |
| JP5604067B2 (ja) | | Method for creating a matching template, and template creation apparatus |
| US8994815B2 | | Method of extracting contour lines of image data obtained by means of charged particle beam device, and contour line extraction device |
| US20090202139A1 | | Pattern generating apparatus and pattern shape evaluating apparatus |
| WO2013121939A1 (ja) | | Overlay measurement method, measurement apparatus, scanning electron microscope, and GUI |
| WO2011114644A1 (ja) | | Image generation method and apparatus using a scanning charged particle microscope, and sample observation method and observation apparatus |
| JP7001494B2 (ja) | | Wafer observation apparatus |
| JPH1173501A (ja) | | Reference image creation method and pattern inspection apparatus |
| JP5202110B2 (ja) | | Pattern shape evaluation method, pattern shape evaluation apparatus, pattern shape evaluation data generation apparatus, and semiconductor shape evaluation system using the same |
| JP2003107669A (ja) | | Pattern defect inspection apparatus |
| JP5198546B2 (ja) | | Circuit pattern inspection method and circuit pattern inspection system |
| US9224226B2 | | Image display device for direct drawing apparatus, and recording medium |
| JPH11257940A (ja) | | Pattern evaluation method and pattern evaluation apparatus |
| JP2010145145A (ja) | | Circuit pattern inspection apparatus, inspection method, and test pattern |
| JP4750084B2 (ja) | | Calibration method and calibration apparatus |
| JP2011171510A (ja) | | Charged particle beam writing apparatus |
| WO2018138874A1 (ja) | | Charged particle beam device |
| TWI867659B (zh) | | Method, apparatus, and computer program for correcting image errors when scanning a sample with a charged particle beam |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10758884; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2011507303; Country of ref document: JP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | WWE | Wipo information: entry into national phase | Ref document number: 13262734; Country of ref document: US |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 10758884; Country of ref document: EP; Kind code of ref document: A1 |