WO2006082639A1 - Mark image processing method, program, and apparatus - Google Patents
Mark image processing method, program, and apparatus
- Publication number
- WO2006082639A1 (PCT/JP2005/001595)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- mark
- imaging
- image
- image processing
- predetermined range
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03F—PHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
- G03F9/00—Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically
- G03F9/70—Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically for microlithography
- G03F9/7069—Alignment mark illumination, e.g. darkfield, dual focus
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03F—PHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
- G03F9/00—Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically
- G03F9/70—Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically for microlithography
- G03F9/7088—Alignment mark detection, e.g. TTR, TTL, off-axis detection, array detector, video detection
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03F—PHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
- G03F9/00—Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically
- G03F9/70—Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically for microlithography
- G03F9/7092—Signal processing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
- G06V10/7515—Shifting the patterns to accommodate for positional errors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30148—Semiconductor; IC; Wafer
Definitions
- The present invention relates to a mark image processing method, program, and apparatus for imaging a fine alignment mark formed on a substrate or chip and detecting the mark position by image processing, and more particularly to recognizing the alignment mark by matching the captured image against a pre-registered template image and detecting the mark position.
- Background art
- Conventionally, a workpiece such as a substrate or chip is loaded onto an alignment stage and positioned as follows: an alignment mark provided on the workpiece is imaged with an imaging device such as a CCD camera, and the mark position is detected by recognizing the alignment mark through matching against a pre-registered alignment mark template image.
- Such alignment marks are fine marks of, for example, several tens of μm to about one hundred μm, and are produced by fine processing such as etching of the substrate.
- Conventionally, optimal illumination conditions and exposure time adjusted in advance are used as fixed settings; the alignment mark is imaged under them and recognized by image processing to detect its position.
- Patent Document 1 Japanese Patent Laid-Open No. 05-159997
- Patent Document 2 Japanese Patent Laid-Open No. 06-005502
- Patent Document 3 Japanese Patent Laid-Open No. 2001-338867
- Because the fixed imaging conditions may not match the actual state of the alignment mark, the alignment mark may be difficult to detect from the image, or the mark position may not be detected accurately even when the mark itself is found.
- It is therefore an object of the present invention to provide a mark image processing method, program, and apparatus that recognize a mark position from an image under optimum conditions adapted to the situation at the time, without being affected by the alignment mark formation state, illumination fluctuation, and the like.
- Means for solving the problem
- the present invention provides a mark image processing method.
- The mark image processing method of the present invention includes an imaging control step of imaging a mark on a workpiece a plurality of times while changing the imaging conditions of the imaging device, and
- an image recognition step of calculating the correlation between each of the plurality of images and a pre-registered template image of the mark to detect the optimum mark position.
- the mark is imaged a plurality of times while changing the illumination intensity within a predetermined range.
- the mark is imaged a plurality of times while changing the exposure time within a predetermined range.
- the mark may be imaged a plurality of times while changing the illumination intensity and exposure time of the illumination device within a predetermined range.
- In the image recognition step, each time the mark is imaged while changing the imaging condition within the predetermined range, the correlation is calculated at each slide position while sliding the template image over the image, the mark position is detected from the slide position that minimizes the correlation value, and that position is saved together with its correlation value. When imaging over the whole range of imaging conditions is complete, the mark position associated with the smallest of the stored correlation values is output as the optimum solution.
- the mark is an alignment mark formed by microfabrication on a substrate or chip.
- the present invention provides a program for mark image processing.
- The program of the present invention causes a computer to execute the above imaging control and image recognition steps.
- the present invention provides a mark image processing apparatus.
- The mark image processing apparatus of the present invention includes an imaging control unit that images a mark on a workpiece a plurality of times while changing the imaging conditions of the imaging device, and
- an image recognition unit that detects the optimum mark position by calculating the correlation between each of the plurality of images and a pre-registered template image of the mark.
- As imaging conditions, the illumination intensity and/or the exposure time are changed within a preset range; for each imaging condition, the correlation between the captured image and the pre-registered template is calculated, the position at which the correlation value is smallest within that image is taken as the mark position, and the mark position with the smallest correlation value among all the images is taken as the optimum solution.
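- The method, program, and apparatus described above all share the same two-phase structure: image the mark several times under different imaging conditions, then template-match every image and keep the best result. The Python sketch below illustrates only that structure; `capture` and `match` are placeholder callables, not anything defined in this publication, and it assumes the correlation value is a dissimilarity score, so that a smaller value means a better match.

```python
# Illustrative sketch of the two steps (placeholder callables, not the patent's API).
def imaging_control_step(capture, conditions):
    """Image the mark once per imaging condition (illumination intensity
    and/or exposure time) and return the captured images."""
    return [capture(condition) for condition in conditions]

def image_recognition_step(images, template, match):
    """Template-match every image; `match` returns (mark_position, correlation_value).
    The mark position with the smallest correlation value over all images is the
    optimum solution."""
    candidates = [match(image, template) for image in images]
    return min(candidates, key=lambda candidate: candidate[1])
```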
- FIG. 1 is an explanatory diagram of an ultrasonic bonding apparatus in which the mark image processing apparatus of the present invention is used.
- FIG. 2 is an explanatory diagram of the functional configuration of the mark image processing apparatus of the present invention.
- FIG. 3 is an explanatory diagram of the image pickup apparatus of FIG. 2 provided with a lighting device.
- FIG. 4 is an explanatory diagram of a workpiece formed with alignment marks to be processed in the present invention.
- FIG. 5 is an explanatory diagram of correlation calculation performed by sliding a template image on a mark image.
- FIG. 6 is a flowchart of mark image recognition processing according to the first embodiment of the present invention in which an image is taken with varying illumination intensity.
- FIG. 7 is a flowchart of mark image recognition processing according to the second embodiment of the present invention for imaging with varying exposure time.
- FIG. 8 is a flowchart of a mark image recognition process according to a third embodiment of the present invention for imaging with varying illumination intensity and exposure time.
- FIG. 1 is an explanatory view of an ultrasonic bonding apparatus in which the mark image processing apparatus of the present invention is used.
- The ultrasonic bonding apparatus 10 has an alignment mechanism 12; the alignment mechanism 12 is provided with a pressurizing mechanism 16 having an ultrasonic head 14 at its tip and with an imaging device 18, which is connected to the mark image processing device 32 of the present invention.
- the alignment mechanism 12 has a workpiece 42 mounted on the alignment stage 40.
- The alignment mechanism 12 is equipped with a mechanism that moves the alignment stage 40 in the horizontal X and Y directions and in the vertical Z direction, and that tilts the stage surface at an angle θ with respect to the horizontal plane.
- the workpiece 42 mounted on the alignment stage 40 is formed with alignment marks for positioning the workpiece 42 at a predetermined processing position.
- The alignment marks are imaged by the imaging device 18, the alignment mark position is detected by the mark image processing device 32, and the alignment mechanism 12 drives the alignment stage 40 so that the workpiece 42 is positioned at a predetermined processing position with respect to the ultrasonic head 14.
- An alignment mechanism control unit 24 is provided for the alignment mechanism 12 so that the alignment stage 40 can be driven in the X, Y, and Z directions and tilted at an angle θ with respect to the horizontal plane.
- An imaging device moving mechanism 20 is provided for the imaging device 18; under control of the imaging device moving mechanism control unit 30, it can move the imaging device 18 in the mutually orthogonal X and Y directions in the horizontal plane.
- An ultrasonic oscillation unit 28 is provided for the ultrasonic head 14. The ultrasonic head 14 is driven by the output signal from the ultrasonic oscillator provided in the ultrasonic oscillation unit 28, and the joints of the workpiece 42 are joined by ultrasonic vibration while the ultrasonic head 14 is pressed against the workpiece.
- The pressurizing mechanism 16 provided for the ultrasonic head 14 drives the ultrasonic head 14 in the vertical (Z) direction and presses it against the workpiece 42 while ultrasonic vibration is applied, thereby performing the bonding.
- The pressurizing mechanism 16 is controlled by the pressurization control unit 26.
- The main controller 22 controls the alignment mechanism control unit 24, the pressurization control unit 26, the ultrasonic oscillation unit 28, the imaging device moving mechanism control unit 30, and the mark image processing device 32 according to a predetermined procedure, and controls the series of operations in the ultrasonic bonding apparatus 10 from loading of the workpiece 42 through ultrasonic bonding to unloading.
- FIG. 2 is an explanatory diagram showing the functional configuration of the mark image processing apparatus of the present invention provided in the ultrasonic bonding apparatus 10 of FIG. 1.
- the imaging device 18 includes a CCD camera 34, a lens 36, and an illumination unit 38, and images the alignment mark 44 of the work 42 mounted on the alignment stage 40.
- The mark image processing device 32 is provided with an imaging control unit 46 and an image recognition unit 48, each controlled by the controller 50 according to a predetermined processing procedure.
- the imaging control unit 46 is provided with an illumination intensity control unit 52 and an exposure time control unit 54.
- The illumination intensity control unit 52 images the alignment mark 44 a plurality of times while changing, within a predetermined range, the illumination intensity of the illumination unit 38 provided in the imaging device 18.
- During this operation, the exposure time of the CCD camera 34 set by the exposure time control unit 54 is fixed to a preset optimum exposure time.
- Alternatively, the alignment mark 44 is imaged a plurality of times while the exposure time control unit 54 changes the exposure time within a predetermined range; in that case, the illumination intensity control unit 52 keeps the illumination intensity fixed at the optimal value adjusted in advance.
- As a further alternative, the illumination intensity control unit 52 and the exposure time control unit 54 are controlled simultaneously, and the alignment mark 44 is imaged a plurality of times while both the illumination intensity and the exposure time are changed within their respective predetermined ranges.
- The image recognition unit 48 detects the optimum mark position by calculating the correlation between the images of the alignment mark 44 captured a plurality of times while the imaging control unit 46 changes the imaging conditions of the imaging device 18 and the pre-registered template image of the alignment mark. To this end, the image recognition unit 48 includes an image input unit 56, an image memory 58, a template file 60, a correlation calculation unit 62, a result storage memory 64, and an optimum solution extraction unit 66.
- The image input unit 56 takes in each image captured by the imaging device 18 as the imaging control unit 46 changes the illumination intensity or exposure time, and records it in the image memory 58.
- In the template file 60, a template image including an image of the alignment mark 44 is registered in advance.
- The correlation calculation unit 62 calculates the correlation at each slide position while sliding the template image from the template file 60 over the image stored in the image memory 58, detects the mark position from the slide position that minimizes the correlation value, and stores it in the result storage memory 64 together with the correlation value at that time.
- One alignment mark 44 is imaged while changing the illumination intensity, for example ten times, so that ten images of the same alignment mark 44 are stored in the image memory 58.
- The correlation calculation unit 62 performs the correlation calculation on each of the ten images, detects the mark position from the slide position of the template image that minimizes the correlation value, and stores the result in the result storage memory 64 together with the correlation value at that time. Thus, for the ten images taken while changing the illumination intensity, the result storage memory 64 holds the ten correlation values obtained from the correlation calculations together with their mark positions.
- The optimal solution extraction unit 66 extracts, as the optimum solution, the mark position whose correlation value is the smallest among the correlation values stored in the result storage memory 64, for example among the ten values obtained from ten images captured while changing the illumination intensity within the predetermined range, and outputs it to the outside.
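- As a rough picture of the data flow just described, the result storage memory 64 can be modelled as a list holding one record per captured image, and the optimum solution extraction unit 66 as a minimum search over that list. The names below are illustrative only and are not taken from this publication.

```python
# Minimal model of the result storage memory 64 and optimum solution extraction unit 66.
from dataclasses import dataclass

@dataclass
class MatchResult:
    condition: float        # illumination volume or exposure time used for this image
    position: tuple         # mark detection position (x, y) found in this image
    correlation: float      # correlation value at that position (smaller = better)

def extract_optimum(result_storage: list) -> MatchResult:
    """Return the stored record with the smallest correlation value."""
    return min(result_storage, key=lambda record: record.correlation)
```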
- The mark detection position output as the optimum solution is supplied, for example, to the alignment mechanism 12 in FIG. 1, which adjusts the alignment stage 40 so that the workpiece 42 assumes the specified positional relationship with respect to the ultrasonic head 14. When the alignment adjustment is complete, the pressurization control unit 26 lowers the ultrasonic head 14 onto the workpiece 42 and presses it, and a predetermined joint on the workpiece 42 is ultrasonically bonded by supplying ultrasonic vibration to the ultrasonic head 14.
- FIG. 3 is an explanatory diagram of the imaging device 18 of FIG. 2 provided with a lighting unit.
- the imaging device 18 has a lighting unit 38 attached to the tip of a lens 36 provided in the CCD camera 34.
- a beam splitter 70 is disposed on the optical axis of the lens 36, beam splitters 72 and 74 are disposed thereon, and LED illumination units 76 and 78 are provided for the beam splitters 72 and 74, respectively.
- An exposure time control unit 54 is provided for the CCD camera 34, and an illumination intensity control unit 52 is provided for the LED illumination units 76 and 78.
- Under control of the illumination intensity control unit 52, the alignment mark 44 of the workpiece 42 mounted on the alignment stage 40 is imaged with the CCD camera 34 when the LED illumination unit 78 is turned on; when the LED illumination unit 76 is turned on, the ultrasonic head 14 is imaged.
- When the LED illumination unit 78 is turned on, the illumination light from the LED illumination unit 78 is reflected downward by the beam splitter 74 and illuminates the workpiece 42 on which the alignment mark 44 is formed.
- The reflected light from the illuminated workpiece 42 passes through the beam splitter 74, is reflected laterally by the beam splitter 70, and enters the CCD camera 34 through the lens 36, so that an image of the workpiece 42 can be formed.
- When the LED illumination unit 76 is turned on, the illumination light is reflected upward by the beam splitter 72 and illuminates the face of the ultrasonic head 14. The reflected light from the illuminated face of the ultrasonic head 14 passes through the beam splitter 72 into the illumination unit 38, is reflected leftward, reflects off the left end face, returns to the right, and enters the CCD camera 34 via the lens 36, so that an image of the face of the ultrasonic head 14 is formed.
- By switching the lighting of the LED illumination units 76 and 78, the CCD camera 34 captures both the image of the alignment mark 44 on the workpiece 42 and the image of the face of the ultrasonic head 14, and the position of the alignment stage 40 is adjusted so that the workpiece position detected from the image of the alignment mark 44 matches the specified position in the image of the ultrasonic head 14.
- FIG. 4 is an explanatory diagram of alignment marks formed on the work 42 in FIG.
- a workpiece 42 in FIG. 4 is a substrate or chip on which a semiconductor integrated circuit is formed.
- Alignment marks 44-1 and 44-2 are formed by fine processing such as etching at two places, the upper right corner and the lower left corner.
- Alignment marks 44-1 and 44-2 are cross marks in this example, and their sizes are as small as about 60 μm to 99 μm.
- The center points P1 and P2 of the cross-shaped alignment marks 44-1 and 44-2 indicate the coordinate points of the mark detection positions.
- FIG. 5 is an explanatory diagram of the correlation calculation performed by sliding the template on the mark image.
- FIG. 5 (A) is an image 80 obtained by imaging the work 42 in FIG. 4, and has, for example, an image size of horizontal M dots and vertical N dots.
- Mark images 82-1 and 82-2 of the alignment marks are present at two locations in the image 80 and have center points P1 and P2 as mark detection positions, respectively.
- FIG. 5(B) shows a template image 86, which has a smaller size than the image 80 of FIG. 5(A), horizontal m dots by vertical n dots, and contains a reference mark image 88 at its center; the center is the reference center point P0 that gives the reference detection position.
- An image is cut out from the image 80 of FIG. 5(A) using a cutout region 84 of the same size as the template image 86 of FIG. 5(B), starting, for example, with the upper left corner of the image 80 as the initial position, and the correlation between the cut-out image in the cutout region 84 and the template image 86 is calculated.
- The correlation calculation between the image in the cutout region and the template image 86 is then repeated while shifting the cutout region 84 one dot at a time in the horizontal direction. When the cutout region 84 reaches the right edge, it is returned to the left, shifted one dot in the vertical direction, and again slid from left to right, calculating the correlation with the template image 86 at each slide position.
- The correlation calculation between the cut-out image of the cutout region 84 and the template image 86 is performed by a correlation equation in which
- C is the correlation value,
- (u, v) is the coordinate position for which the correlation value C is calculated,
- I(X, Y) is the gray-level value at position (X, Y) of the cut-out image, and
- I(x, y) is the gray-level value at position (x, y) of the template image 86.
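- The equation itself does not survive in this text. A common choice that is consistent with the description, in which a smaller correlation value C means a better match, is a sum of absolute differences over the cutout region; the sketch below assumes that form and grayscale images held as NumPy arrays, and is an illustration rather than the exact formula used in this publication.

```python
# Sliding-template correlation, assuming a sum-of-absolute-differences measure.
import numpy as np

def match_template(image, template):
    """Slide `template` over `image` one dot at a time and return the slide
    position (u, v) with the smallest correlation value C, together with C."""
    N, M = image.shape              # image 80: vertical N dots, horizontal M dots
    n, m = template.shape           # template image 86: vertical n dots, horizontal m dots
    best_c, best_uv = float("inf"), (0, 0)
    for v in range(N - n + 1):              # shift the cutout region 84 vertically
        for u in range(M - m + 1):          # shift the cutout region 84 horizontally
            cut = image[v:v + n, u:u + m]   # cut-out image at slide position (u, v)
            c = np.abs(cut.astype(float) - template).sum()   # correlation value C(u, v)
            if c < best_c:
                best_c, best_uv = c, (u, v)
    return best_uv, best_c
```
- With this convention the best-matching slide position is the one that minimizes C, which is why the smallest stored correlation value is later selected as the optimum solution.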
- The cutout region 84 is slid from the upper left corner to the final position at the lower right corner while scanning the image 80 in the horizontal and vertical directions, and the correlation value with the template image 86 is calculated at each slide position.
- In this scan, minimum correlation values are obtained in the vicinity of the mark image 82-1 and in the vicinity of the mark image 82-2, and these two minimum correlation values are obtained at the coordinates of P1 and P2.
- These results are stored, together with the mark detection positions they give, in the result storage memory 64 provided in the image recognition unit 48 of FIG. 2.
- Finally, from the minimum correlation values obtained from, for example, ten images captured while changing the illumination intensity ten times within the predetermined range, the mark detection position having the smallest correlation value is output as the optimum solution.
- FIG. 6 is a flowchart of mark image recognition processing according to the first embodiment of the present invention for imaging with varying illumination intensity.
- First, the illumination volume, that is, the illumination intensity, is preset to change in, for example, ten steps within a range of ±5% around the empirically and statistically determined optimum illumination intensity value, and the volume variable i is initialized. In this case, the empirically and statistically determined optimum exposure time is used as a fixed exposure time.
- In step S3, the illumination is turned on; this turns on the LED illumination unit 78 in FIG. 3, and the reflected light from the illuminated workpiece passes through the beam splitter 74, is reflected by the beam splitter 70, and enters the CCD camera 34, so that the image of the alignment mark 44 is formed.
- In step S4, the CCD camera 34 performs imaging by exposure and readout and inputs the alignment mark image, and in step S5 the illumination is turned off.
- In step S6, the most closely matching position, which minimizes the correlation value, is detected by the correlation calculation between the template image and the captured image, and in step S7 the illumination volume value at that matching position, the coordinates (x, y) indicating the detection position, and the correlation value Ci as the matching score are stored in the result storage memory 64.
- When completion of the illumination volume setting range is determined in step S9, the process proceeds to step S10, where the position whose correlation value (matching score) is smallest among the data in the result storage memory 64 is extracted and output as the mark detection position of the optimum solution.
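- A compact sketch of this FIG. 6 flow is given below. `illumination.set_intensity`, `illumination.on`, `illumination.off`, and `camera.capture` stand in for hardware calls and are not part of this publication; `match_template` is the sliding-correlation sketch shown earlier, and the ten-step, ±5% schedule follows the description above.

```python
# Sketch of the first embodiment: vary the illumination volume, exposure time fixed.
def recognize_mark_vary_illumination(camera, illumination, template,
                                     optimal_volume, steps=10):
    results = []                                                 # result storage memory 64
    volumes = [optimal_volume * (0.95 + 0.10 * i / (steps - 1))  # ±5% around the optimum
               for i in range(steps)]
    for volume in volumes:
        illumination.set_intensity(volume)        # set the illumination volume for this step
        illumination.on()                         # step S3: illumination on
        image = camera.capture()                  # step S4: exposure and readout
        illumination.off()                        # step S5: illumination off
        (x, y), ci = match_template(image, template)   # step S6: best-matching position
        results.append((volume, x, y, ci))        # step S7: store volume, position, score
    return min(results, key=lambda r: r[3])       # step S10: smallest score is the optimum
```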
- FIG. 7 is a flowchart of a mark image recognition process in the second embodiment of the present invention in which an image is picked up by changing the exposure time.
- In this case, the exposure time is preset to change in, for example, ten steps within a range of ±5% around the empirically and statistically determined optimum exposure time value that would otherwise be used as a fixed setting; the illumination is then turned on in step S3. The empirically and statistically determined optimum illumination intensity is used as a fixed illumination intensity.
- In step S6, the position with the smallest correlation value, i.e. the best match, is detected by the correlation calculation between the template image and the captured image, and in step S7 the exposure time T, the detection position (x, y), and the correlation value Ci as the matching score are stored in the result storage memory 64.
- FIG. 8 is a flowchart of a mark image recognition process according to the third embodiment of the present invention in which imaging is performed by changing the illumination intensity and the exposure time.
- In step S5, the illumination is turned on with the intensity of the illumination volume set at that time, in step S6 the image is captured with the exposure time T milliseconds set at that time, and in step S7 the illumination is turned off.
- In step S8, the best matching position obtained by the correlation calculation between the template image and the image is detected, and in step S9 the illumination volume value, the exposure time, the detection position (x, y), and the minimum correlation value Ci as the matching score are stored in the result storage memory 64.
- When the illumination volume setting range is completed in step S13, the process proceeds to step S14, where the position whose correlation value (matching score) is smallest among the data stored in the result storage memory 64 is extracted and output as the mark detection position of the optimum solution.
- If the number of steps in the setting ranges of the illumination volume and the exposure time is ten each, the minimum correlation value and detection position obtained by the correlation calculation are determined for each of the images obtained by a total of 100 imaging operations, and the mark detection position having the smallest correlation value is extracted as the optimum solution.
- Since the total number of imaging operations is the product of the numbers of adjustment steps, the number of adjustment steps may be limited, for example to five, relative to the first and second embodiments, to reduce the overall number of imaging operations and shorten the processing time.
- In the above flow, for each illumination volume setting, the process of changing the exposure time within its predetermined range and imaging is repeated; conversely, for each exposure time setting, the process of changing the illumination volume within its predetermined range may be repeated.
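- The third embodiment can be pictured as the same loop nested over both parameters. The sketch below reuses the same hypothetical hardware wrappers and `match_template` as before, with `camera.set_exposure` as an additional assumed call; the step counts follow the description, 10 × 10 giving 100 images, or 5 × 5 = 25 if the number of steps is limited.

```python
# Sketch of the third embodiment: nested loops over illumination volume and exposure time.
def recognize_mark_vary_both(camera, illumination, template,
                             optimal_volume, optimal_exposure_ms, steps=10):
    results = []
    factors = [0.95 + 0.10 * i / (steps - 1) for i in range(steps)]   # ±5% schedule
    for fv in factors:                            # outer loop: illumination volume
        illumination.set_intensity(optimal_volume * fv)
        for fe in factors:                        # inner loop: exposure time (order may be swapped)
            camera.set_exposure(optimal_exposure_ms * fe)
            illumination.on()
            image = camera.capture()
            illumination.off()
            (x, y), ci = match_template(image, template)
            results.append((fv, fe, x, y, ci))
    return min(results, key=lambda r: r[4])       # smallest correlation value is the optimum
```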
- The present invention also provides a mark image processing program for alignment marks; this program is executed in the hardware environment of the computer constituting the mark image processing device 32 of FIG. 2.
- That is, the mark image processing device 32 of FIG. 2 is realized in the hardware environment of a computer in which a ROM, a RAM, and a hard disk drive are connected to the CPU bus; the mark image processing program according to the present invention is installed on the hard disk drive, and when the computer is started, the program is read from the hard disk drive, loaded into the RAM, and executed by the CPU.
- The mark image processing program of the present invention executed in this computer hardware environment follows the processing procedure shown in the flowchart of FIG. 6, FIG. 7, or FIG. 8.
- The present invention has been described taking as an example the case where the mark image processing device 32 is applied to an ultrasonic bonding apparatus, but the present invention is not limited to this and can be applied to any appropriate apparatus that images a fine mark on a circuit board or chip with an imaging device and detects its position.
- the present invention includes appropriate modifications that do not impair the object and advantages thereof, and is not limited by the numerical values shown in the above embodiments.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- Databases & Information Systems (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Computing Systems (AREA)
- Artificial Intelligence (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
- Exposure And Positioning Against Photoresist Photosensitive Materials (AREA)
- Image Analysis (AREA)
- Length Measuring Devices By Optical Means (AREA)
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2007501475A JP4618691B2 (ja) | 2005-02-03 | 2005-02-03 | マーク画像処理方法、プログラム及び装置 |
| PCT/JP2005/001595 WO2006082639A1 (ja) | 2005-02-03 | 2005-02-03 | マーク画像処理方法、プログラム及び装置 |
| US11/771,587 US20070253616A1 (en) | 2005-02-03 | 2007-06-29 | Mark image processing method, program, and device |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2005/001595 WO2006082639A1 (ja) | 2005-02-03 | 2005-02-03 | マーク画像処理方法、プログラム及び装置 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US11/771,587 Continuation US20070253616A1 (en) | 2005-02-03 | 2007-06-29 | Mark image processing method, program, and device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2006082639A1 true WO2006082639A1 (ja) | 2006-08-10 |
Family
ID=36777034
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2005/001595 Ceased WO2006082639A1 (ja) | 2005-02-03 | 2005-02-03 | マーク画像処理方法、プログラム及び装置 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US20070253616A1 (ja) |
| JP (1) | JP4618691B2 (ja) |
| WO (1) | WO2006082639A1 (ja) |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008276466A (ja) * | 2007-04-27 | 2008-11-13 | Optex Fa Co Ltd | 画像処理装置 |
| JP2010191593A (ja) * | 2009-02-17 | 2010-09-02 | Honda Motor Co Ltd | 対象物の位置検出装置及び位置検出方法 |
| JP2010191590A (ja) * | 2009-02-17 | 2010-09-02 | Honda Motor Co Ltd | 対象物の位置検出装置及び位置検出方法 |
| JP2013042148A (ja) * | 2012-09-18 | 2013-02-28 | Bondtech Inc | 転写方法および転写装置 |
| JP2013048249A (ja) * | 2012-09-18 | 2013-03-07 | Bondtech Inc | 転写方法および転写装置 |
| JP2023092401A (ja) * | 2021-12-21 | 2023-07-03 | ファスフォードテクノロジ株式会社 | 実装装置、照明システムの調整方法および半導体装置の製造方法 |
| JP2024505159A (ja) * | 2021-01-15 | 2024-02-05 | クリック アンド ソッファ インダストリーズ、インク. | ワイヤーボンディングおよびその他の電子部品パッケージング装置のためのインテリジェントパターン認識システムおよびそれに関連する方法 |
Families Citing this family (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8492178B2 (en) * | 2007-02-23 | 2013-07-23 | Rudolph Technologies, Inc. | Method of monitoring fabrication processing including edge bead removal processing |
| USD629533S1 (en) | 2008-07-23 | 2010-12-21 | Diversitech Corporation | Equipment support |
| US9288449B2 (en) * | 2008-08-05 | 2016-03-15 | University Of Florida Research Foundation, Inc. | Systems and methods for maintaining multiple objects within a camera field-of-view |
| JP5506634B2 (ja) * | 2010-11-05 | 2014-05-28 | 株式会社アドテックエンジニアリング | 位置合わせ用照明装置及び該照明装置を備えた露光装置 |
| CN105260733A (zh) * | 2015-09-11 | 2016-01-20 | 北京百度网讯科技有限公司 | 用于处理图像信息的方法和装置 |
| US10665260B2 (en) * | 2018-01-30 | 2020-05-26 | Panasonic Intellectual Property Management Co., Ltd. | Optical disc recording device and optical disc recording method |
| JP7317579B2 (ja) | 2019-06-07 | 2023-07-31 | キヤノン株式会社 | 位置合わせ装置、位置合わせ方法、リソグラフィ装置、および、物品の製造方法 |
| WO2025157581A1 (en) * | 2024-01-25 | 2025-07-31 | Asml Netherlands B.V. | Image position measurement method |
| CN119722871B (zh) * | 2025-02-26 | 2025-05-16 | 荣芯半导体(宁波)有限公司 | 一种对准图像的生成方法和芯片对准方法 |
Family Cites Families (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4629313A (en) * | 1982-10-22 | 1986-12-16 | Nippon Kogaku K.K. | Exposure apparatus |
| JP3235387B2 (ja) * | 1991-07-12 | 2001-12-04 | オムロン株式会社 | 照明条件設定支援装置および方法 |
| US5525808A (en) * | 1992-01-23 | 1996-06-11 | Nikon Corporaton | Alignment method and alignment apparatus with a statistic calculation using a plurality of weighted coordinate positions |
| JP3224041B2 (ja) * | 1992-07-29 | 2001-10-29 | 株式会社ニコン | 露光方法及び装置 |
| JP3491346B2 (ja) * | 1994-08-22 | 2004-01-26 | 株式会社ニコン | 位置合わせ方法及びそれを用いた露光方法、並びに位置合わせ装置及びそれを用いた露光装置 |
| US5552611A (en) * | 1995-06-06 | 1996-09-03 | International Business Machines | Pseudo-random registration masks for projection lithography tool |
| JP3310524B2 (ja) * | 1996-02-08 | 2002-08-05 | 日本電信電話株式会社 | 外観検査方法 |
| JPH1173513A (ja) * | 1997-06-25 | 1999-03-16 | Matsushita Electric Works Ltd | パターン検査方法及びその装置 |
| JPH11307449A (ja) * | 1998-02-20 | 1999-11-05 | Canon Inc | 露光装置及びデバイスの製造方法 |
| EP1003128A4 (en) * | 1998-05-12 | 2005-02-23 | Omron Tateisi Electronics Co | MODEL REGISTRATION AUXILIARY METHOD, METHOD USING AUXILIARY REGULATORY AUXILIARY APPARATUS, AND IMAGE PROCESSOR |
| JP2000259830A (ja) * | 1999-03-05 | 2000-09-22 | Matsushita Electric Ind Co Ltd | 画像認識装置および画像認識方法 |
| JP3927774B2 (ja) * | 2000-03-21 | 2007-06-13 | キヤノン株式会社 | 計測方法及びそれを用いた投影露光装置 |
| JP3548501B2 (ja) * | 2000-05-15 | 2004-07-28 | ペンタックス株式会社 | カメラの操作部材 |
| JP2002352232A (ja) * | 2001-05-29 | 2002-12-06 | Matsushita Electric Ind Co Ltd | 画像入力装置 |
| JP2003329596A (ja) * | 2002-05-10 | 2003-11-19 | Mitsubishi Rayon Co Ltd | 欠陥検査装置及び欠陥検査方法 |
| US7349580B2 (en) * | 2003-06-03 | 2008-03-25 | Topcon Corporation | Apparatus and method for calibrating zoom lens |
| US7164518B2 (en) * | 2003-10-10 | 2007-01-16 | Yuping Yang | Fast scanner with rotatable mirror and image processing system |
-
2005
- 2005-02-03 JP JP2007501475A patent/JP4618691B2/ja not_active Expired - Fee Related
- 2005-02-03 WO PCT/JP2005/001595 patent/WO2006082639A1/ja not_active Ceased
-
2007
- 2007-06-29 US US11/771,587 patent/US20070253616A1/en not_active Abandoned
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH1082613A (ja) * | 1996-09-09 | 1998-03-31 | Matsushita Electric Ind Co Ltd | 視覚認識方法及び装置 |
| WO2002045023A1 (en) * | 2000-11-29 | 2002-06-06 | Nikon Corporation | Image processing method, image processing device, detection method, detection device, exposure method and exposure system |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008276466A (ja) * | 2007-04-27 | 2008-11-13 | Optex Fa Co Ltd | 画像処理装置 |
| JP2010191593A (ja) * | 2009-02-17 | 2010-09-02 | Honda Motor Co Ltd | 対象物の位置検出装置及び位置検出方法 |
| JP2010191590A (ja) * | 2009-02-17 | 2010-09-02 | Honda Motor Co Ltd | 対象物の位置検出装置及び位置検出方法 |
| JP2013042148A (ja) * | 2012-09-18 | 2013-02-28 | Bondtech Inc | 転写方法および転写装置 |
| JP2013048249A (ja) * | 2012-09-18 | 2013-03-07 | Bondtech Inc | 転写方法および転写装置 |
| JP2024505159A (ja) * | 2021-01-15 | 2024-02-05 | クリック アンド ソッファ インダストリーズ、インク. | ワイヤーボンディングおよびその他の電子部品パッケージング装置のためのインテリジェントパターン認識システムおよびそれに関連する方法 |
| JP2023092401A (ja) * | 2021-12-21 | 2023-07-03 | ファスフォードテクノロジ株式会社 | 実装装置、照明システムの調整方法および半導体装置の製造方法 |
| JP7788847B2 (ja) | 2021-12-21 | 2025-12-19 | ファスフォードテクノロジ株式会社 | 実装装置、照明システムの調整方法および半導体装置の製造方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| US20070253616A1 (en) | 2007-11-01 |
| JP4618691B2 (ja) | 2011-01-26 |
| JPWO2006082639A1 (ja) | 2008-06-26 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2006082639A1 (ja) | マーク画像処理方法、プログラム及び装置 | |
| US7545512B2 (en) | Method for automated measurement of three-dimensional shape of circuit boards | |
| US8164027B2 (en) | Laser processing system and laser processing method | |
| JP5174589B2 (ja) | 電子部品実装装置の自動焦点調整方法 | |
| JP7112341B2 (ja) | 実装装置および実装方法 | |
| JP2012043874A (ja) | 部品実装装置および部品検出方法 | |
| WO2012023250A1 (ja) | 部品実装装置および部品検出方法 | |
| JP2008306040A (ja) | ボンディング装置用撮像装置及び撮像方法 | |
| JPH11267868A (ja) | レ―ザ加工装置及びレ―ザ加工方法 | |
| JP2020153681A (ja) | 画像測定装置 | |
| JP2009053485A (ja) | オートフォーカス装置、オートフォーカス方法および計測装置 | |
| JPH11104871A (ja) | レーザ加工装置 | |
| JP5339884B2 (ja) | 撮像装置の焦点調整方法および焦点調整装置 | |
| WO2010087213A1 (ja) | 補正位置検出装置、補正位置検出方法及びボンディング装置 | |
| JP2002269560A (ja) | テンプレートマッチング方法、それを実行させるためのプログラムを記録したコンピュータ読み取り可能な記録媒体、テンプレートマッチング装置、位置決め装置および実装装置 | |
| JP3063677B2 (ja) | レーザ加工装置及びレーザ加工方法 | |
| JP2003156311A (ja) | アライメントマークの検出、登録方法及び装置 | |
| JP7161430B2 (ja) | 画像測定装置 | |
| JP7240913B2 (ja) | 画像測定装置 | |
| JP3858633B2 (ja) | ボンディング状態の検査方法 | |
| JP2010210389A (ja) | スポット位置測定方法および計測装置 | |
| JP2011066063A (ja) | 複数の表面実装装置からなる基板生産ライン | |
| JPH06277864A (ja) | レーザ加工装置 | |
| JP2014036068A (ja) | ダイボンダおよびボンディングツールの位置検出方法 | |
| JP4124554B2 (ja) | ボンディング装置 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| | WWE | Wipo information: entry into national phase | Ref document number: 11771587; Country of ref document: US |
| | WWE | Wipo information: entry into national phase | Ref document number: 2007501475; Country of ref document: JP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | WWP | Wipo information: published in national office | Ref document number: 11771587; Country of ref document: US |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 05709688; Country of ref document: EP; Kind code of ref document: A1 |
| | WWW | Wipo information: withdrawn in national office | Ref document number: 5709688; Country of ref document: EP |