WO2019164382A1 - Method for inspecting mounting state of component, printed circuit board inspection apparatus, and computer-readable recording medium - Google Patents
Method for inspecting mounting state of component, printed circuit board inspection apparatus, and computer-readable recording medium
- Publication number
- WO2019164382A1 (PCT/KR2019/002330)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- depth information
- component
- machine
- based model
- light sources
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05K—PRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
- H05K13/00—Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
- H05K13/08—Monitoring manufacture of assemblages
- H05K13/081—Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines
- H05K13/0817—Monitoring of soldering processes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/22—Measuring arrangements characterised by the use of optical techniques for measuring depth
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2509—Color coding
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/95—Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
- G01N21/956—Inspecting patterns on the surface of objects
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/95—Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
- G01N21/956—Inspecting patterns on the surface of objects
- G01N21/95684—Patterns showing highly reflecting parts, e.g. metallic elements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0475—Generative networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/09—Supervised learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/094—Adversarial learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/60—Image enhancement or restoration using machine learning, e.g. neural networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/586—Depth or shape recovery from multiple images from multiple light sources, e.g. photometric stereo
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05K—PRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
- H05K13/00—Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
- H05K13/08—Monitoring manufacture of assemblages
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05K—PRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
- H05K13/00—Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
- H05K13/08—Monitoring manufacture of assemblages
- H05K13/081—Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines
- H05K13/0815—Controlling of component placement on the substrate during or after manufacturing
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/95—Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
- G01N21/956—Inspecting patterns on the surface of objects
- G01N2021/95638—Inspecting patterns on the surface of objects for PCB's
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/047—Probabilistic or stochastic networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10152—Varying illumination
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30141—Printed circuit board [PCB]
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05K—PRINTED CIRCUITS; CASINGS OR CONSTRUCTIONAL DETAILS OF ELECTRIC APPARATUS; MANUFACTURE OF ASSEMBLAGES OF ELECTRICAL COMPONENTS
- H05K13/00—Apparatus or processes specially adapted for manufacturing or adjusting assemblages of electric components
- H05K13/08—Monitoring manufacture of assemblages
- H05K13/081—Integration of optical monitoring devices in assembly lines; Processes using optical monitoring devices specially adapted for controlling devices or machines in assembly lines
Definitions
- the present disclosure relates to a method for inspecting a mounting state of a component, a printed circuit board inspection apparatus, and a computer readable recording medium.
- in a surface mount technology (SMT) process, a screen printer prints solder paste on a printed circuit board, and a mounter mounts components on the printed circuit board on which the solder paste is printed.
- SMT Surface Mount Technology
- AOI Automated Optical Inspection
- the AOI device uses the captured image of the printed circuit board to check whether the components are normally mounted on the printed circuit board without dislocation, lifting or tilting.
- noise may arise from multiple reflections of the light irradiated onto the printed circuit board or from the image sensor's processing of the received light; that is, both optical noise and signal noise may be generated in various ways. If such noise is not reduced, the quality of the captured image of the printed circuit board generated by the AOI device may be degraded, and when that quality is degraded, the inspection of the mounting state of the components mounted on the printed circuit board using the captured image may not be performed correctly.
- the present disclosure may provide a printed circuit board inspection apparatus that inspects a mounting state of a component using noise-reduced depth information obtained based on depth information of the component.
- the present disclosure may provide a computer readable recording medium having recorded thereon a program including executable instructions for inspecting a mounting state of a part using noise-reduced depth information obtained based on depth information of the part.
- the present disclosure may provide a method of inspecting a mounting state of a part using noise-reduced depth information obtained based on depth information of the part.
- according to one embodiment of the present disclosure, a printed circuit board inspection apparatus may include a plurality of first light sources, an image sensor, a memory, and a processor. The memory stores a machine-learning based model that, when first depth information of a first object, generated using pattern light reflected from the first object among pattern light irradiated from a plurality of second light sources, is input, outputs the first depth information with reduced noise.
- the processor may generate second depth information on the component using the pattern light reflected from the component and received by the image sensor, input the second depth information to the machine-learning based model, obtain the noise-reduced second depth information from the machine-learning based model, and inspect the mounting state of the component using the noise-reduced second depth information.
- the machine-learning based model may be trained to output third depth information with reduced noise by using third depth information of a second object, generated using pattern light reflected from the second object among pattern light emitted from the plurality of second light sources, and fourth depth information of the second object, generated using pattern light reflected from the second object among pattern light emitted from a plurality of third light sources. When the first depth information is input, based on the training result, the machine-learning based model may output the first depth information with reduced noise.
- the number of the plurality of second light sources may be equal to the number of the plurality of first light sources, and the number of the plurality of third light sources may be greater than the number of the plurality of first light sources.
- the machine-learning based model may include a Convolutional Neural Network (CNN) or a Generative Adversarial Network (GAN).
- CNN Convolutional Neural Network
- GAN Generative Adversarial Network
- the processor may generate a three-dimensional image of the component using the noise-reduced second depth information, and may inspect the mounting state of the component using the three-dimensional image of the component.
- when visibility information about the first object is further input, the machine-learning based model may output the first depth information with reduced noise by using the visibility information.
- the machine-learning based model may be trained to output the third depth information with reduced noise by further using visibility information for the second object, generated using the pattern light reflected from the second object among the pattern light emitted from the plurality of second light sources, together with the third depth information and the fourth depth information. Based on the training result, when the first depth information and the visibility information about the first object are input, the machine-learning based model may output the first depth information with reduced noise.
- the processor may generate visibility information for the part using the pattern light reflected from the part and received by the image sensor, and may further input the visibility information for the part into the machine-learning based model.
- according to another embodiment, the memory stores a machine-learning based model configured to generate first depth information of the first object based on a plurality of depth information and to output the first depth information with reduced noise.
- the processor may generate a plurality of depth information for the part using the pattern light reflected from the part and received by the image sensor, input the plurality of depth information for the part to the machine-learning based model, and obtain second depth information with reduced noise from the machine-learning based model. The second depth information is generated by the machine-learning based model based on the plurality of depth information for the part, and the processor may check the mounting state of the component using the noise-reduced second depth information.
- the machine-learning based model may be trained to generate third depth information on the second object using a plurality of depth information of the second object, generated using pattern light reflected from the second object among pattern light emitted from the plurality of second light sources, and to reduce the noise in the third depth information using fourth depth information on the second object, generated using the pattern light reflected from the second object among the pattern light irradiated from the plurality of third light sources.
- based on the training result, when a plurality of depth information about the first object is input, the machine-learning based model may generate the first depth information and output the first depth information with reduced noise.
- the number of the plurality of second light sources may be equal to the number of the plurality of first light sources, and the number of the plurality of third light sources may be greater than the number of the plurality of first light sources.
- according to another embodiment, a non-transitory computer readable recording medium having recorded thereon a program to be performed on a computer is provided. The program, when executed by a processor, causes the processor to control a plurality of first light sources so that pattern light is irradiated onto a component mounted on a printed circuit board, to generate first depth information for the part using the pattern light reflected from the part and received by the image sensor, and to input the first depth information into a machine-learning based model.
- when first depth information about the first object, generated using the pattern light reflected from the first object among the pattern light emitted from the plurality of second light sources, is input, the machine-learning based model may output the first depth information with reduced noise.
- the machine-learning based model may be trained to output third depth information with reduced noise using third depth information of the second object, generated using the pattern light reflected from the second object among the pattern light emitted from the plurality of second light sources, and fourth depth information of the second object, generated using the pattern light reflected from the second object among the pattern light emitted from the plurality of third light sources.
- when the first depth information is input, based on the training result, the first depth information with reduced noise may be output.
- the number of the plurality of second light sources may be equal to the number of the plurality of first light sources, and the number of the plurality of third light sources may be greater than the number of the plurality of first light sources.
- the machine-learning based model may include a Convolutional Neural Network (CNN) or a Generative Adversarial Network (GAN).
- CNN Convolutional Neural Network
- GAN Generative Adversarial Network
- the executable instructions may further cause the processor to perform the steps of generating a three-dimensional image of the component using the noise-reduced second depth information, and inspecting the mounting state of the component using the three-dimensional image.
- when visibility information on the first object is further input, the machine-learning based model may output the first depth information with reduced noise using the visibility information.
- the executable instructions may further cause the processor to perform the steps of generating visibility information for the component using the pattern light reflected from the component and received by the image sensor, and further inputting the visibility information for the component into the machine-learning based model.
- according to another embodiment, a method of inspecting a mounting state of a component includes: controlling a plurality of first light sources so that pattern light is irradiated onto a component mounted on a printed circuit board; generating first depth information for the component using the pattern light reflected from the component and received by an image sensor; inputting the first depth information into a machine-learning based model; acquiring the first depth information with reduced noise from the machine-learning based model; and checking the mounting state of the component using the first depth information with reduced noise. When first depth information about a first object, generated using the pattern light reflected from the first object among the pattern light emitted from a plurality of second light sources, is input, the machine-learning based model may output the first depth information with reduced noise.
- the machine-learning based model may be trained to output the third depth information with reduced noise using third depth information of the second object, generated using the pattern light reflected from the second object among the pattern light emitted from the plurality of second light sources, and fourth depth information of the second object, generated using the pattern light reflected from the second object among the pattern light emitted from the plurality of third light sources.
- according to various embodiments of the present disclosure, an apparatus for inspecting a printed circuit board may reduce noise in depth information of a part by processing the depth information through a machine-learning based model, and may use the noise-reduced depth information of the part to inspect the mounting state of components mounted on the printed circuit board.
- even when a relatively small number of image data are acquired to generate depth information, the printed circuit board inspection apparatus may use the machine-learning based model to remove noise, such as unreceived signals or peak signals, from the depth information on the part, and to restore a lost shape of the part in the depth information.
- the apparatus for inspecting a printed circuit board may preserve the three-dimensional sharpness of the corners of the parts as much as possible, without excessively smoothing the joint shape of the parts and without additionally distorting the measured shape of any foreign material.
- since noise is reduced in the depth information of the component and the shape is restored to a level as close as possible to the actual shapes of the component and the solder paste, the mounting state of the component on the printed circuit board can be checked more accurately.
- FIG. 1 illustrates a printed circuit board inspection apparatus according to various embodiments of the present disclosure.
- FIG. 2 is a block diagram of an apparatus for inspecting a printed circuit board according to various embodiments of the present disclosure.
- FIG. 3 is a flowchart of a method of inspecting a mounting state of components by a printed circuit board inspection apparatus according to various embodiments of the present disclosure.
- FIGS. 4A through 4C are conceptual views illustrating a method of learning a machine-learning based model according to various embodiments of the present disclosure.
- FIGS. 5A through 5C are conceptual diagrams for describing an operation of a machine-learning based model according to various embodiments of the present disclosure.
- FIG. 6 is a conceptual diagram illustrating a learning method of a machine-learning based model according to various embodiments of the present disclosure.
- FIG. 7 is a diagram for describing a method of obtaining depth information used for learning a machine-learning based model according to various embodiments of the present disclosure.
- FIG. 8 illustrates an image of a component generated using depth information with reduced noise by a printed circuit board inspection apparatus according to various embodiments of the present disclosure.
- Embodiments of the present disclosure are illustrated for the purpose of describing the technical spirit of the present disclosure.
- the scope of the present disclosure is not limited to the embodiments set forth below or the detailed description of these embodiments.
- the term “unit” refers to software or a hardware component such as a field-programmable gate array (FPGA) or an application specific integrated circuit (ASIC).
- the “unit” is not limited to software or hardware.
- the “unit” may be configured to reside in an addressable storage medium, and may be configured to execute on one or more processors.
- the “unit” includes components such as software components, object-oriented software components, class components, and task components, as well as processors, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuits, data, databases, data structures, tables, arrays, and variables. Functions provided within components and “units” may be combined into a smaller number of components and “units” or further separated into additional components and “units”.
- the expression “based on” is used to describe one or more factors that affect a decision, a judgment, or an action described in the phrase or sentence containing the expression; the expression does not exclude additional factors that may affect that decision, judgment, or action.
- when a component is referred to as being “connected” or “coupled” to another component, it may be directly connected or coupled to the other component, or it may be connected or coupled via a new intervening component.
- FIG. 1 illustrates a printed circuit board inspection apparatus according to various embodiments of the present disclosure.
- the printed circuit board inspection apparatus 100 may inspect a mounting state of at least one component mounted on the printed circuit board 110.
- the transfer unit 120 may move the printed circuit board 110 to a preset position to inspect the mounting state of the component.
- the transfer unit 120 may move the printed circuit board 110, for which the inspection has been completed, away from the preset position, and may move another printed circuit board 111 to the preset position.
- the printed circuit board inspection apparatus 100 may include a first light source 101, an image sensor 102, and a frame 103.
- the first light source 101 and the image sensor 102 may be fixed to the frame 103.
- the number and arrangement state of each of the first light source 101, the image sensor 102, and the frame 103 illustrated in FIG. 1 are for illustrative purposes only and are not limited thereto.
- for example, one first light source 101 may be disposed at the position of the image sensor 102 shown in FIG. 1, and a plurality of image sensors may be disposed at the positions of the first light sources 101 shown in FIG. 1.
- the first light source 101 and the image sensor 102 may be arranged in various directions and angles through the plurality of frames 103.
- the first light source 101 may irradiate the patterned light to the printed circuit board 110 moved to a preset position to inspect the mounting state of the component.
- the first light sources 101 may be arranged to have different irradiation directions, different irradiation angles, and the like.
- pitch intervals of the pattern light irradiated from the first light sources 101 may be different from each other.
- the patterned light may be light having a pattern having a constant period, which is irradiated to measure a three-dimensional shape for the printed circuit board 110.
- the first light source 101 may irradiate pattern light in which the brightness of the stripes follows a sine wave, on-off pattern light in which bright and dark portions are displayed repeatedly, triangular-wave pattern light in which the change in brightness follows a triangular waveform, or the like. However, this is only for the purpose of explanation and the present disclosure is not limited thereto; the first light source 101 may irradiate light including various types of patterns in which the change in brightness is repeated with a certain period.
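- to make such patterns concrete, below is a minimal Python sketch that synthesizes phase-shifted sinusoidal fringe images; the resolution, fringe period, and brightness range are arbitrary assumptions for illustration:

```python
import numpy as np

def sine_fringe(width, height, period_px, phase=0.0):
    """Vertical sinusoidal fringe pattern: brightness varies as a sine wave
    along x and repeats every `period_px` pixels."""
    x = np.arange(width)
    row = 0.5 + 0.5 * np.sin(2 * np.pi * x / period_px + phase)
    return np.tile(row, (height, 1))  # (height, width), values in [0, 1]

# e.g. four patterns shifted by 90 degrees each, as a grating transfer
# mechanism such as a piezo actuator might produce
patterns = [sine_fringe(640, 480, 32, k * np.pi / 2) for k in range(4)]
```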
- the image sensor 102 may receive patterned light reflected from the printed circuit board 110 and components mounted on the printed circuit board 110.
- the image sensor 102 may generate image data using the received pattern light.
- FIG. 2 is a block diagram of an apparatus for inspecting a printed circuit board according to various embodiments of the present disclosure.
- the printed circuit board inspection apparatus 100 may include a first light source 210, an image sensor 220, a memory 230, and a processor 240.
- the printed circuit board inspection apparatus 100 may further include a communication circuit 250.
- Each component included in the printed circuit board inspection apparatus 100 may be electrically connected to each other to transmit and receive signals, data, and the like.
- the printed circuit board inspection apparatus 100 may include a plurality of first light sources 210.
- the first light source 210 may irradiate pattern light onto an inspection object (eg, a printed circuit board).
- the first light source 210 may irradiate the pattern light to the entire inspection object or irradiate the pattern light to an object (eg, a component mounted on a printed circuit board) included in the inspection object.
- in the following description, the first light source 210 is described as irradiating pattern light onto a component mounted on a printed circuit board; however, the present disclosure is not limited thereto, and the first light source 210 may irradiate the pattern light onto the entire printed circuit board to be inspected or onto one region of the printed circuit board including at least one component mounted on the printed circuit board.
- the first light source 210 may include a light source (not shown), a grating (not shown), a grating transfer device (not shown), and a projection lens unit (not shown).
- the grating may convert light irradiated from the light source into pattern light.
- the grating may be conveyed through a grating transfer mechanism such as, for example, a piezo actuator (PZT) to generate phase shifted pattern light.
- the projection lens unit may cause the pattern light generated by the grating to be irradiated to the component mounted on the printed circuit board, which is an object included in the inspection object.
- the first light source 210 may form pattern light through various methods such as liquid crystal display (LCD), digital light processing (DLP), liquid crystal on silicon (LCOS), and the like,
- LCD liquid crystal display
- DLP digital light processing
- LCOS liquid crystal on silicon
- and may irradiate the formed pattern light onto the components mounted on the printed circuit board, which are objects included in the inspection target.
- the image sensor 220 may receive patterned light reflected from the component.
- the image sensor 220 may receive pattern light reflected from the component to generate image data of the component.
- the image sensor 220 may transmit image data about the generated part to the processor 240.
- the memory 230 may store instructions or data related to at least one other component of the printed circuit board inspection apparatus 100.
- the memory 230 may store software and / or a program.
- the memory 230 may include an internal memory or an external memory.
- the internal memory may include at least one of volatile memory (eg, DRAM, SRAM, or SDRAM), and nonvolatile memory (eg, flash memory, hard drive, or solid state drive (SSD)).
- the external memory may be functionally or physically connected to the printed circuit board inspection apparatus 100 through various interfaces.
- the memory 230 may store instructions for operating the processor 240.
- the memory 230 may store instructions for allowing the processor 240 to control other components of the printed circuit board inspection apparatus 100 and to interoperate with an external electronic device or a server.
- the processor 240 may control other components of the printed circuit board inspection apparatus 100 based on the instructions stored in the memory 230, and interwork with an external electronic device or a server.
- hereinafter, the operation of the printed circuit board inspection apparatus 100 will be described mainly with reference to each component of the printed circuit board inspection apparatus 100.
- instructions for performing an operation by each component may be stored in the memory 230.
- memory 230 may store a machine-learning based model.
- the machine-learning based model may receive first depth information about the first object generated using the pattern light reflected from the first object among the pattern light emitted from the plurality of second light sources.
- the first depth information may include at least one of a shape, color information for each pixel, brightness information, and a height value.
- the plurality of second light sources and the plurality of first light sources 210 may be the same or different.
- even when the plurality of second light sources differ from the plurality of first light sources 210, the number of the plurality of second light sources may be the same as the number of the plurality of first light sources 210.
- in addition, the arrangement positions of the plurality of second light sources in another printed circuit board inspection apparatus may correspond to the arrangement positions of the plurality of first light sources 210 in the printed circuit board inspection apparatus 100.
- the machine-learning based model may output the first depth information with reduced noise.
- noise may be generated in the first depth information, which is generated using the pattern light reflected from the first object, by multiple reflection of the pattern light irradiated onto the first object or in the process of the image sensor processing the received light.
- the noise may be a portion of the first depth information that is determined not to correspond to the shape of the first object or not to be related to the first object.
- the machine-learning based model may be trained to output first depth information with reduced noise in order to improve the quality of the image for the first object, for example the three-dimensional image for the first object.
- machine-learning based models may include a Convolutional Neural Network (CNN), a Generative Adversarial Network (GAN), and the like.
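- as a concrete illustration, the following is a minimal PyTorch sketch of a convolutional denoiser of the kind named above; the layer sizes, residual design, and input shapes are illustrative assumptions, not the disclosure's specification:

```python
import torch
import torch.nn as nn

class DepthDenoiser(nn.Module):
    """Small CNN mapping a noisy depth map to a denoised one.
    Architecture details are assumptions; the disclosure only names CNN/GAN."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),
        )

    def forward(self, depth):  # depth: (N, 1, H, W)
        # Predict a residual correction and subtract it from the input,
        # a common design for image-restoration networks.
        return depth - self.net(depth)

# usage sketch
model = DepthDenoiser()
noisy = torch.randn(1, 1, 128, 128)  # stand-in for first depth information
denoised = model(noisy)
```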
- the machine-learning based model may be stored in a memory of an external electronic device or a server interworked with the printed circuit board inspection apparatus 100 by wire or wirelessly.
- the printed circuit board inspection apparatus 100 may transmit and receive information to and from an external electronic device or a server, interworked by wire or wirelessly, to reduce noise in the first depth information.
- the processor 240 may drive an operating system or an application program to control at least one other component of the printed circuit board inspection apparatus 100, and may perform various data processing and calculations.
- the processor 240 may include a central processing unit or the like, and may also be implemented as a system on chip (SoC).
- SoC system on chip
- the communication circuit 250 may communicate with an external electronic device or an external server.
- the communication circuit 250 may establish communication between the printed circuit board inspection apparatus 100 and the external electronic device.
- the communication circuit 250 may be connected to a network through wireless or wired communication to communicate with an external electronic device or an external server.
- the communication circuit 250 may be connected to the external electronic device by wire to perform communication.
- wireless communication may include, for example, cellular communication (eg, LTE, LTE Advanced (LTE-A), Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), Universal Mobile Telecommunications System (UMTS), Wireless Broadband (WiBro), and the like).
- the wireless communication may include short-range wireless communication (eg, wireless fidelity (WiFi), light fidelity (LiFi), Bluetooth, Bluetooth Low Energy (BLE), Zigbee, Near Field Communication (NFC), etc.).
- WiFi wireless fidelity
- LiFi light fidelity
- BLE Bluetooth Low Energy
- NFC Near Field Communication
- the processor 240 may generate second depth information for the component using the patterned light reflected from the component mounted on the printed circuit board received by the image sensor 220.
- the processor 240 may generate the second depth information about the component using an image of the component that the image sensor 220 generated using the pattern light reflected from the component.
- alternatively, the image sensor 220 may transmit information about the received pattern light to the processor 240, and the processor 240 may generate an image of the part and use the image of the part to generate the second depth information on the part.
- the processor 240 may generate the second depth information on the part by applying an optical triangulation method or a bucket algorithm to the image of the part.
- this is only for the purpose of description and the present invention is not limited thereto, and the second depth information of the component may be generated through various methods.
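- for instance, here is a minimal sketch of the 4-bucket phase-shift computation that such a depth calculation could build on; the phase-to-height scale factor and the synthetic input are simplified assumptions (a real system would also need phase unwrapping and calibration):

```python
import numpy as np

def four_bucket_phase(i0, i1, i2, i3):
    """Recover the wrapped fringe phase from four images captured with
    pattern light phase-shifted by 0, 90, 180, and 270 degrees."""
    return np.arctan2(i3 - i1, i0 - i2)  # wrapped to (-pi, pi]

def phase_to_height(phase, scale=1.0):
    """Convert phase to height; the scale factor depends on the projection
    geometry (triangulation angle, fringe pitch) and is assumed here."""
    return scale * phase

# usage with synthetic fringe images shaped (H, W)
h, w = 64, 64
x = np.linspace(0, 4 * np.pi, w)
shifts = [0, np.pi / 2, np.pi, 3 * np.pi / 2]
imgs = [100 + 50 * np.cos(x + s) * np.ones((h, 1)) for s in shifts]
height = phase_to_height(four_bucket_phase(*imgs))
```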
- the processor 240 may input the second depth information into the machine-learning based model.
- the processor 240 may directly input the second depth information to the machine-learning based model.
- the processor 240 may control the communication circuit 250 to transmit the second depth information to the external electronic device or the external server.
- processor 240 may obtain second depth information with reduced noise from a machine-learning based model.
- the processor 240 may obtain second depth information with reduced noise directly from the machine-learning based model.
- the processor 240 may obtain the second depth information with reduced noise from the external electronic device or the external server through the communication circuit 250.
- the processor 240 may check the mounting state of the component mounted on the printed circuit board using the second depth information with reduced noise. For example, the processor 240 may generate a 3D image of the component using the noise-reduced second depth information, and may inspect the mounting state of the component using the generated 3D image. For example, the processor 240 may use the three-dimensional image of the part to inspect its mounting state by checking whether the part is mounted at a predetermined position, whether the part is mounted in a preset direction, whether at least a portion of the part is tilted, and whether there is a foreign object on the part.
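- as a loose illustration of those checks, the sketch below applies toy rules to a denoised height map; the segmentation method, tolerances, and units are hypothetical, not taken from the disclosure:

```python
import numpy as np

def inspect_mounting(depth, expected_center, xy_tol_px=2.0, tilt_tol_deg=5.0):
    """Toy rule checks on a denoised depth map of one component.
    depth: (H, W) height map; expected_center: (row, col) in pixels."""
    mask = depth > depth.mean()  # crude segmentation of the component
    ys, xs = np.nonzero(mask)
    center = np.array([ys.mean(), xs.mean()])
    offset_ok = np.linalg.norm(center - expected_center) <= xy_tol_px

    # Fit a plane to the component's top surface to estimate tilt.
    A = np.c_[xs, ys, np.ones(len(xs))]
    coeffs, *_ = np.linalg.lstsq(A, depth[ys, xs], rcond=None)
    tilt = np.degrees(np.arctan(np.hypot(coeffs[0], coeffs[1])))
    return offset_ok and tilt <= tilt_tol_deg
```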
- the machine-learning based model may output first depth information with reduced noise using the visibility information.
- the visibility information is information representing a degree of noise, and the machine-learning based model may use the visibility information to more effectively reduce noise in the first depth information.
- the processor 240 may generate visibility information about the component using the patterned light reflected from the component received by the image sensor 220.
- the visibility information represents the ratio of the amplitude B_i(x, y) to the average brightness A_i(x, y) in the brightness signal of the image data, and generally increases as the reflectance increases.
- V_i(x, y) can be expressed by Equation (1): V_i(x, y) = B_i(x, y) / A_i(x, y) ... (1)
- pattern light from each of the plurality of first light sources 210 may be irradiated onto the printed circuit board in various directions to generate a plurality of image data for the component by the image sensor 220 or the processor 240.
- the processor 240 may extract N brightness values I_i^1, I_i^2, ..., I_i^N at each position i(x, y) of the X-Y coordinate system from the generated plurality of image data,
- and may calculate the average brightness A_i(x, y) and the amplitude B_i(x, y) using an N-bucket algorithm.
- the processor 240 may then calculate the visibility information V_i(x, y) using the computed amplitude B_i(x, y) and average brightness A_i(x, y).
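- a small sketch of that N-bucket computation follows; the uniform 2*pi/N phase step and the stacking of the input images are assumptions for illustration:

```python
import numpy as np

def visibility(images):
    """Compute A_i, B_i, and V_i = B_i / A_i per pixel from N images taken
    under pattern light phase-shifted by 2*pi/N between captures."""
    stack = np.stack(images)  # (N, H, W)
    n = len(images)
    phases = 2 * np.pi * np.arange(n) / n
    a = stack.mean(axis=0)  # average brightness A_i(x, y)
    # Amplitude B_i from the first harmonic of the bucket signal.
    c = np.tensordot(np.cos(phases), stack, axes=1) * 2 / n
    s = np.tensordot(np.sin(phases), stack, axes=1) * 2 / n
    b = np.hypot(c, s)
    return a, b, b / np.maximum(a, 1e-9)  # guard against divide-by-zero
```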
- the processor 240 may further input visibility information on the generated part into the machine-learning based model.
- the machine-learning based model may receive a plurality of depth information about the first object, generated using the pattern light reflected from the first object among the pattern light emitted from the plurality of second light sources. Since each of the plurality of second light sources irradiates pattern light onto the first object, and the pattern light irradiated by each of the plurality of second light sources is reflected from the first object and received by the image sensor, a plurality of depth information about the first object can be generated.
- the machine-learning based model may generate first depth information about the first object based on the plurality of depth information and output first depth information with reduced noise.
- first depth information may be generated based on the plurality of depth information about the first object as representative depth information about the first object.
- the processor 240 may use the pattern light reflected from the component and received by the image sensor 220 to generate a plurality of depth information about the component. Since each of the plurality of first light sources irradiates pattern light onto the part, and the pattern light irradiated by each of the plurality of first light sources is reflected from the part and received by the image sensor 220, a plurality of depth information about the part can be generated.
- the processor 240 may input a plurality of depth information about the component into the machine-learning based model.
- each of the plurality of first light sources 210 radiates pattern light to a component mounted on a printed circuit board, and the image sensor 220 uses the pattern light reflected from the component to generate a plurality of image data about the component. Can be generated.
- the image sensor 220 may transfer a plurality of image data to the processor 240.
- the processor 240 may generate a plurality of depth information about the component by using the plurality of image data.
- processor 240 may obtain second depth information with reduced noise from a machine-learning based model.
- the second depth information may be generated by the machine-learning based model based on the plurality of depth information for the part.
- the second depth information may be generated based on the plurality of depth information about the part as the representative depth information about the part.
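- one plausible way to form such representative depth information, sketched here as a simple per-pixel fusion (the disclosure leaves the aggregation to the learned model, so the median below is only an assumed stand-in):

```python
import numpy as np

def representative_depth(depth_maps):
    """Fuse per-light-source depth maps into one representative map with a
    per-pixel median, which suppresses outliers from multiple reflection."""
    return np.median(np.stack(depth_maps), axis=0)
```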
- the machine-learning based model may receive a plurality of image data for the first object generated using the pattern light reflected from the first object among the pattern light emitted from the plurality of second light sources.
- the machine-learning based model may generate first depth information on the first object and output first depth information with reduced noise using the plurality of image data.
- a detailed method by which the machine-learning based model is trained to generate the first depth information and to output the first depth information with reduced noise will be described later.
- the processor 240 may input a plurality of image data for the component generated using the patterned light reflected from the component received by the image sensor 220 into the machine-learning based model.
- alternatively, the processor 240 may generate the plurality of image data of the part by using the information on the pattern light reflected from the part and received by the image sensor 220, and may input the generated plurality of image data into the machine-learning based model.
- the processor 240 may obtain the second depth information with reduced noise from the machine-learning based model.
- the second depth information may be generated by the machine-learning based model based on the plurality of image data.
- FIG. 3 is a flowchart of a method of inspecting a mounting state of components by a printed circuit board inspection apparatus according to various embodiments of the present disclosure.
- the apparatus 100 for inspecting printed circuit boards may irradiate patterned light onto a component mounted on the printed circuit board.
- the processor of the printed circuit board inspection apparatus 100 may control the plurality of first light sources such that pattern light is irradiated to each of the plurality of components mounted on the printed circuit board as the inspection target.
- the printed circuit board inspection apparatus 100 may receive pattern light reflected from the component, and generate second depth information on the component using the patterned light.
- the image sensor may generate an image of the part using the pattern light reflected from the part, and transmit the generated image of the part to the processor.
- the processor may generate second depth information about the part using the image of the part received from the first image sensor.
- the printed circuit board inspection apparatus 100 may input the second depth information into the machine-learning based model.
- the processor may directly input the second depth information to the machine-learning based model.
- the processor may control the communication circuit to transmit the second depth information to the external electronic device or the external server.
- the printed circuit board inspection apparatus 100 may obtain second depth information with reduced noise from the machine-learning based model.
- the processor may obtain second depth information with reduced noise directly from the machine-learning based model.
- the processor may obtain second depth information with reduced noise through a communication circuit from the external electronic device or an external server.
- the apparatus for inspecting a printed circuit board may inspect the mounting state of the component using the second depth information with reduced noise.
- the processor may generate a 3D image of the component using the second depth information with reduced noise.
- the processor may inspect the mounting state of the component using a 3D image of the generated component.
- FIGS. 4A through 4C are conceptual views illustrating a method of learning a machine-learning based model according to various embodiments of the present disclosure.
- the machine-learning based model 410 may be trained to output third depth information 413 with reduced noise by using third depth information 411 of a second object, generated using pattern light reflected from the second object among pattern light emitted from the plurality of second light sources, and fourth depth information 412 of the second object, generated using pattern light reflected from the second object among pattern light emitted from the plurality of third light sources.
- based on the result of being trained to output the third depth information 413 with reduced noise, the machine-learning based model 410 may output first depth information with reduced noise even when first depth information for a first object different from the second object used for training is input.
- the third depth information 411 and the fourth depth information 412 may be input to the machine-learning based model 410 for learning.
- the number of the plurality of third light sources that irradiate the pattern light used to generate the fourth depth information 412 is greater than the number of the plurality of first light sources; since the number of the plurality of second light sources is equal to the number of the plurality of first light sources, the number of the plurality of third light sources is also greater than the number of the plurality of second light sources.
- because the number of the plurality of third light sources is greater than the number of the plurality of second light sources, the number of images of the second object used in generating the fourth depth information 412 may be larger than the number of images of the second object used in generating the third depth information 411.
- all of the plurality of images used in generating the fourth depth information 412 are images of the second object; however, the images may be different from each other.
- likewise, all of the plurality of images used in generating the third depth information 411 are images of the second object, but they may be different from each other.
- the plurality of fourth light sources may irradiate light onto the second object with at least one of an irradiation direction, an irradiation angle, and a pitch interval different from those of the plurality of third light sources.
- since the number of images of the second object used in generating the fourth depth information 412 is greater than the number used in generating the third depth information 411, the fourth depth information 412 may contain relatively less noise than the third depth information 411. Accordingly, the shape of the object measured through depth information generated using a larger number of light sources can be closer to the actual shape of the object than the shape measured through depth information generated using a smaller number of light sources.
- since the fourth depth information 412 contains relatively less noise than the third depth information 411, the fourth depth information 412 may be used as reference depth information while the machine-learning based model 410 is trained to detect and reduce the noise in the third depth information 411.
- the machine-learning based model 410 may be trained to transform the third depth information 411 to converge on the fourth depth information 412.
- the third depth information 411 converted to converge to the fourth depth information 412 is referred to as transform depth information.
- the machine-learning based model 410 may compare the transform depth information and the fourth depth information 412.
- the machine-learning based model 410 may adjust a parameter for transforming the third depth information 411 based on the comparison result.
- the machine-learning based model 410 may repeat the above process to determine the parameters for transforming the third depth information 411 so that the third depth information 411 converges on the fourth depth information 412.
- the machine-learning based model 410 may be trained to convert the third depth information 411 to converge on the fourth depth information 412.
- the machine-learning based model 410 may be trained to output transform depth information as third depth information 414 with reduced noise.
- because the machine-learning based model 410 is trained to transform the third depth information 411 to converge on the fourth depth information 412, the shape of the object can be measured more accurately even when the number of images available for generating the depth information is relatively small.
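- a compact sketch of that training loop under simple assumptions (the L1 loss and the reuse of the denoiser sketched earlier are illustrative choices; the disclosure only describes transforming the depth information toward the lower-noise reference):

```python
import torch.nn as nn

def train_step(model, optimizer, third_depth, fourth_depth):
    """One update pushing the transformed third depth information toward the
    lower-noise fourth depth information; tensors shaped (N, 1, H, W)."""
    optimizer.zero_grad()
    transformed = model(third_depth)  # the transform depth information
    loss = nn.functional.l1_loss(transformed, fourth_depth)
    loss.backward()  # adjusts the parameters of the transformation
    optimizer.step()
    return loss.item()
```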
- the machine-learning based model 410 may be trained to detect noise in the third depth information 411.
- the machine-learning based model 410 may be trained to detect noise in the third depth information 411 and to output the third depth information 414 with reduced noise by reducing the detected noise.
- the machine-learning based model 410 may be trained to detect a first portion determined as noise in the third depth information 411 by comparing the transform depth information with the third depth information 411.
- the machine-learning based model 410 may be trained to detect a portion where a difference between the transform depth information and the third depth information 411 is greater than or equal to a preset threshold as the first portion.
- the machine-learning based model 410 may be trained to output the third depth information 413 with reduced noise by reducing the noise detected in the third depth information 411.
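- for example, a sketch of that thresholded comparison; the threshold value and array shapes are assumptions:

```python
import numpy as np

def detect_noise(third_depth, transform_depth, threshold=0.05):
    """Flag pixels where the original depth deviates from the transformed
    (converged) depth by at least the preset threshold; treat them as noise."""
    noise_mask = np.abs(transform_depth - third_depth) >= threshold
    denoised = np.where(noise_mask, transform_depth, third_depth)
    return noise_mask, denoised
```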
- the machine-learning based model 420 may output third depth information 422 with reduced noise by using the third depth information 411, the fourth depth information 412, and visibility information 421 for the second object, generated using the pattern light reflected from the second object among the pattern light emitted from the plurality of second light sources.
- based on the result of being trained to output the third depth information 422 with reduced noise, the machine-learning based model 420 may output first depth information with reduced noise even when first depth information for a first object different from the second object used for training and visibility information about the first object are input.
- the third depth information 411, the fourth depth information 412, and the visibility information 421 may be input to the machine-learning based model 420.
- the machine-learning based model 420 may be trained to adjust the transform depth information using the visibility information 421 to more accurately represent the shape of the second object.
- the visibility information 421 is information indicating the degree of noise generated in the third depth information 411, which is depth information on the second object, and can indicate whether the third depth information 411 is a good measurement value.
- the machine-learning based model 420 may be trained to determine whether there is a second portion of the visibility information 421 that is greater than or equal to a preset threshold.
- the machine-learning based model 420 determines a part corresponding to the second part of the transform depth information, and corresponds to the second part based on the visibility information 421. Can be learned to adjust the part of the speech.
- the machine-learning based model 421 may be trained to output the adjusted transform depth information as the third depth information 422 with reduced noise.
- the machine-learning based model 420 determines not to adjust the transform depth information when the second part does not exist, and outputs the transform depth information as the third depth information 422 with reduced noise. Can be learned to.
- the machine-learning based model 420 uses the visibility information 421 to more accurately detect noise, and is determined to be noise even if it is not actual noise among the first portions determined to be noise. Can be learned to detect the third portion.
- the machine-learning based model 420 may determine that the first part except the third part except the third part as the noise in the third depth information 411 when the third part is detected. Can be learned.
- the machine-learning based model 420 determines the first part as noise in the third depth information 411, and reduces the noise determined in the third depth information 411. By doing so, it can be learned to output the third depth information 422 with reduced noise.
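- The visibility-based refinement can be sketched as a mask operation, assuming the visibility map is a numpy array scaled so that higher values indicate better-measured pixels; the threshold value and function name are hypothetical.

```python
import numpy as np

def refine_noise_mask(first_portion, visibility, vis_threshold=0.8):
    """Remove false detections from the noise mask: pixels flagged as noise
    whose visibility meets the preset threshold (the 'second portion') are
    treated as the 'third portion' -- flagged, but not actual noise."""
    second_portion = visibility >= vis_threshold
    third_portion = first_portion & second_portion
    return first_portion & ~third_portion   # noise actually to be reduced
```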
- the machine-learning based model 430 may be trained to generate third depth information 411 for the second object using a plurality of depth information 431, 432, 433 and 434 about the second object, generated using the pattern light reflected from the second object among the pattern light emitted from the plurality of second light sources.
- the machine-learning based model 430 may be trained to output the third depth information 411 with reduced noise, using the generated third depth information 411 and the fourth depth information 412.
- a detailed method of training the machine-learning based model 430 to output the noise-reduced third depth information 411 using the third depth information 411 and the fourth depth information 412 is the same as that described with reference to FIG. 4A, so a separate description is omitted.
- the plurality of depth information 431, 432, 433 and 434 for the second object, generated using the pattern light reflected from the second object among the pattern light emitted from the plurality of second light sources, may be input to the machine-learning based model 430.
- the machine-learning based model 430 may be trained to generate third depth information 411, which is representative depth information about the second object, using the plurality of depth information 431, 432, 433, and 434.
- a plurality of image data of the second object, generated using the pattern light reflected from the second object among the pattern light emitted from the plurality of second light sources, may be input to the machine-learning based model 430.
- the machine-learning based model 430 may be trained to generate the third depth information 411 for the second object by using the plurality of image data of the second object.
- a plurality of image data for a second object generated by using the pattern light reflected from the second object among the pattern light emitted from the plurality of second light sources is input to the machine-learning based model 430.
- the machine-learning based model 430 may be trained to generate third depth information 411 using the plurality of image data.
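- One plausible fusion rule for turning the per-light-source maps into representative depth information is a pixelwise median, sketched below; the disclosure does not specify the fusion, so this is an assumption. For four 2D maps d1..d4, `representative_depth([d1, d2, d3, d4])` yields one map of the same shape.

```python
import numpy as np

def representative_depth(depth_maps):
    """Fuse per-light-source depth maps (e.g., 431, 432, 433 and 434) into
    one representative map; the median is robust to per-source outliers
    such as shadowed or saturated pixels."""
    return np.median(np.stack(depth_maps, axis=0), axis=0)
```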
- the fourth depth information 412 may be generated by a printed circuit board inspection apparatus including a plurality of fourth light sources whose number is greater than the number of the plurality of third light sources.
- alternatively, the fourth depth information 412 may be generated by a printed circuit board inspection apparatus including a plurality of third light sources whose number is smaller than the number of the plurality of fourth light sources; a specific method of generating the fourth depth information 412 in this case is described with reference to FIG. 7.
- 5A through 5C are conceptual diagrams for describing an operation of a machine-learning based model according to various embodiments of the present disclosure.
- first depth information 511 of a first object, generated using the pattern light reflected from the first object among the pattern light emitted from the plurality of second light sources, may be input to the machine-learning based model 510.
- the machine-learning based model 510 may output the first depth information 512 with reduced noise.
- depth information generated using the pattern light emitted from a plurality of fourth light sources, whose number is greater than the number of the plurality of third light sources, may be referred to as reference depth information.
- the machine-learning based model 510 may generate transformed depth information by converting the first depth information 511 so that it converges on the reference depth information.
- the machine-learning based model 510 may output the transformed depth information as the first depth information 512 with reduced noise.
- the machine-learning based model 510 may detect noise in the first depth information 511.
- the machine-learning based model 510 may detect noise in the first depth information 511 and output the first depth information 512 in which the noise is reduced by reducing the detected noise.
- the machine-learning based model 510 may detect the first portion determined to be noise in the first depth information 511 by comparing the transformed depth information with the first depth information 511. For example, the machine-learning based model 510 may detect, as the first portion, a portion where the difference between the transformed depth information and the first depth information 511 is greater than or equal to a preset threshold. The machine-learning based model 510 may output the first depth information 512 with reduced noise by reducing the noise detected in the first depth information 511.
- visibility information 521 for the first object, generated using the pattern light reflected from the first object among the pattern light emitted from the plurality of second light sources, may be input to the machine-learning based model 520.
- the machine-learning based model 520 may output the first depth information 513 with reduced noise by using the visibility information 521.
- the machine-learning based model 520 may determine whether there is a second portion in which the visibility information 521 is greater than or equal to a preset threshold. For example, when the second portion exists, the machine-learning based model 520 may determine the portion of the transformed depth information corresponding to the second portion and adjust that portion based on the visibility information. The machine-learning based model 520 may output the adjusted transformed depth information as the first depth information 513 with reduced noise.
- when the second portion does not exist, the machine-learning based model 520 may determine not to adjust the transformed depth information and may output the transformed depth information as the first depth information 513 with reduced noise.
- the machine-learning based model 520 may detect, among the first portion determined to be noise, a third portion that is not actual noise, using the visibility information 521.
- when the third portion is detected, the machine-learning based model 520 may determine the first portion, excluding the third portion, as the noise in the first depth information 511.
- when the third portion is not detected, the machine-learning based model 520 may determine the first portion as the noise in the first depth information 511.
- the machine-learning based model 520 may output the first depth information 513 with reduced noise by reducing the noise determined in the first depth information 511.
- a plurality of depth information 531, 532, 533 and 534 of the first object, generated using the pattern light reflected from the first object among the pattern light emitted from the plurality of second light sources, may be input to the machine-learning based model 530.
- the machine-learning based model 530 may generate first depth information 511, which is representative depth information of the first object, using the plurality of depth information 531, 532, 533, and 534.
- after generating the first depth information 511, the machine-learning based model 530 may output the first depth information 511 with reduced noise, as described with reference to FIG. 5A.
- a plurality of image data of the first object, generated using the pattern light reflected from the first object among the pattern light emitted from the plurality of second light sources, may be input to the machine-learning based model 530.
- the machine-learning based model 530 may generate first depth information 511, which is representative depth information of the first object, using the plurality of image data.
- after generating the first depth information 511, the machine-learning based model 530 may output the first depth information 511 with reduced noise, as described with reference to FIG. 5A.
- even if a relatively small number of image data is acquired for generating depth information, the printed circuit board inspection apparatus 100 can use the machine-learning based models 510, 520 and 530 to remove noise, such as unreceived signals or peak signals, from the depth information on the component.
- even if a relatively small number of image data is acquired and the information available for generating depth information is therefore insufficient, the printed circuit board inspection apparatus 100 can use the machine-learning based models 510, 520 and 530 to generate depth information for the component in which lost shapes are restored.
- the printed circuit board inspection apparatus 100 can restore the joint shape of the component while preserving the three-dimensional sharpness of the component's corners as much as possible and without additionally distorting the measured shape of any foreign material.
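- Putting the pieces together, the inference path can be sketched as follows; `denoise_model` is a hypothetical callable standing in for the trained network (the roles of models 510 and 520), and the median fusion stands in for model 530's generation of representative depth information.

```python
import numpy as np

def component_depth(depth_maps, visibility, denoise_model):
    """Fuse the few available per-light-source depth maps, then let the
    trained model reduce noise and restore lost shape using visibility."""
    fused = np.median(np.stack(depth_maps, axis=0), axis=0)
    return denoise_model(fused, visibility)
```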
- FIG. 6 is a conceptual diagram illustrating a learning method of a machine-learning based model according to various embodiments of the present disclosure.
- the machine-learning based model 620 may include a CNN, a GAN, or the like. Hereinafter, a method of training the machine-learning based model is described based on a GAN capable of performing image transformation using a U-Net.
- the machine-learning based model 620 may include a generator 621 and a discriminator 622.
- the generator 621 may receive the third depth information 611 of the second object, generated using the pattern light reflected from the second object among the pattern light irradiated from the plurality of second light sources, and the discriminator 622 may receive the fourth depth information 612 of the second object, generated using the pattern light reflected from the second object among the pattern light emitted from the plurality of third light sources.
- the generator 621 may generate converted third depth information by converting the third depth information 611 so that it converges on the fourth depth information 612.
- the discriminator 622 may attempt to distinguish the converted third depth information from the fourth depth information 612 by comparing the two.
- the discriminator 622 may transmit the result of distinguishing the converted third depth information from the fourth depth information 612 to the generator 621.
- the generator 621 may adjust its parameters for converting the third depth information 611 according to the result received from the discriminator 622. This process is repeated until the discriminator 622 can no longer distinguish the converted third depth information from the fourth depth information 612, so that the generator 621 is trained to convert the third depth information 611 to converge on the fourth depth information 612.
- if the quality of the third depth information 611 or the fourth depth information 612 is poor (for example, if, for at least one pixel, the depth information of one channel, such as a shadow area or a saturation area, has an SNR significantly lower than a preset reference value compared with the other channels), the corresponding data can be further refined to exclude it from the training data.
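- An adversarial update consistent with this description is sketched below in PyTorch, assuming `generator` is a U-Net-style image-to-image network and `discriminator` a binary classifier over depth maps (architectures elided); a pix2pix-style L1 term between the converted and fourth depth information could also be added, though the disclosure does not specify the losses.

```python
import torch
import torch.nn as nn

bce = nn.BCEWithLogitsLoss()

def gan_step(generator, discriminator, g_opt, d_opt, third, fourth):
    """One training step: the discriminator (622) learns to tell converted
    depth apart from reference depth; the generator (621) adjusts its
    conversion parameters according to the discriminator's verdict."""
    # Discriminator update: real = fourth depth info, fake = converted third depth.
    d_opt.zero_grad()
    converted = generator(third).detach()
    real_logits = discriminator(fourth)
    fake_logits = discriminator(converted)
    d_loss = (bce(real_logits, torch.ones_like(real_logits)) +
              bce(fake_logits, torch.zeros_like(fake_logits)))
    d_loss.backward()
    d_opt.step()

    # Generator update: convert so the discriminator can no longer distinguish.
    g_opt.zero_grad()
    fake_logits = discriminator(generator(third))
    g_loss = bce(fake_logits, torch.ones_like(fake_logits))
    g_loss.backward()
    g_opt.step()
    return d_loss.item(), g_loss.item()
```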
- FIG. 7 is a diagram for describing a method of obtaining depth information used for learning a machine-learning based model according to various embodiments of the present disclosure.
- the fourth depth information 412 described with reference to FIGS. 4A to 4C may be generated by the same printed circuit board inspection apparatus that generates the third depth information 411.
- for convenience of description, assume that the number of the plurality of third light sources 710 is four and the number of the plurality of fourth light sources is eight.
- the processor of the printed circuit board inspection apparatus 100 may control the plurality of third light sources 710 to irradiate the component mounted on the printed circuit board with pattern light, and may generate the third depth information 411 using the pattern light reflected from the component.
- the plurality of third light sources 710 may be moved in a clockwise or counterclockwise direction.
- the processor may control the plurality of third light sources 710, moved in the clockwise or counterclockwise direction, to irradiate the component mounted on the printed circuit board with pattern light, and may generate the fourth depth information 412 using the pattern light reflected from the component and the third depth information 411.
- the number of the plurality of third light sources 710 has been described as four, but this is only for the purpose of description and the present disclosure is not limited thereto.
- the third light source 710 may also be a single light source rather than a plurality.
- the fourth depth information 412 may be generated in the printed circuit board inspection apparatus including a plurality of third light sources having a number smaller than the number of the plurality of fourth light sources.
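- Numerically, moving four sources and capturing twice covers the same azimuths as eight fixed sources. The sketch below assumes a 45-degree rotation step, which fills the gaps evenly; the disclosure only states that the sources are moved clockwise or counterclockwise.

```python
import numpy as np

def emulated_azimuths(n_sources=4, rotation_deg=45.0):
    """Azimuth angles covered by two captures: the third light sources 710
    in their original positions, then after rotation -- together emulating
    eight fourth light sources."""
    base = np.arange(n_sources) * 360.0 / n_sources   # 0, 90, 180, 270
    rotated = (base + rotation_deg) % 360.0           # 45, 135, 225, 315
    return np.sort(np.concatenate([base, rotated]))
```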
- FIG. 8 illustrates an image of a component generated using depth information with reduced noise by a printed circuit board inspection apparatus according to various embodiments of the present disclosure.
- the printed circuit board inspection apparatus 100 can generate depth information on a component using the pattern light reflected from the component among the pattern light irradiated from the plurality of first light sources to the component mounted on the printed circuit board.
- the printed circuit board inspection apparatus 100 may generate a 3D image of the component by using the generated depth information.
- noise may occur in the process of multiple reflection of the light irradiated onto the printed circuit board or the processing of the received light by the image sensor. If the generated noise is not reduced, the quality of the 3D image of the component generated by the printed circuit board inspection apparatus 100 may be degraded, and an accurate inspection of the mounting state of the component may not be performed.
- the printed circuit board inspection apparatus 100 can use a machine-learning based model to reduce noise in the depth information on the component and generate a three-dimensional image of the component using the noise-reduced depth information. Since a three-dimensional image generated using depth information with reduced noise displays the shape of the component more accurately, a more accurate inspection of the mounting state of the component can be performed.
- when the printed circuit board inspection apparatus 100 generates a three-dimensional image of a component using depth information without reducing noise, the shape 810 of the connecting portion (e.g., solder paste) between the component and the printed circuit board may, due to the noise, appear in the three-dimensional image as an abnormal shape or as if holes were present.
- when the printed circuit board inspection apparatus 100 reduces the noise in the depth information of the component using the machine-learning based model and generates the three-dimensional image of the component using the noise-reduced depth information, the shape 811 of the connecting portion (e.g., solder paste) between the component and the printed circuit board can be displayed more accurately in the three-dimensional image.
- due to the noise, the boundary of the component may likewise be displayed as an abnormal shape in the three-dimensional image.
- when the printed circuit board inspection apparatus 100 reduces the noise in the depth information of the component using the machine-learning based model and generates the three-dimensional image of the component using the noise-reduced depth information, the shape 821 of the boundary of the component can be displayed more accurately in the three-dimensional image.
- when the printed circuit board inspection apparatus 100 generates a three-dimensional image of a component using the generated depth information as it is, the internal shape 830 of the component may, due to the noise, be displayed as an abnormal shape in the three-dimensional image, for example as if there were a hole inside the component.
- when the printed circuit board inspection apparatus 100 reduces the noise in the depth information of the component using the machine-learning based model and generates the three-dimensional image of the component using the noise-reduced depth information, the internal shape 831 of the component can be displayed more accurately in the three-dimensional image.
- since the printed circuit board inspection apparatus 100 displays the shape of the component more accurately through a three-dimensional image generated using depth information with reduced noise, a more accurate inspection of the mounting state of the component can be performed.
- Computer-readable recording media include all kinds of recording devices that store data that can be read by a computer system. Examples of computer-readable recording media may include ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage, and the like.
- the computer readable recording medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
- functional programs, codes, and code segments for implementing the above embodiments can be easily inferred by programmers in the art to which the present disclosure belongs.
Abstract
Description
Claims (20)
- A printed circuit board inspection apparatus for inspecting a mounting state of a component mounted on a printed circuit board, the apparatus comprising: a plurality of first light sources configured to irradiate the component with pattern light; an image sensor configured to receive pattern light reflected from the component; a memory storing a machine-learning based model that, when first depth information for a first object, generated using pattern light reflected from the first object among pattern light irradiated from a plurality of second light sources, is input, outputs the first depth information with reduced noise; and a processor, wherein the processor generates second depth information for the component using the pattern light reflected from the component and received by the image sensor, inputs the second depth information to the machine-learning based model, obtains the second depth information with reduced noise from the machine-learning based model, and inspects the mounting state of the component using the second depth information with reduced noise.
- The printed circuit board inspection apparatus of claim 1, wherein the machine-learning based model is trained to output noise-reduced third depth information using third depth information for a second object, generated using pattern light reflected from the second object among the pattern light irradiated from the plurality of second light sources, and fourth depth information for the second object, generated using pattern light reflected from the second object among pattern light irradiated from a plurality of third light sources, and, based on a result of the training, outputs the first depth information with reduced noise when the first depth information is input.
- The printed circuit board inspection apparatus of claim 2, wherein the number of the plurality of second light sources is equal to the number of the plurality of first light sources, and the number of the plurality of third light sources is greater than the number of the plurality of first light sources.
- The printed circuit board inspection apparatus of claim 1, wherein the machine-learning based model comprises a convolutional neural network (CNN) or a generative adversarial network (GAN).
- The printed circuit board inspection apparatus of claim 1, wherein the processor generates a three-dimensional image of the component using the second depth information with reduced noise, and inspects the mounting state of the component using the three-dimensional image of the component.
- The printed circuit board inspection apparatus of claim 1, wherein, when visibility information for the first object is further input, the machine-learning based model outputs the first depth information with reduced noise using the visibility information.
- The printed circuit board inspection apparatus of claim 6, wherein the machine-learning based model is trained to output the noise-reduced third depth information using third depth information for a second object, generated using pattern light reflected from the second object among the pattern light irradiated from the plurality of second light sources, visibility information for the second object, generated using the pattern light reflected from the second object among the pattern light irradiated from the plurality of second light sources, and fourth depth information for the second object, generated using pattern light reflected from the second object among pattern light irradiated from a plurality of third light sources, and, based on a result of the training, outputs the first depth information with reduced noise when the first depth information and visibility information for the component are input.
- The printed circuit board inspection apparatus of claim 6, wherein the processor generates visibility information for the component using the pattern light reflected from the component and received by the image sensor, and further inputs the visibility information for the component to the machine-learning based model.
- A printed circuit board inspection apparatus for inspecting a mounting state of a component mounted on a printed circuit board, the apparatus comprising: a plurality of first light sources configured to irradiate the component with pattern light; an image sensor configured to receive pattern light reflected from the component; a memory storing a machine-learning based model that, when a plurality of depth information for a first object, generated using pattern light reflected from the first object among pattern light irradiated from a plurality of second light sources, is input, generates first depth information for the first object based on the plurality of depth information for the first object and outputs the first depth information with reduced noise; and a processor, wherein the processor generates a plurality of depth information for the component using the pattern light reflected from the component and received by the image sensor, inputs the plurality of depth information for the component to the machine-learning based model, obtains second depth information with reduced noise from the machine-learning based model, the second depth information being generated by the machine-learning based model based on the plurality of depth information for the component, and inspects the mounting state of the component using the second depth information with reduced noise.
- The printed circuit board inspection apparatus of claim 9, wherein the machine-learning based model is trained to generate third depth information for a second object using a plurality of depth information for the second object, generated using pattern light reflected from the second object among the pattern light irradiated from the plurality of second light sources, and to output the noise-reduced third depth information using the third depth information and fourth depth information for the second object, generated using pattern light reflected from the second object among pattern light irradiated from a plurality of third light sources, and, based on a result of the training, generates the first depth information and outputs the first depth information with reduced noise when the plurality of depth information for the first object is input.
- The printed circuit board inspection apparatus of claim 10, wherein the number of the plurality of second light sources is equal to the number of the plurality of first light sources, and the number of the plurality of third light sources is greater than the number of the plurality of first light sources.
- A non-transitory computer-readable recording medium recording a program to be executed on a computer, wherein the program, when executed by a processor, causes the processor to perform the steps of: controlling a plurality of first light sources so that pattern light is irradiated onto a component mounted on a printed circuit board; generating first depth information for the component using the pattern light reflected from the component and received by an image sensor; inputting the first depth information to a machine-learning based model; obtaining the first depth information with reduced noise from the machine-learning based model; and inspecting the mounting state of the component using the first depth information with reduced noise, wherein the machine-learning based model, when first depth information for a first object, generated using pattern light reflected from the first object among pattern light irradiated from a plurality of second light sources, is input, outputs the first depth information with reduced noise.
- The computer-readable recording medium of claim 12, wherein the machine-learning based model is trained to output noise-reduced third depth information using third depth information for a second object, generated using pattern light reflected from the second object among the pattern light irradiated from the plurality of second light sources, and fourth depth information for the second object, generated using pattern light reflected from the second object among pattern light irradiated from a plurality of third light sources, and, based on a result of the training, outputs the first depth information with reduced noise when the first depth information is input.
- The computer-readable recording medium of claim 13, wherein the number of the plurality of second light sources is equal to the number of the plurality of first light sources, and the number of the plurality of third light sources is greater than the number of the plurality of first light sources.
- The computer-readable recording medium of claim 12, wherein the machine-learning based model comprises a convolutional neural network (CNN) or a generative adversarial network (GAN).
- The computer-readable recording medium of claim 12, wherein the executable instructions further cause the processor to perform the steps of: generating a three-dimensional image of the component using the first depth information with reduced noise; and inspecting the mounting state of the component using the three-dimensional image of the component.
- The computer-readable recording medium of claim 12, wherein, when visibility information for the first object is further input, the machine-learning based model outputs the first depth information with reduced noise using the visibility information.
- The computer-readable recording medium of claim 17, wherein the executable instructions further cause the processor to perform the steps of: generating visibility information for the component using the pattern light reflected from the component and received by the image sensor; and further inputting the visibility information for the component to the machine-learning based model.
- A method of inspecting a mounting state of a component in a printed circuit board inspection apparatus, the method comprising: controlling a plurality of first light sources so that pattern light is irradiated onto a component mounted on a printed circuit board; generating first depth information for the component using the pattern light reflected from the component and received by an image sensor; inputting the first depth information to a machine-learning based model; obtaining the first depth information with reduced noise from the machine-learning based model; and inspecting the mounting state of the component using the first depth information with reduced noise, wherein the machine-learning based model, when first depth information for a first object, generated using pattern light reflected from the first object among pattern light irradiated from a plurality of second light sources, is input, outputs the first depth information with reduced noise.
- The method of claim 19, wherein the machine-learning based model is trained to output noise-reduced third depth information using third depth information for a second object, generated using pattern light reflected from the second object among the pattern light irradiated from the plurality of second light sources, and fourth depth information for the second object, generated using pattern light reflected from the second object among pattern light irradiated from a plurality of third light sources, and, based on a result of the training, outputs the first depth information with reduced noise when the first depth information is input.
Priority Applications (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP19756511.2A EP3761772B1 (en) | 2018-02-26 | 2019-02-26 | Method for inspecting mounting state of component, printed circuit board inspection apparatus, and computer readable recording medium |
| KR1020237035082A KR102716521B1 (ko) | 2018-02-26 | 2019-02-26 | 인쇄 회로 기판 검사 장치 및 부품의 실장 상태를 검사하기 위한 방법 |
| KR1020207024788A KR102473547B1 (ko) | 2018-02-26 | 2019-02-26 | 부품의 실장 상태를 검사하기 위한 방법, 인쇄 회로 기판 검사 장치 및 컴퓨터 판독 가능한 기록매체 |
| KR1020227041178A KR20220163513A (ko) | 2018-02-26 | 2019-02-26 | 인쇄 회로 기판 검사 장치 및 부품의 실장 상태를 검사하기 위한 방법 |
| US16/976,006 US11328407B2 (en) | 2018-02-26 | 2019-02-26 | Method for inspecting mounting state of component, printed circuit board inspection apparatus, and computer readable recording medium |
| CN201980015619.0A CN111788883B (zh) | 2018-02-26 | 2019-02-26 | 部件贴装状态的检查方法、印刷电路板检查装置及计算机可读记录介质 |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201862635022P | 2018-02-26 | 2018-02-26 | |
| US62/635,022 | 2018-02-26 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019164382A1 true WO2019164382A1 (ko) | 2019-08-29 |
Family
ID=67687308
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2019/002330 Ceased WO2019164382A1 (ko) | 2018-02-26 | 2019-02-26 | 부품의 실장 상태를 검사하기 위한 방법, 인쇄 회로 기판 검사 장치 및 컴퓨터 판독 가능한 기록매체 |
| PCT/KR2019/002328 Ceased WO2019164381A1 (ko) | 2018-02-26 | 2019-02-26 | 부품의 실장 상태를 검사하기 위한 방법, 인쇄 회로 기판 검사 장치 및 컴퓨터 판독 가능한 기록매체 |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2019/002328 Ceased WO2019164381A1 (ko) | 2018-02-26 | 2019-02-26 | 부품의 실장 상태를 검사하기 위한 방법, 인쇄 회로 기판 검사 장치 및 컴퓨터 판독 가능한 기록매체 |
Country Status (5)
| Country | Link |
|---|---|
| US (2) | US11244436B2 (ko) |
| EP (2) | EP3761772B1 (ko) |
| KR (4) | KR102716521B1 (ko) |
| CN (2) | CN111788476B (ko) |
| WO (2) | WO2019164382A1 (ko) |
Families Citing this family (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11650164B2 (en) * | 2019-05-15 | 2023-05-16 | Getac Technology Corporation | Artificial neural network-based method for selecting surface type of object |
| CN114073175B (zh) * | 2019-07-04 | 2023-09-22 | 株式会社富士 | 元件安装系统 |
| CN111126433A (zh) * | 2019-11-19 | 2020-05-08 | 佛山市南海区广工大数控装备协同创新研究院 | 一种工厂pcb板缺陷检测中正负样本数据平衡方法 |
| US12063745B2 (en) * | 2020-05-05 | 2024-08-13 | Integrated Dynamics Engineering Gmbh | Method for processing substrates, in particular wafers, masks or flat panel displays, with a semi-conductor industry machine |
| KR102459695B1 (ko) | 2020-11-03 | 2022-10-28 | 주식회사 고영테크놀러지 | 실장 정보를 결정하기 위한 장치, 방법 및 명령을 기록한 기록 매체 |
| CN116507907A (zh) * | 2020-11-30 | 2023-07-28 | 柯尼卡美能达株式会社 | 分析装置、检查系统及学习装置 |
| CN113032919B (zh) | 2021-03-12 | 2022-03-04 | 奥特斯科技(重庆)有限公司 | 部件承载件制造方法、处理系统、计算机程序和系统架构 |
| DE102021114568A1 (de) | 2021-06-07 | 2022-12-08 | Göpel electronic GmbH | Inspektionsvorrichtung zum prüfen von elektronischen bauteilen |
| JP7542159B2 (ja) * | 2021-09-07 | 2024-08-29 | ヤマハ発動機株式会社 | 部品実装システム |
| CN114018934B (zh) * | 2021-11-03 | 2023-11-03 | 四川启睿克科技有限公司 | 一种用于弧形金属表面缺陷检测的成像系统 |
| KR102455733B1 (ko) * | 2022-02-11 | 2022-10-18 | 주식회사 지오스테크놀러지 | 트랜지스터 아웃라인 패키지 제조 장치 및 이의 제조 방법 |
| KR102770142B1 (ko) * | 2022-02-24 | 2025-02-20 | (주)아이피에스오토 | Ai 비전인식을 이용한 볼트 검사장치 |
| WO2023159298A1 (en) * | 2022-02-28 | 2023-08-31 | National Research Council Of Canada | Deep learning based prediction of fabrication-process-induced structural variations in nanophotonic devices |
| JP2023177552A (ja) * | 2022-06-02 | 2023-12-14 | キヤノン株式会社 | 記録装置、制御方法、およびプログラム |
| CN119183518A (zh) * | 2022-08-31 | 2024-12-24 | 株式会社Lg新能源 | 基于学习模型的尺寸测量设备和方法 |
| JP2024135247A (ja) * | 2023-03-22 | 2024-10-04 | 株式会社東芝 | 光学検査方法及びプログラム、並びにそれを用いた光学検査装置 |
| CN116007526B (zh) * | 2023-03-27 | 2023-06-23 | 西安航天动力研究所 | 一种膜片刻痕深度自动测量系统及测量方法 |
| DE102023113821A1 (de) * | 2023-05-25 | 2024-11-28 | ibea Ingenieurbüro für Elektronik und Automatisation GmbH | Inspektionsvorrichtung für Baustoffe und Verfahren zur Inspektion von Baustoffen |
| KR102801231B1 (ko) * | 2024-08-27 | 2025-04-30 | 주식회사 아이브 | 광을 활용하여 제품의 결함 검사를 수행하는 방법 |
Family Cites Families (30)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7352892B2 (en) * | 2003-03-20 | 2008-04-01 | Micron Technology, Inc. | System and method for shape reconstruction from optical images |
| KR100834113B1 (ko) | 2006-11-10 | 2008-06-02 | 아주하이텍(주) | 자동 광학 검사 시스템 |
| KR101190122B1 (ko) | 2008-10-13 | 2012-10-11 | 주식회사 고영테크놀러지 | 다중파장을 이용한 3차원형상 측정장치 및 측정방법 |
| DE102010064593A1 (de) * | 2009-05-21 | 2015-07-30 | Koh Young Technology Inc. | Formmessgerät und -verfahren |
| DE202010018585U1 (de) * | 2009-05-27 | 2017-11-28 | Koh Young Technology Inc. | Vorrichtung zur Messung einer dreidimensionalen Form |
| DE102010064635B4 (de) | 2009-07-03 | 2024-03-14 | Koh Young Technology Inc. | Verfahren zum Untersuchen eines Messobjektes |
| US9091725B2 (en) | 2009-07-03 | 2015-07-28 | Koh Young Technology Inc. | Board inspection apparatus and method |
| KR101078781B1 (ko) | 2010-02-01 | 2011-11-01 | 주식회사 고영테크놀러지 | 3차원 형상 검사방법 |
| US8855403B2 (en) * | 2010-04-16 | 2014-10-07 | Koh Young Technology Inc. | Method of discriminating between an object region and a ground region and method of measuring three dimensional shape by using the same |
| US8755043B2 (en) | 2010-11-19 | 2014-06-17 | Koh Young Technology Inc. | Method of inspecting a substrate |
| US9819879B2 (en) * | 2011-07-12 | 2017-11-14 | Samsung Electronics Co., Ltd. | Image filtering apparatus and method based on noise prediction using infrared ray (IR) intensity |
| US8971612B2 (en) | 2011-12-15 | 2015-03-03 | Microsoft Corporation | Learning image processing tasks from scene reconstructions |
| KR101215083B1 (ko) * | 2011-12-27 | 2012-12-24 | 경북대학교 산학협력단 | 기판 검사장치의 높이정보 생성 방법 |
| JP5863547B2 (ja) | 2012-04-20 | 2016-02-16 | ヤマハ発動機株式会社 | プリント基板の検査装置 |
| JP5948496B2 (ja) | 2012-05-22 | 2016-07-06 | コー・ヤング・テクノロジー・インコーポレーテッド | 3次元形状測定装置の高さ測定方法 |
| US20140198185A1 (en) * | 2013-01-17 | 2014-07-17 | Cyberoptics Corporation | Multi-camera sensor for three-dimensional imaging of a circuit board |
| JP2014157086A (ja) | 2013-02-15 | 2014-08-28 | Dainippon Screen Mfg Co Ltd | パターン検査装置 |
| JP2016518747A (ja) * | 2013-03-15 | 2016-06-23 | イェール ユニバーシティーYale University | センサー依存性ノイズを有する画像化データを処理するための技術 |
| KR20150022158A (ko) | 2013-08-22 | 2015-03-04 | (주)토탈소프트뱅크 | 기계제도 학습장치 및 학습방법 |
| KR20150023205A (ko) * | 2013-08-23 | 2015-03-05 | 주식회사 고영테크놀러지 | 기판 검사방법 및 이를 이용한 기판 검사시스템 |
| JP6296499B2 (ja) | 2014-08-11 | 2018-03-20 | 株式会社 東京ウエルズ | 透明基板の外観検査装置および外観検査方法 |
| CN107003255B (zh) | 2014-12-08 | 2020-06-30 | 株式会社高迎科技 | 在基板上形成的部件的端子检查方法及基板检查装置 |
| KR101622628B1 (ko) * | 2014-12-16 | 2016-05-20 | 주식회사 고영테크놀러지 | 부품이 실장된 기판 검사방법 및 검사장치 |
| US20160321523A1 (en) * | 2015-04-30 | 2016-11-03 | The Regents Of The University Of California | Using machine learning to filter monte carlo noise from images |
| TWI737659B (zh) | 2015-12-22 | 2021-09-01 | 以色列商應用材料以色列公司 | 半導體試樣的基於深度學習之檢查的方法及其系統 |
| CN105891215B (zh) * | 2016-03-31 | 2019-01-29 | 浙江工业大学 | 基于卷积神经网络的焊接视觉检测方法及装置 |
| CN105913427B (zh) * | 2016-04-12 | 2017-05-10 | 福州大学 | 一种基于机器学习的噪声图像显著性检测方法 |
| KR101688458B1 (ko) * | 2016-04-27 | 2016-12-23 | 디아이티 주식회사 | 깊은 신경망 학습 방법을 이용한 제조품용 영상 검사 장치 및 이를 이용한 제조품용 영상 검사 방법 |
| US10346740B2 (en) * | 2016-06-01 | 2019-07-09 | Kla-Tencor Corp. | Systems and methods incorporating a neural network and a forward physical model for semiconductor applications |
| WO2018188466A1 (en) | 2017-04-12 | 2018-10-18 | Bio-Medical Engineering (HK) Limited | Automated steering systems and methods for a robotic endoscope |
- 2019
- 2019-02-26 KR KR1020237035082A patent/KR102716521B1/ko active Active
- 2019-02-26 WO PCT/KR2019/002330 patent/WO2019164382A1/ko not_active Ceased
- 2019-02-26 WO PCT/KR2019/002328 patent/WO2019164381A1/ko not_active Ceased
- 2019-02-26 CN CN201980015610.XA patent/CN111788476B/zh active Active
- 2019-02-26 EP EP19756511.2A patent/EP3761772B1/en active Active
- 2019-02-26 CN CN201980015619.0A patent/CN111788883B/zh active Active
- 2019-02-26 KR KR1020207024788A patent/KR102473547B1/ko active Active
- 2019-02-26 EP EP19756876.9A patent/EP3761013A4/en active Pending
- 2019-02-26 US US16/976,008 patent/US11244436B2/en active Active
- 2019-02-26 US US16/976,006 patent/US11328407B2/en active Active
- 2019-02-26 KR KR1020207024783A patent/KR102427381B1/ko active Active
- 2019-02-26 KR KR1020227041178A patent/KR20220163513A/ko not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0989797A (ja) * | 1995-09-25 | 1997-04-04 | Omron Corp | 実装基板検査装置 |
| KR20020046636A (ko) * | 2000-12-15 | 2002-06-21 | 송재인 | 인쇄회로기판의 납땜검사 시스템 및 방법 |
| KR20110088967A (ko) * | 2010-01-29 | 2011-08-04 | 주식회사 고영테크놀러지 | 소자의 불량 검사방법 |
| KR101311215B1 (ko) * | 2010-11-19 | 2013-09-25 | 경북대학교 산학협력단 | 기판 검사방법 |
| KR101684244B1 (ko) * | 2011-03-02 | 2016-12-09 | 주식회사 고영테크놀러지 | 기판 검사방법 |
Also Published As
| Publication number | Publication date |
|---|---|
| US20210042910A1 (en) | 2021-02-11 |
| EP3761013A1 (en) | 2021-01-06 |
| KR102716521B1 (ko) | 2024-10-16 |
| EP3761772A1 (en) | 2021-01-06 |
| KR20230150886A (ko) | 2023-10-31 |
| US11328407B2 (en) | 2022-05-10 |
| KR102473547B1 (ko) | 2022-12-05 |
| KR20200108905A (ko) | 2020-09-21 |
| EP3761013A4 (en) | 2021-05-12 |
| KR20200108483A (ko) | 2020-09-18 |
| CN111788476A (zh) | 2020-10-16 |
| CN111788476B (zh) | 2023-07-07 |
| US11244436B2 (en) | 2022-02-08 |
| EP3761772A4 (en) | 2021-04-28 |
| WO2019164381A1 (ko) | 2019-08-29 |
| EP3761772B1 (en) | 2025-07-23 |
| KR20220163513A (ko) | 2022-12-09 |
| EP3761772C0 (en) | 2025-07-23 |
| CN111788883B (zh) | 2021-11-05 |
| KR102427381B1 (ko) | 2022-08-02 |
| CN111788883A (zh) | 2020-10-16 |
| US20210049753A1 (en) | 2021-02-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2019164382A1 (ko) | 부품의 실장 상태를 검사하기 위한 방법, 인쇄 회로 기판 검사 장치 및 컴퓨터 판독 가능한 기록매체 | |
| TWI761880B (zh) | 基板缺陷檢查裝置、方法、電腦可讀記錄介質及電腦程式產品 | |
| WO2018147712A1 (ko) | 인쇄 회로 기판에 실장된 부품을 검사하는 장치, 그 동작 방법 및 컴퓨터 판독 가능한 기록 매체 | |
| KR20120083854A (ko) | 조정 장치, 레이저 가공 장치 및 조정 방법 | |
| US11971367B2 (en) | Inspection device and inspection method | |
| WO2017183923A1 (ko) | 물품의 외관 검사장치 및 이를 이용한 물품의 외관 검사방법 | |
| CN109581824B (zh) | 一种直写式光刻机光均匀性标定方法及系统 | |
| JP2020139765A (ja) | 検査装置、検査システム、検査方法、およびプログラム | |
| JP2009014696A (ja) | 照度可変照明部及び撮像部の独立可動における外観検査装置 | |
| JP2007192652A (ja) | パターン検査装置、パターン検査方法、及び検査対象試料 | |
| JP6184746B2 (ja) | 欠陥検出装置、欠陥修正装置および欠陥検出方法 | |
| CN102794771A (zh) | 机械手臂校正系统及方法 | |
| JP2006275780A (ja) | パターン検査方法 | |
| US10269105B2 (en) | Mask inspection device and method thereof | |
| JP2023109897A (ja) | 学習装置、検査装置、アライメント装置および学習方法 | |
| WO2019132599A1 (ko) | 기판에 삽입된 복수의 핀의 삽입 상태를 검사하는 방법 및 기판 검사 장치 | |
| JP4526348B2 (ja) | シェーディング補正装置及び方法 | |
| WO2019132601A1 (ko) | 기판에 삽입된 커넥터에 포함된 복수의 핀의 삽입 상태를 검사하는 방법 및 기판 검사 장치 | |
| CN120129829A (zh) | 基片检查方法、基片检查装置和基片检查程序 | |
| KR20240035661A (ko) | 멀티어레이 센서를 이용한 레이저 가공 모니터링 방법 및 장치 | |
| WO2025014131A1 (ko) | 코팅 두께를 검사하는 장치, 방법 및 명령을 기록한 기록 매체 | |
| JP4211409B2 (ja) | 基板検査装置および検査方法 | |
| CN111091779A (zh) | 显示屏调试方法、装置、设备及计算机可读存储介质 | |
| JPH04225173A (ja) | 回路基板検査装置 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19756511 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 20207024788 Country of ref document: KR Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2019756511 Country of ref document: EP Effective date: 20200928 |
|
| WWG | Wipo information: grant in national office |
Ref document number: 2019756511 Country of ref document: EP |