GB2454857A - Method for processing an intensity image of a microscope - Google Patents

Method for processing an intensity image of a microscope

Info

Publication number
GB2454857A
Authority
GB
United Kingdom
Prior art keywords
intensity image
pixel
imin
imax
intensities
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0905233A
Other versions
GB2454857B (en)
GB0905233D0 (en)
Inventor
Maximilian Staudacher
Fred Hamprecht
Linus Goerlitz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of GB0905233D0 publication Critical patent/GB0905233D0/en
Publication of GB2454857A publication Critical patent/GB2454857A/en
Application granted granted Critical
Publication of GB2454857B publication Critical patent/GB2454857B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365Control or image processing arrangements for digital or video microscopes
    • G06K9/38
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration using local operators
    • G06T5/30Erosion or dilatation, e.g. thinning
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/40Image enhancement or restoration using histogram techniques
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/28Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/69Microscopic objects, e.g. biological cells or cellular parts
    • G06V20/695Preprocessing, e.g. image segmentation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20036Morphological image processing
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30164Workpiece; Machine component

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for processing an intensity image (1) of a microscope, particularly an intensity image (1) created by means of a light microscope or a scanning electron microscope of a carrier element provided with objects (3, 4). The intensity of each pixel (2) of the intensity image (1) is determined. Additionally, a plurality of minima (Imin(S1), Imin(S2), Imin(S3), Imin(S4)) and maxima (Imax(S1), Imax(S2), Imax(S3), Imax(S4)) of the intensities of a surrounding area of each pixel (2) of the intensity image (1) are determined, the area being defined by structural elements (S1 to S4) having predetermined areas and shapes. The determined intensities of the pixels (2) of the intensity image (1) and the determined minima (Imin(S1) to Imin(S4)) and maxima (Imax(S1) to Imax(S4)) of the intensities of the surrounding area of each pixel (2) are each combined into multi-dimensional vectors.

Description

Title
Method for Processing a Microscope Intensity Image
Prior Art
The invention relates to a method for processing a microscope intensity image, in particular an intensity image compiled by means of a light microscope or a scanning electron microscope, of a carrier element loaded with objects.
In practice, after their production and before their intended use, products or components are freed from dirt particles i.e. cleaned by suitable cleaning methods. Since the cleanliness requirements are continually increasing, the efficiency of the cleaning processes also needs to be constantly developed further. This, however, also has the effect of increasing the demands on the analysis methods respectively used to assess the quality of the cleaning effect.
Conventionally, optoelectronic image analysis systems are essentially used in the scope of quality assurance for cleanliness analysis. In this way particles in a size range of between 30 µm and 1500 µm, which have been removed from a product by means of a cleaning method and placed on a filter, are measured and subsequently divided into size classes.
If however cleanliness analyses also need to be carried out in the fabrication sector with particularly stringent requirements for the product cleanliness, in which dirt particles in a size range of less than 30 µm also compromise the function of a product, it is essentially only possible to carry out analysis by using electron microscopes, with which even particle sizes of between 2 and 5 nm can be detected. An analysis performed by means of an electron microscope furthermore has the advantage, compared with analyses carried out by means of light microscopy, that the chemical composition of the particles can also be analysed and thus the origin of the dirt particles can be found more readily. Possible contamination sources can therefore be found and eliminated more easily.
In general, one of the main objects of automatic particle analysis with a microscope is to identify, with the aid of an intensity image compiled by means of a microscope, particles or objects arranged on a carrier element against the background formed by the carrier element, so that they can then be sent for further processing.
In a multiplicity of particle analysis systems known in practice, the objects are separated from the background by means of threshold value observation of the intensities of an intensity image, particles consisting of aluminium phosphate scarcely being detectable with the aid of scanning electron microscope intensity images without increasing the misclassification ratio of the background.
This is due to the fact that when a grey scale value is used to segment the images, objects whose grey values lie in the dynamic range of the grey values of the background of the carrier element are not detectable. In addition, objects are frequently obscured so that detected objects are sometimes divided into a plurality of smaller objects by known particle analysis systems, although this is undesirable in cleanliness analysis.
Disclosure of the Invention
In the method according to the invention for processing a microscope intensity image, in particular an intensity image compiled by means of a light microscope or a scanning electron microscope, of a carrier element loaded with objects, the intensity of each pixel of the intensity image is found. Furthermore, a plurality of minima and maxima of the intensities of an environment, defined by structural elements with predeterminable areas and shapes, of each pixel of the intensity image are respectively determined. The intensities found for the pixels of the intensity image as well as the minima and maxima determined for the intensities of the environment of each pixel are respectively combined to form multidimensional vectors.
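In terms of standard grey-scale morphology, this per-pixel feature construction can be sketched as follows. This is a minimal illustration using SciPy; the names `disc_footprint` and `pixel_feature_vectors` and the choice of radii are illustrative assumptions, not taken from the patent:

```python
import numpy as np
from scipy.ndimage import grey_erosion, grey_dilation

def disc_footprint(radius):
    """Boolean disc of the given pixel radius, used as a structural element."""
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    return x * x + y * y <= radius * radius

def pixel_feature_vectors(image, radii=(1, 2, 3, 4)):
    """For each pixel, stack its raw intensity with the minimum (erosion)
    and maximum (dilation) of its environment for every structural element."""
    layers = [image.astype(float)]
    for r in radii:
        fp = disc_footprint(r)
        layers.append(grey_erosion(image, footprint=fp))   # local minimum
        layers.append(grey_dilation(image, footprint=fp))  # local maximum
    # shape: (height, width, 2 * len(radii) + 1) — one vector per pixel
    return np.stack(layers, axis=-1)
```

Each position of the resulting array then holds one multidimensional vector combining the raw intensity with the local minima and maxima over all structural elements.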
This means that a weak description of the environment of a pixel is used in the method according to the invention in order to separate the background formed by the carrier element from an object, or a particle located on the carrier element, so that particles can be detected significantly better even if they scarcely stand out or do not stand out at all from the background in respect of their brightness.
Further advantages and advantageous configurations of the subject-matter according to the invention may be found in the patent claims, the description and the drawing.
An exemplary embodiment of the invention is represented in the drawing and will be explained in more detail in the following description.
Brief Description of the Drawings
Figure 1 shows a graphical representation of the procedure according to the invention;
Figure 2 shows an intensity image, compiled by means of a microscope, of a carrier element loaded with objects;
Figure 3 shows the intensity image represented in Figure 2, processed by means of dilation with a structural element having a radius of 5 pixels for each pixel;
Figure 4 shows the intensity image according to Figure 2, processed by erosion with a structural element having a radius of 5 pixels;
Figure 5 shows another intensity image, compiled by means of a microscope, of a carrier element loaded with objects;
Figure 6 shows the region denoted X in Figure 5, in an enlarged detail representation in a form processed by the method according to the invention;
Figure 7 shows a highly schematised representation of a microscope intensity image processed by a known particle analysis system; and
Figure 8 shows a representation, corresponding to Figure 7, of an intensity image which is processed according to the invention.
Embodiments of the Invention
Figure 1 shows a graphical representation of the method according to the invention for processing an intensity image 1, compiled by means of a light microscope or a scanning electron microscope, of a carrier element loaded with objects.
First, the intensity I of a pixel 2 of the intensity image 1 is found by means of a suitable arrangement. The environment of the pixel 2 is subsequently eroded and dilated with structural elements S1 to S4, which are configured here as concentric circular discs with increasing diameters around the pixel 2. During erosion of the environment of the pixel 2, which is respectively defined by the structural elements S1 to S4, a minimum Imin(S1), Imin(S2), Imin(S3) or Imin(S4) of the intensities of the observed environment is respectively found. In addition, a maximum Imax(S1), Imax(S2), Imax(S3) or Imax(S4) of the intensities is respectively determined within each structural element S1 to S4 during dilation of the environment of the pixel 2. The images processed during the various erosion steps and dilation steps are placed above one another together with the intensity image 1 to form a stack.
Figure 2 shows an unprocessed intensity image 1A compiled by means of a microscope, Figure 3 being a representation of the intensity image 1A after dilation with a circular disc having a radius of 5 pixels.
Figure 4, on the other hand, represents the intensity image 1A after erosion, the erosion being carried out with a structural element configured as a circular disc having a radius of 5 pixels. The advantage of stacking the intensity images processed by means of erosion and dilation on the starting image, i.e. the intensity image 1A according to Figure 2, is that for each pixel of the starting image a number of intensity values is now available, corresponding to the number of intensity images that have been generated.
The intensity I of the pixel 2 of the underlying intensity image 1 according to Figure 1, as well as the intensity minima Imin(S1) to Imin(S4) and intensity maxima Imax(S1) to Imax(S4) found inside the structural elements S1 to S4 during the successively repeated dilation procedures and erosion procedures, are combined to form a multidimensional vector, the vector having a dimension of 9 in the case of four structural elements and being used as an attribute for the pixel 2.
The described procedure is carried out for each pixel of the intensity image 1A according to Figure 2, until a high-dimensional vector is compiled for each pixel, that is to say in the present case a 9-dimensional vector.
In contrast to this, vectors with lower or higher dimensions may be generated in other variants of the method according to the invention, the dimension being dependent on the number of structural elements used. The number of structural elements used depends on the application in question or on the objects or particles located on a carrier element, the minimum number of structural elements to be used being two, so that at least a 5-dimensional vector is generated for each pixel of an intensity image. Tests have revealed that five structural elements are particularly advantageous for the detection of aluminium phosphate particles, which can then be detected with high probability by the procedure described above.
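The relation between the number of structural elements and the resulting vector dimension (one raw intensity plus one minimum and one maximum per structural element) can be written down directly; the function name is purely illustrative:

```python
def feature_dimension(num_structural_elements):
    """One raw intensity per pixel, plus a minimum and a maximum of the
    environment intensities for each structural element."""
    return 2 * num_structural_elements + 1
```

Two structural elements thus give the minimum 5-dimensional vector, four give the 9-dimensional vector of the exemplary embodiment, and five give an 11-dimensional vector.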
Furthermore, in other variants of the method according to the invention, different structural elements whose geometrical shapes vary are used successively to determine the intensity minima and the intensity maxima, in which case the areas of the structural elements are not modified or are likewise varied, depending on the application in question. This means that ellipses, polygons or the like, with a constant or varied area, may be provided in order to determine a multidimensional vector.
After a minimum and a maximum of the intensities, representing grey levels in the present case, have been determined in the environment defined by the respective structural element by means of dilation and erosion for each pixel of the intensity image 1 according to Figure 2 and for each structural element S1 to S4, and multidimensional vectors have been formed therefrom, the multidimensional vectors are assigned to predefined classes by means of a reference system, each pixel of the intensity image 1 being classified as an object or particle, or as belonging to the background or carrier element, by means of the class categorization.
The reference system is configured in the present case as a random forest of decision trees, which is compiled during a supervised training procedure. During the training procedure, an intensity image compiled by means of a microscope is evaluated and a machine classification is generated by a formal method, so as to allow the random forest to decide in new situations on the basis of learned structures.
The random forest, consisting of an ensemble of classifiers or decision trees, searches for a locally constant partition hyperplane in attribute space with the aid of each decision tree, in order to place the attributes for pixels of objects and of the background on different sides of this partition hyperplane. The reference system divides the vector space into regions, which represent the different classes.
During evaluation of the generated multidimensional vectors of the pixels of the intensity image 1, each of the multidimensional vectors is compared individually with the reference system. A check is made as to the region in which the currently observed multidimensional vector of a pixel is arranged. The pixel, characterised by a classified multidimensional vector, of the intensity image 1 is subsequently classified into the class assigned to the vector.
Each of the multidimensional vectors, comprising an intensity found for a pixel as well as the minima and maxima determined for the intensities of the environment of a pixel, is compared successively or simultaneously with all the decision trees of the random forest. Through the comparison, a class is assigned to each vector by a decision tree. Finally, each of the multidimensional vectors of the pixels is assigned the class which the majority of decision trees have assigned to it. This means that a pixel is assigned to the class for which the majority of decision trees or classifiers have voted.
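The majority vote over the ensemble can be sketched as follows; the three threshold rules merely stand in for trained decision trees and are illustrative assumptions, not part of the patent:

```python
from collections import Counter

def majority_vote(vector, classifiers):
    """Each classifier (e.g. one decision tree of the random forest) votes
    for a class; the pixel receives the class with the most votes."""
    votes = [classify(vector) for classify in classifiers]
    return Counter(votes).most_common(1)[0][0]

# Toy ensemble of three "trees": each votes based on a simple threshold
# rule on one component of the feature vector (illustrative only).
trees = [
    lambda v: "particle" if v[0] > 100 else "background",        # raw intensity
    lambda v: "particle" if v[2] - v[1] > 30 else "background",  # max-min spread
    lambda v: "particle" if v[0] > 120 else "background",
]
```

A pixel whose feature vector wins at least two of the three votes is classified accordingly, mirroring the majority decision of the random forest described above.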
As an alternative to configuring the reference system as a random forest, the reference system may also be configured as a self-teaching neural network or as another suitable classification system, by means of which the multidimensional vectors of the pixels of an intensity image can respectively be assigned to classes characterising particles or classes characterising background.
The improved quality of the method described above, compared with conventional particle analysis systems, is graphically illustrated in Figure 5 and Figure 6. Figure 5 shows an intensity image, compiled by means of a microscope, of a filter screen loaded with particles, the particles differing both in size and orientation and in intensity. For example, a particle identified in detail by the reference 3 has a higher intensity, or brighter grey value, than a particle 4 consisting of aluminium phosphate, which is arranged in a region X. Owing to its brighter grey level compared with the background, the particle 3 is easier to detect than the particle 4, whose grey level stands out only slightly from the grey level of the background formed by the filter screen.
If the intensity image represented in Figure 5 is processed with the method described above by means of repeated dilation and erosion within structural elements having different areas around a pixel, successively for all the pixels of the intensity image, and if the images thereby generated are stacked above one another, then the representation shown in Figure 6 is obtained, which is an enlarged representation of the region X of Figure 5 and in which the particle 4 stands out more strongly from the background and is therefore easier to identify.
Figure 7 shows a highly schematised enlarged representation of the region X of Figure 5, which is obtained by evaluating the intensity image according to Figure 5 using a conventional particle analysis system, and in which the particle 4 is not found.
Figure 8 shows an enlarged representation of the region X of Figure 5, which is obtained by applying the method according to the invention. In this representation, the particle 4 is represented clearly contoured and stands out with sufficient quality from the background. By means of this representation, the particle 4 is readily detectable in terms of shape and size, allowing a cleaning process to be assessed as is done in cleanliness analysis.
With the method described above, it is possible to achieve an improved detection ratio of the particles compared with conventional particle analysis systems, even for particles whose grey values scarcely differ from those of the background.
This means that the functionality of products can potentially be assessed better for the applications respectively studied.
Furthermore, the detection of objects or particles by means of the machine classification described above is independent both of the user and of the histogram form, so that the detection ratio of different evaluations or classifications is reproducible. This directly offers the advantage that procedures based on the method according to the invention are considerably more objective.
In principle, the method according to the invention is suitable for all fields of application in which objects that differ from the background in terms of texture need to be detected in intensity images, the finding of objects in the proposed algorithm being user- and histogram-independent, in contrast to existing detection algorithms.

Claims (12)

  1. Method for processing a microscope intensity image (1), in particular an intensity image (1) compiled by means of a light microscope or a scanning electron microscope, of a carrier element loaded with objects (3, 4), characterised in that the intensity of each pixel (2) of the intensity image (1) is found and a plurality of minima (Imin(S1) to Imin(S4)) and maxima (Imax(S1) to Imax(S4)) of the intensities of an environment, defined by structural elements (S1 to S4) with predeterminable areas and shapes, of each pixel (2) of the intensity image (1) are respectively determined, the intensities found for the pixels (2) of the intensity image (1) as well as the minima (Imin(S1) to Imin(S4)) and maxima (Imax(S1) to Imax(S4)) determined for the intensities of the environment of each pixel (2) respectively being combined to form multidimensional vectors.
  2. Method according to Claim 1, characterised in that the environment of a pixel (2) is respectively observed through at least two structural elements (S1 to S4) with different areas and/or shapes.
  3. Method according to Claim 1 or 2, characterised in that for each structural element (S1 to S4), a minimum (Imin(S1) to Imin(S4)) and a maximum (Imax(S1) to Imax(S4)) of the intensities of the environment defined by the respective structural element (S1 to S4) are respectively determined.
  4. Method according to Claim 3, characterised in that the minima (Imin(S1) to Imin(S4)) and maxima (Imax(S1) to Imax(S4)) of the intensities of the observed environment of each pixel (2) of the intensity image (1) are found by dilation and erosion.
  5. Method according to one of Claims 1 to 4, characterised in that the intensities are grey levels.
  6. Method according to one of Claims 1 to 5, characterised in that the structural elements (S1 to S4) are configured as circles, the centre of which is in each case the observed pixel (2) of the intensity image (1) to be processed.
  7. Method according to one of Claims 1 to 6, characterised in that each multidimensional vector can respectively be assigned by means of a reference system to a class predefined by the reference system.
  8. Method according to Claim 7, characterised in that the reference system is compiled by means of a supervised training procedure, during which an intensity image is evaluated.
  9. Method according to Claim 7 or 8, characterised in that each of the multidimensional vectors is compared individually with the reference system, and a check is respectively made as to the region in which a multidimensional vector is arranged, the pixel (2), characterised by a classified multidimensional vector, of the intensity image (1) subsequently being classified into the class assigned to the vector.
  10. Method according to one of Claims 7 to 9, characterised in that the reference system is configured as a self-teaching neural network.
  11. Method according to one of Claims 7 to 9, characterised in that the reference system is configured as a random forest of decision trees, the decision trees being built from multidimensional vectors which are generated during the supervised training procedure with the aid of an intensity image.
  12. Method according to Claim 11, characterised in that each of the multidimensional vectors, comprising an intensity found for a pixel (2) as well as the minima (Imin(S1) to Imin(S4)) and maxima (Imax(S1) to Imax(S4)) determined for the intensities of the environment of a pixel (2), is compared with all the decision trees of the random forest, and a class is assigned to each vector by every decision tree, each of the multidimensional vectors finally being assigned to the class which the majority of decision trees have assigned to it.
GB0905233A 2006-09-18 2009-03-26 Method for processing a microscope intensity image Expired - Fee Related GB2454857B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102006043684A DE102006043684A1 (en) 2006-09-18 2006-09-18 Method for processing an intensity image of a microscope
PCT/EP2007/059311 WO2008034721A1 (en) 2006-09-18 2007-09-06 Method for processing an intensity image of a microscope

Publications (3)

Publication Number Publication Date
GB0905233D0 (en) 2009-05-13
GB2454857A (en) 2009-05-27
GB2454857B (en) 2010-06-09

Family

ID=38669507

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0905233A Expired - Fee Related GB2454857B (en) 2006-09-18 2009-03-26 Method for processing a microscope intensity image

Country Status (3)

Country Link
DE (1) DE102006043684A1 (en)
GB (1) GB2454857B (en)
WO (1) WO2008034721A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11615274B2 (en) 2020-03-23 2023-03-28 Robert Bosch Gmbh Plausibility check of the output of neural classifier networks based on additional information about features

Families Citing this family (5)

Publication number Priority date Publication date Assignee Title
DE102007054392A1 (en) 2007-11-14 2009-05-20 Robert Bosch Gmbh Self-adaptivity of scanning electron microscopes
DE102018205561A1 (en) * 2017-08-18 2019-02-21 Robert Bosch Gmbh Device for classifying signals
DE102020203707A1 (en) 2020-03-23 2021-09-23 Robert Bosch Gesellschaft mit beschränkter Haftung Plausibility check of the output of neural classifier networks
DE102020204758A1 (en) 2020-04-15 2021-10-21 Robert Bosch Gesellschaft mit beschränkter Haftung Fast symmetry detection for the classification of objects from digital images
DE102020208008A1 (en) 2020-06-29 2021-12-30 Robert Bosch Gesellschaft mit beschränkter Haftung Image classification and related training for security-related classification tasks

Citations (6)

Publication number Priority date Publication date Assignee Title
WO1992013308A1 (en) * 1991-01-29 1992-08-06 Neuromedical Systems, Inc. Morphological classification system and method
WO1996038809A1 (en) * 1995-05-31 1996-12-05 Neopath, Inc. Method and apparatus for assessing slide and specimen preparation quality
US6456899B1 (en) * 1999-12-07 2002-09-24 Ut-Battelle, Llc Context-based automated defect classification system using multiple morphological masks
US20030147556A1 (en) * 2002-01-18 2003-08-07 Madhusudhana Gargesha Face classification using curvature-based multi-scale morphology
US20040126008A1 (en) * 2000-04-24 2004-07-01 Eric Chapoulaud Analyte recognition for urinalysis diagnostic system
US20040161161A1 (en) * 2003-02-14 2004-08-19 Recht Joel M. Method and system for image segmentation

Also Published As

Publication number Publication date
WO2008034721A1 (en) 2008-03-27
GB2454857B (en) 2010-06-09
DE102006043684A1 (en) 2008-03-27
GB0905233D0 (en) 2009-05-13

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20150906