WO2008077680A1 - Method and device for the optical inspection of objects - Google Patents
Method and device for the optical inspection of objects (Procédé et dispositif de contrôle optique d'objets)
- Publication number
- WO2008077680A1 (application PCT/EP2007/062184)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- determined
- value
- image areas
- areas
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G06T7/68—Analysis of geometric attributes of symmetry
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
Definitions
- The invention relates to a method and a corresponding apparatus for the optical inspection of objects.
- Inspection of all manufactured articles by a person who, for example, looks at each article and visually checks it for defects is ruled out at high production speeds: on the one hand, a person would have only a fraction of a second to look at an article; on the other hand, the person would quickly tire of the monotonous activity.
- Machine-based systems, in contrast, can be adapted to high production and processing speeds using new technologies.
- For this purpose, image recognition and image processing methods are used.
- The article to be inspected is guided past a camera which produces an image of the article; this image is then processed further.
- With CCD cameras, digital images can be generated and transmitted in a simple manner to a data processing system.
- The data processing system checks the image, for example, for deviations from a reference image. If the determined deviations exceed a predefined limit value, the data processing system can trigger a predefined action, for example removing the article recognized as faulty from production.
- WO 01/09820 describes a method for digital image processing in which rectangular areas are identified within an image in which predefined operations are to be performed, for example resizing to a predefined size or a color or gamma correction. Then, for each identified area, it is checked whether predefined features exist in it, for example features that can be interpreted as the presence or absence of particular elements. In this test, the features determined for the image to be examined are compared with a reference in order to determine deviations or similarities.
- WO 2005/043141 A1 describes a system for testing material, for example nets, and in particular for optimizing the test method.
- The nets are passed by cameras, which generate images from both sides of the net; these images are processed in a data processing system.
- For the images, individual characteristic values are calculated, for example a gray value, which is compared with a previously determined reference gray value.
- The reference values are calculated as the average of a large number of individual values, each of which was determined on the basis of possibly faulty test pieces.
- A deviation of an individual value is interpreted as an error if a statistically determined limit value is exceeded.
- Other characteristic features can likewise be determined and tested.
- This method is optimized to reduce deviations erroneously recognized as errors: a person reviews the evaluated images and compares them with the detected errors. Images on which an error appears are then removed, and new reference values are determined from the remaining sequence of error-free images. Thresholds whose overshoot or undershoot is interpreted as an error can additionally be changed manually.
- A disadvantage of the methods described above is the associated computational effort, which, with limited computing resources, essentially determines the time required for examining an image and thus an article.
- The invention therefore provides a method and a corresponding device for the fast and reliable inspection of articles.
- The invention describes a method for checking an object for deviations from at least one reference object, with the following method steps:
- At least part of the reference image and a corresponding, equally sized part of the image are divided into several image areas; for each image area of the reference image at least one reference value is determined, and for each corresponding image area of the image a corresponding test value is determined.
- Figure 1 is a schematic plan view of a device for carrying out the method;
- Figure 2 is a schematic representation of an object to be tested;
- Figure 3 is a schematic representation of the object image in a first method step;
- Figure 4 is a schematic representation of a division of an image into first image areas;
- Figure 5 is a schematic representation of a division of an image into first and second image areas.
- FIG. 1 shows a schematic plan view of a device 110 for carrying out the method.
- The objects 120 to be tested are moved on a conveyor belt 130 past a first and a second camera 140, 141 in the direction of the arrow.
- The base body of the objects is aligned so that it is moved past the cameras in a defined orientation.
- The objects have an oval base body whose short axis 121 points towards the cameras.
- The orientation serves to align the object so that it can be ensured that a particular side of the object is viewed in the following procedure.
- The alignment can take place via any suitable feature, for example based on the outer shape or, for an object with a round base, via a notch in the base, as is known from bottles.
- The cameras each generate an image of a passing object in front of a defined background 150, 151, the first camera 140 generating an image from one side and the second camera 141 an image from the opposite side of the object.
- The generation of an image is triggered by means of a suitable triggering device 160, 161 when the object is moved past it.
- A suitable triggering device may, for example, be a light barrier or a mechanical switch.
- Flash devices 170, 171, preferably with LED lights, can be used.
- The cameras, for example CCD cameras, generate digital images, each composed of a finite set of picture elements.
- A picture element, a so-called pixel, can reproduce the imaged object in color or in black and white; in the case of black-and-white reproduction, the color values are rendered as grayscale values.
- The pixels are arranged in rows and columns, so that an image is a matrix of pixels.
- Cameras with different resolutions can be used: a higher resolution allows smaller details of the imaged object to be reproduced, while a lower resolution with fewer pixels restricts the reproduction of the object to coarser details.
- As cameras and data processing equipment become more powerful in the future, it will be possible to use cameras with even higher resolution.
- The method according to the invention is not limited with regard to resolution.
- The transmission of the image files is preferably initiated by the camera, so that an image file is transmitted to the connected data processing system immediately after it has been generated and can then be processed as quickly as possible.
- The cameras can be connected to the data processing system via a known interface, for example by means of a FireWire (IEEE 1394) interface.
- The image file can be transmitted in any format: to reduce the amount of data to be transmitted, in a compressed format such as "jpg", in an uncompressed format such as "bmp", or in a proprietary format, which may for example be a specific format of the camera manufacturer.
- Conversion into a format suitable for processing the data then takes place in the data processing system.
- The data processing system 180 may be a so-called personal computer (PC), in particular a laptop.
- The cameras as well as the device for ejecting an object identified as defective can be connected via the standard interfaces 190 normally present on the data processing system.
- FIG. 2 shows an image 200, shown schematically here, of an object 210 against a background 220, as it is transmitted from a camera to the data processing system.
- The dashed line 201 in the drawing represents the outer dimensions of the image.
- The image, shown here schematically as a line drawing, is actually transmitted to the data processing system as a color image, so that all colors are reproduced.
- The picture can also be a black-and-white picture, in which case the colors are rendered as grayscale.
- The background 220 can have at least one vertical contrast jump, so that differently colored objects stand out clearly against at least one of the contrasting regions and the outer contour can thus be determined.
- The background of the exemplary embodiment described here has a light area in its upper and lower regions and a dark area positioned between them.
- The object 210 is a bottle-shaped vessel, shown schematically here in frontal view.
- The object has a closure 230 at its upper end and a label 240 on its front side.
- The vessel and the label may be colored in the usual commercial manner; this is not shown in the drawing.
- FIG. 3 illustrates a first method step of processing the image 300 of a sample object.
- First, the central axis 310 of the object 320, or rather of the image of the object, is determined.
- For this purpose, the first vertical contrast jump is determined within each of the areas 330 and 331, proceeding from the outside inwards, i.e. from the image edge towards the image center.
- For this, a method known per se in image processing can be used which, for example, compares the color values of horizontally adjacent pixels.
- The areas 330, 331 are placed in such a way that they capture both a piece of the light background and a piece of the dark background, so that for a light-colored object the contrast jump with respect to the dark background, and for a dark-colored object the contrast jump with respect to the light background, can be determined.
- The determined contrast jumps are each interpreted as the outer edge of the object 320.
- The central axis 310 of the image of the object can now be determined by simple arithmetic: it is formed by those pixels which lie midway between the determined outer edges.
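The edge search and the midpoint computation can be sketched in a few lines. The sketch below assumes a grayscale image held as a NumPy array and a simple threshold on the difference of neighboring gray values along one row; the threshold value, the choice of rows and the handling of missing jumps are illustrative assumptions, not prescribed by the text.

```python
import numpy as np

def first_contrast_jump(row, threshold=30.0):
    """Index of the first contrast jump in a 1-D array of gray values,
    scanning from the start; None if no neighboring pair differs enough."""
    diffs = np.abs(np.diff(row.astype(float)))
    hits = np.nonzero(diffs > threshold)[0]
    return int(hits[0]) if hits.size else None

def central_axis(gray, row_index, threshold=30.0):
    """Estimate the vertical central axis of the object from one image row.

    The left edge is the first jump scanning left-to-right, the right edge
    the first jump scanning right-to-left; the axis lies midway between them."""
    row = gray[row_index]
    left = first_contrast_jump(row, threshold)
    right_from_end = first_contrast_jump(row[::-1], threshold)
    if left is None or right_from_end is None:
        return None
    right = row.shape[0] - 1 - right_from_end
    return (left + right) / 2.0
```

In practice, several rows inside the areas 330, 331 would be evaluated and the per-row estimates combined, for example by taking their median.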
- A grid 410 is subsequently placed over the image 400 of the object, the grid being larger than the object image and the grid shape being coarsely adapted to the shape of the object image.
- The grid can be aligned with the previously determined central axis, so that the imaginary grid is always positioned in the same way.
- A horizontal center axis can also be determined and used correspondingly for the vertical alignment of the imaginary grid.
- The fields of the grid are shaped so that they cover a surface without gaps and without overlapping. Accordingly, the grid fields have a triangular, hexagonal or, preferably, quadrangular shape. The grid is chosen so that the image of the object is completely covered by it.
- The grid fields thus define image areas 420, 430.
- For the pixels of each image area, statistical values are subsequently determined, which are stored as reference values for features of the image area.
- Those image areas - designated 420 by way of example - which at least partially depict the background are excluded from further processing. This rules out the possibility that parts of the background would be checked and evaluated as erroneous even though the object itself is error-free. Accordingly, in the subsequent processing steps only those image areas - designated by way of example by reference numeral 430 - which depict exclusively the object are processed and checked.
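A minimal sketch of this division, assuming square grid fields of a fixed pixel size whose columns are anchored to the previously determined central axis, and a boolean background mask derived from the defined backdrop; both the field size and the mask are illustrative assumptions.

```python
import numpy as np

def grid_areas(image_shape, field_size, axis_x):
    """Yield (row_slice, col_slice) pairs of square grid fields covering the image.

    The column offset is derived from the central axis so that the imaginary
    grid is always positioned the same way relative to the object."""
    h, w = image_shape[:2]
    x_offset = int(axis_x) % field_size   # anchor a grid line on the central axis
    for top in range(0, h, field_size):
        for left in range(x_offset, w, field_size):
            yield (slice(top, min(top + field_size, h)),
                   slice(left, min(left + field_size, w)))

def object_only_areas(areas, background_mask):
    """Keep only grid fields that depict exclusively the object.

    `background_mask` is a boolean array of the image size, True where a
    pixel belongs to the background; any field touching it is discarded."""
    return [(rs, cs) for rs, cs in areas if not background_mask[rs, cs].any()]
```

Only fields lying entirely on the object are kept, so partial fields at the image border and the narrow strip left of the offset are of no consequence for the subsequent steps.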
- The size of the image areas is usefully chosen taking various factors into account.
- The number of image areas must be small enough that their management in the data processing system does not take a disproportionate amount of time.
- At the same time, the image areas must not be too large, so that the exclusion of the image areas 420 does not leave too much of the object unconsidered.
- The number of pixels per image area is in turn to be chosen such that a statistical evaluation over the pixels of an image area is meaningful or significant.
- Finally, the resolution of the camera, i.e. the number N of pixels of the entire image, has to be considered. For the exemplary embodiment described here, a number of 5,000 to 8,000 image areas has proven successful with a camera resolution of 1392 × 1040 pixels and an object size of approximately 70 × 320 mm.
- Here N denotes the number of pixels of the total image, which corresponds to the pixel count of the camera.
- This value represents a guide variable that can be varied depending on the available computing power of the data processing system, the camera resolution and the desired processing speed.
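For orientation only, an arithmetic consequence of these example figures: 1392 × 1040 ≈ 1.45 million pixels divided among 5,000 to 8,000 image areas gives roughly 180 to 290 pixels per area, i.e. square fields of about 13 to 17 pixels per side.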
- When determining the reference values, it is optionally possible to increase the number of pixels that enter into the determination of a value for an image area, the number of image areas to be taken into consideration and their location remaining unchanged; in this case, in addition to the pixels of the image area itself, pixels of adjacent image areas are taken into account.
- For example, the adjoining two rows or columns of pixels of the neighboring image areas can additionally be used to determine the reference values for the image area 431, so that more pixels than the image area itself contains enter into the determination. In practice it has been found that this procedure increases the reliability of the determined reference values; the number of pixels lying outside an image area that are taken into account is adjustable.
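This can be realized, for example, by growing each field's slices by a configurable margin before the statistics are computed; the helper below is a sketch under that assumption, with the margin defaulting to the two rows or columns mentioned in the text.

```python
def expanded_area(row_slice, col_slice, image_shape, margin=2):
    """Grow an image area by `margin` pixels on every side for the statistics,
    clipped at the image border; the area itself (its location and the number
    of areas) stays unchanged."""
    h, w = image_shape[:2]
    return (slice(max(row_slice.start - margin, 0), min(row_slice.stop + margin, h)),
            slice(max(col_slice.start - margin, 0), min(col_slice.stop + margin, w)))
```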
- A plurality of reference values can now be determined and stored; for different objects, different reference values can first be determined and stored and then used in the actual test.
- A reference value may, for example, be based on the color values of the pixels in a color space.
- For this purpose, the mean value and the standard deviation of the red, green and blue values of the pixels are determined, so that for each color channel a value for the average and for the scatter around this average is obtained.
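For a single image area this amounts to a per-channel mean and standard deviation; the sketch assumes the area is an 8-bit RGB sub-image held as a NumPy array of shape (height, width, 3).

```python
import numpy as np

def rgb_statistics(area):
    """Per-channel mean and standard deviation for one image area.

    `area` is the (h, w, 3) RGB sub-image of a single grid field; the result
    is a pair of length-3 vectors (mean_rgb, std_rgb)."""
    pixels = area.reshape(-1, 3).astype(float)
    return pixels.mean(axis=0), pixels.std(axis=0)
```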
- Another reference value can be determined by means of the Sobel operator, which provides a measure of the roughness of an image area.
- The Sobel operator, which is known per se, determines differences in brightness between adjacent pixels, which can be interpreted as edges. For each pixel, a value is calculated by means of the Sobel operator, and from these values an average and the standard deviation are in turn calculated.
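A sketch of this roughness feature: the gradient magnitude of the standard Sobel kernels is computed on a grayscale version of the area and reduced to a mean and a standard deviation. The grayscale conversion and the use of SciPy's Sobel filter are implementation assumptions, not requirements of the text.

```python
import numpy as np
from scipy.ndimage import sobel

def sobel_roughness(area_rgb):
    """Mean and standard deviation of the Sobel gradient magnitude of an area."""
    gray = area_rgb.astype(float).mean(axis=2)   # simple grayscale conversion
    gx = sobel(gray, axis=1)                     # horizontal brightness differences
    gy = sobel(gray, axis=0)                     # vertical brightness differences
    magnitude = np.hypot(gx, gy)                 # per-pixel edge strength
    return magnitude.mean(), magnitude.std()
```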
- The red-green-blue (RGB) values supplied by the camera can be transformed into other color spaces, for example the HSV or HLS color space, in order to calculate color saturation, intensity and brightness and to determine the mean value and the associated standard deviation of these quantities.
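One possible conversion, computing only saturation and value (brightness) per pixel directly from RGB with the usual HSV formulas; hue is omitted for brevity, and nothing here is prescribed by the text beyond the use of such a color space.

```python
import numpy as np

def saturation_value_statistics(area_rgb):
    """Mean and standard deviation of HSV saturation and value for an 8-bit RGB area."""
    rgb = area_rgb.reshape(-1, 3).astype(float) / 255.0
    v = rgb.max(axis=1)                                    # value (brightness)
    c = v - rgb.min(axis=1)                                # chroma
    s = np.where(v > 0, c / np.where(v > 0, v, 1.0), 0.0)  # saturation, 0 for black pixels
    return (s.mean(), s.std()), (v.mean(), v.std())
```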
- The Sobel operator values as well as the color saturation and intensity values are determined.
- Various other features of a matrix of contiguous pixels, known per se, can also be used as features for the test.
- To generate the reference values, a certain number of sample objects that have been found to be good are now processed.
- The determined mean values and standard deviations are stored, so that for each image area and each feature a number of mean values and associated standard deviations corresponding to the number of sample objects is available.
- If k different features M are to be taken into account when checking an object and the corresponding reference values are determined on the basis of a number l of sample objects, then for each image area and each feature M the mean value and the associated standard deviation are determined, so that for each feature a series of l values is available:
- M1 = [M1,1, ..., M1,l]
- ...
- Mk = [Mk,1, ..., Mk,l]
- The values in each of the series M1, ..., Mk are sorted by size. From the values determined for each image area and each feature, the median, i.e. the middle value of the sorted series, is then determined as the reference value for the respective feature M. The selection of the median ensures that extreme values have no influence on the reference values.
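A sketch of this selection, assuming the per-sample values are collected in an array of shape (l, number_of_areas, k) and that the permissible interval is later derived from the scatter across the samples; the tolerance factor shown is a freely chosen parameter, since the text leaves the interval limits adjustable.

```python
import numpy as np

def select_references(sample_values, tolerance=3.0):
    """Derive reference values and permissible intervals from sample objects.

    `sample_values` has shape (l, n_areas, k): one value per sample object,
    image area and feature. The reference is the median across the samples,
    so extreme values have no influence on it; the interval half-width is
    `tolerance` times the standard deviation across the samples."""
    reference = np.median(sample_values, axis=0)   # (n_areas, k)
    spread = sample_values.std(axis=0)             # (n_areas, k)
    return reference, reference - tolerance * spread, reference + tolerance * spread
```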
- The selected reference values are then stored permanently, so that when the system or the data processing system is restarted they only have to be loaded in order to be able to check associated objects immediately. Reference values for various types of objects may also be stored, so that the device can be set up very quickly for testing corresponding objects by loading the appropriate reference values.
- The inspection of the objects can then begin.
- An image of the object to be tested is generated by the cameras and transferred to the data processing system.
- The same method steps are applied to the transferred images or image files as for determining the reference values, except that the values now determined for the image areas do not serve as reference values but as test values.
- First, the center axis of the respective imaged object is determined by the method described above.
- An imaginary grid is then aligned, which divides the object image into image areas.
- The values are then determined for the same features or, if configured accordingly in the software of the data processing system, only for some of the features; again, those image areas that depict at least part of the background are left out of consideration.
- For each feature value of an image area it is then checked whether it lies within a predetermined range of values.
- The limits of the range of values are initially given by the standard deviation of the reference value, but can be changed manually, so that larger or smaller deviations from the respective reference value can be set as tolerable. If the test shows that the value of the feature is within the permissible limits, this is rated as "good". Otherwise, if the feature value determined for an image area is outside the permissible limits, this is rated as "erroneous" and an error counter is incremented.
- This error counter is incremented by 1 whenever at least one feature value of an image area is outside the predetermined interval, but at most once per image area, so that in the case of multiple deviations within an image area the error counter is only incremented once.
- The error counter thus indicates the number of image areas in which at least one feature value is outside the permissible interval.
- Whether a tested object is to be assessed as defective and accordingly ejected is decided by means of the error counter. If an image has too many image areas with deviations, that is to say the error counter has a correspondingly high count, the object belonging to the image is removed from further production, or ejected, by means of a suitable device connected to the data processing system.
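The per-object decision can be sketched as follows; the interval arrays correspond to the reference step above, and the limit on the number of deviating image areas is a freely chosen parameter in this sketch.

```python
import numpy as np

def inspect_object(test_values, lower, upper, max_bad_areas):
    """Return True if the object passes, False if it should be ejected.

    `test_values`, `lower` and `upper` all have shape (n_areas, k). An image
    area counts as faulty at most once, even if several of its features deviate."""
    outside = (test_values < lower) | (test_values > upper)   # (n_areas, k)
    bad_areas = int(outside.any(axis=1).sum())                # error counter
    return bad_areas <= max_bad_areas
```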
- In addition, an image section of the object image can be divided into second image areas.
- The objects to be tested are, for example, provided with labels.
- The labels are produced in a separate production process and then glued on from a roll using a suitable machine.
- As a result, the position of the label on the object can vary in the horizontal and/or vertical direction and thus differ from its position on the reference objects. If, for example, a label has a color contrast to the surface beneath it and the position of the label deviates, for example in the horizontal direction, such that the vertical edge of the label falls into a different column of first image areas, those image areas would erroneously be rated as defective.
- Therefore, a second grid 510 is placed over the label 540 so that the second image areas defined by the second grid 510 completely cover the label.
- The second image areas may have the same size as the first image areas, or they may be larger or smaller.
- The size of the second image areas can also be chosen such that the image section, whose geometric dimensions in the image are known in many cases, is divided exactly into second image areas, so that the edge of the second grid lies on the edge of the image section or just within it.
- To position the second grid, a center line of the image section can be determined and used, or one or two reference points or reference marks 550, which are printed on each label at the same location, can be used.
- In the example shown, this is the silhouette of the head 550.
- The exact position of the silhouette is now determined, using a conventional image processing algorithm, within an area in which the silhouette is usually located. Since the position of the reference mark relative to the label is known, the second (imaginary) grid can then be placed exactly.
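One conventional algorithm for locating such a reference mark is normalized cross-correlation template matching; the use of OpenCV's matchTemplate and the idea of restricting the search to a window in which the mark is usually located are illustrative assumptions, since the text does not name a specific algorithm.

```python
import cv2
import numpy as np

def locate_reference_mark(image_gray, template_gray, search_window):
    """Find the top-left corner of the reference mark inside a search window.

    `image_gray` and `template_gray` are uint8 grayscale arrays; `search_window`
    is a (row_slice, col_slice) pair in full-image pixels. The returned
    coordinates are also in full-image pixels, so the second grid can be
    placed at a known, fixed offset from this point."""
    region = image_gray[search_window]
    scores = cv2.matchTemplate(region, template_gray, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(scores)
    x, y = max_loc
    return x + search_window[1].start, y + search_window[0].start
```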
- Second image areas can overlap at least partially with first image areas. These second image areas are not taken into account in the subsequent processing but are removed from the processing and thus from the test. This prevents parts of the bottle body from being checked in the case of an inaccurately placed label: since these second image areas each depict changing sections of the bottle body due to the inaccurate placement, examining them would erroneously lead to a high number of errors.
- Conversely, first image areas which completely or at least partially overlap with second image areas are not taken into consideration during the processing of the first image areas. This correspondingly prevents a situation in which, owing to the inaccurate placement of the label, these first image areas depict very different sections of the label from object to object and would falsely lead to a high number of errors.
- For the second image areas, a group of image areas that are excluded from the examination can also be defined manually by means of the software program running on the data processing system.
- For example, the group of those second image areas which at least partially depict the lettering 560 can be excluded from the test.
- The examination of the second image areas takes place analogously to the examination of the first image areas. Accordingly, for the pixels of each image area, a reference value is initially determined for at least one feature to be checked, on the basis of sample objects and the method described above. Subsequently, for each of the image areas to be processed, the value for each feature to be tested is determined and compared with the corresponding reference value. If the comparison between a value determined for an image area and the corresponding reference value shows a deviation beyond the permissible range, a corresponding counter is incremented.
- The counter may be the same one that was incremented during the examination of the first image areas, or it may be another, separate counter.
- Corresponding limit values for the maximum number of errors allowed per object and/or per area can be defined and evaluated independently of one another before the object belonging to the image being tested is rejected as defective and removed from further production.
- Likewise, a plurality of superimposed or juxtaposed grids can be used to define corresponding further image areas, which are tested in the manner described above.
- The method can also be designed such that individual image areas from the first or further image areas are excluded from the test. This can be done, for example, by displaying an image of a sample object with its associated division into image areas on a monitor of the data processing system and manually marking a group of image areas or individual image areas so that no check is performed for them. In this way, for example, those areas of an image in which variable lettering, such as a date stamp, is applied can be taken out of the test. Such variable lettering would in many cases, for example when using the Sobel filter, always lead to different values that would erroneously be considered errors.
- The described method thus enables fast, reliable testing of objects for visible deviations from the sample objects with which the reference values for the respective features were previously determined.
- The automated generation of reference values based on sample objects allows flexible adaptation to different objects, so that the method and the corresponding device can be adapted quickly and without great effort to the examination of various objects.
- The storage of reference values once obtained and proven in practice allows consistent product quality to be maintained, while adjusting the sensitivity of the method by manually changing the interval limits for a reference value allows fast and flexible adaptation to a desired product quality.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Geometry (AREA)
- Quality & Reliability (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Abstract
The invention relates to a method and a device for inspecting an object for deviations from at least one reference object. Digital images of the object and of the reference object are divided into a plurality of image areas. For each image area, at least one reference value or test value is determined according to a test criterion, and these values are compared with one another in order to determine the deviations.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102006060741A DE102006060741A1 (de) | 2006-12-21 | 2006-12-21 | Verfahren und Vorrichtung zur optischen Prüfung von Objekten |
| DE102006060741.4 | 2006-12-21 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2008077680A1 (fr) | 2008-07-03 |
Family
ID=38922686
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2007/062184 Ceased WO2008077680A1 (fr) | 2006-12-21 | 2007-11-12 | Procédé et dispositif de contrôle optique d'objets |
Country Status (2)
| Country | Link |
|---|---|
| DE (1) | DE102006060741A1 (fr) |
| WO (1) | WO2008077680A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2010037492A1 (fr) | 2008-10-01 | 2010-04-08 | Panasonic Electric Works Europe Ag | Procédé et système de contrôle pour réaliser un contrôle optique d'un objet |
| WO2010037493A1 (fr) * | 2008-10-01 | 2010-04-08 | Panasonic Electric Works Europe Ag | Procédé et système de contrôle pour réaliser un contrôle optique du contour d'un objet |
| DE102018110062A1 (de) * | 2018-04-26 | 2019-10-31 | IMAGO Technologies GmbH | Verfahren zur Fehlererkennung automatisierter Prozesse |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102007041987A1 (de) * | 2007-09-05 | 2009-03-12 | Henkel Ag & Co. Kgaa | Verfahren und Vorrichtung zur Bilderzeugung für die Produktionsüberwachung |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP0222079A2 (fr) * | 1985-11-12 | 1987-05-20 | MANIA Elektronik Automatisation Entwicklung und Gerätebau GmbH | Méthode pour tester optiquement des cartes à circuits imprimés |
| WO2001009820A2 (fr) * | 1999-07-28 | 2001-02-08 | Intelligent Reasoning Systems, Inc. | Systeme et procede de reconnaissance d'image dynamique |
| DE102004039937A1 (de) * | 2004-08-18 | 2006-02-23 | Hoffmann, André | Verfahren und System zur Identifikation, Verifikation, Erkennung und Wiedererkennung |
-
2006
- 2006-12-21 DE DE102006060741A patent/DE102006060741A1/de not_active Withdrawn
-
2007
- 2007-11-12 WO PCT/EP2007/062184 patent/WO2008077680A1/fr not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| DE102006060741A1 (de) | 2008-06-26 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 07822472 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 07822472 Country of ref document: EP Kind code of ref document: A1 |