WO2024232680A1 - Inspection system and inspection method for inspecting low density foreign substance - Google Patents
- Publication number
- WO2024232680A1 (PCT/KR2024/006249)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- product
- image
- foreign substance
- plc
- inspected
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/48—Thermography; Techniques using wholly visual means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/94—Investigating contamination, e.g. dust
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N25/00—Investigating or analyzing materials by the use of thermal means
- G01N25/72—Investigating presence of flaws
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Definitions
- the present disclosure relates to an inspection system and an inspection method for inspecting low-density foreign substances, and more particularly, to an inspection system and an inspection method capable of distinguishing between normal products and foreign substances with high accuracy by combining thermal imaging and machine vision.
- a product can be inspected for foreign matter using an X-ray image.
- the product to be inspected and the foreign matter are displayed differently depending on the difference in density.
- low-density foreign substances include vinyl, plastic, etc.
- the present disclosure aims to provide a foreign substance inspection system and inspection method capable of identifying low-density foreign substances and photographing them through packaging materials.
- the present disclosure aims to provide a foreign substance inspection system and inspection method capable of preventing the problem of defective products being mixed among normal products.
- a foreign matter inspection system including: a conveyor belt configured to transport a product to be inspected; a product detection sensor configured to detect the introduction of the product to be inspected onto the conveyor belt; a PLC configured to generate a camera device trigger signal and transmit the signal to the camera device when the introduction of the product to be inspected is detected by the product detection sensor; the camera device configured to photograph the product to be inspected and process the photographed image; and a PC configured to receive a deep learning model learned by a server and input the image processed by the camera device into the deep learning model, thereby outputting a predicted presence or absence of a foreign matter; wherein the camera device includes an MWIR camera that photographs the product to be inspected using a wavelength of 3 μm to 5 μm; and an image processing unit configured to process a raw image photographed by the MWIR camera.
- the server of the foreign matter inspection system includes: a thermal image input module configured to receive a raw image captured by a MWIR camera; a foreign matter inclusion input module configured to receive information on whether a product to be inspected contains a foreign matter; an image processing unit configured to process the input raw image to generate a processed image; a learning information storage unit configured to generate and store a learning information dataset using the processed image and the foreign matter inclusion; and a learning unit configured to learn a deep learning model using the processed image as input data and the foreign matter inclusion as output data.
- the image processing unit of the foreign matter inspection system binarizes the raw image and generates a processed image by converting the binarized image into a preset size, and when the PC receives the image processed by the image processing unit, it outputs a predicted foreign matter inclusion of 0 or 1.
- the PC displays the raw image or the processed image through the output unit, and stores the raw image or the processed image of the product to be inspected, metadata of the raw image, and a value of 0 as a data set in a database.
- the PC transmits 1 to the PLC
- the PLC waits for a predetermined time, generates a lighting signal and transmits the lighting signal to a warning light
- the PLC sets a counter 1 in a memory, generates a rejector operation signal and transmits the rejector operation signal to a rejector, and when the rejector receives the rejector operation signal, moves the inspection target product corresponding to the predicted foreign matter inclusion to a defective product loading space.
- the defective product loading space of the foreign substance inspection system is provided with a defective product sensor configured to detect the introduction of the moved inspection target product, and when the PLC receives a detection signal of the defective product sensor within a preset time from the time when counter 1 is set in the memory, the PLC clears the memory, and when the PLC does not receive a detection signal of the defective product sensor within a preset time from the time when the memory counter 1 is set, the PLC generates a conveyor belt stop signal and transmits the conveyor belt stop signal to the conveyor belt.
- a method for inspecting foreign substances including: (a1) a step in which a deep learning model is learned by a server, which uses a processed image of a product to be inspected as input data and whether or not a foreign substance is included as output data; (a2) a step in which the deep learning model is transmitted to a PC; (b1) a step in which the introduction of a product to be inspected is detected by a product detection sensor, and the product detection sensor generates a product introduction signal and transmits the signal to a PLC; (b2) a step in which the PLC receives the product introduction signal, waits for a predetermined time, and generates a trigger signal and transmits the signal to a camera device; (b3) a step in which the camera device captures a raw image of the product to be inspected using a MWIR camera, and the camera device processes the image using an image processing unit; (b4) a step in which the processed image is transmitted to the PC; and (c) a step in which the PC inputs the processed image into the deep learning model and outputs a predicted presence or absence of a foreign substance.
- the step (a1) of the inspection method includes: (a10) a step in which the thermal image input module of the server receives a raw image captured using a MWIR camera; (a11) a step in which the foreign matter inclusion input module of the server receives an input of whether a foreign matter is included corresponding to the raw image captured in the step (a10); (a12) a step in which the image processing unit of the server processes the raw image input in the step (a10) to generate a processed image; and (a13) a step in which the learning unit of the server trains a deep learning model that uses the processed image generated in the step (a12) as input data and the foreign matter inclusion input in the step (a11) as output data.
- the step (c) of the inspection method includes: (c10) a step of outputting the predicted presence or absence of foreign matter as 0; (c11) a step of the PC displaying the raw image or the processed image through an output unit; and (c12) a step of the PC transmitting and storing the raw image or the processed image of the product to be inspected, metadata of the raw image, and a value of 0 as a dataset in a database.
- the step (c) of the inspection method further includes: (c20) a step in which the predicted foreign substance inclusion status is output as 1; (c21) a step in which the PC transmits 1 to the PLC; (c22) a step in which the PLC waits for a predetermined time and then generates a lighting signal and transmits the lighting signal to a warning light; (c23) a step in which the PLC sets a counter 1 in a memory and generates a rejector operation signal to generate the rejector operation signal in the rejector; and (c24) a step in which, when the rejector receives the rejector operation signal, the rejector moves the inspection target product corresponding to the predicted foreign substance inclusion status to a reject loading space.
- the defective product loading space of the inspection method is provided with a defective product sensor configured to detect the introduction of the moved inspection target product
- the step (c) further includes: (c25) a step of the PLC clearing the memory if the PLC receives a detection signal of the defective product sensor within a preset time from a point at which counter 1 is set in the memory; and (c26) a step of the PLC generating a conveyor belt stop signal and transmitting the conveyor belt stop signal to the conveyor belt if the PLC does not receive a detection signal of the defective product sensor within a preset time from a point at which the memory counter 1 is set.
- the foreign substance inspection system and inspection method photograph a product to be inspected by irradiating a wavelength in an energy band that can both penetrate packaging materials and reveal low-density foreign substances, enabling detection of foreign substances that entered the product after packaging was completed and thereby increasing the probability of detecting them.
- the foreign substance inspection system has the effect of properly notifying the user of the need to remove defective products by stopping all operations if defective products are not properly removed.
- FIG. 1 is a block diagram of a foreign substance inspection system according to one embodiment of the present disclosure.
- FIG. 2 is a block diagram of a PC according to one embodiment of the present disclosure.
- FIG. 3 is a flowchart of a foreign substance inspection method according to one embodiment of the present disclosure.
- FIG. 4 is a flowchart of a foreign substance inspection method according to one embodiment of the present disclosure.
- FIG. 5 is a block diagram of a server according to one embodiment of the present disclosure.
- FIG. 6 is a flowchart of a method for learning a deep learning model according to one embodiment of the present disclosure.
- FIG. 7 compares a foreign substance inspection system according to one embodiment of the present disclosure with a conventional inspection system.
- symbols such as first, second, i), ii), a), b), etc. may be used. These symbols are only for distinguishing the components from other components, and the nature or order or sequence of the components is not limited by the symbols.
- when a part in the specification is said to "include" or "provide" a component, this does not mean that other components are excluded, but rather that other components can be further included, unless explicitly stated otherwise.
- the “product to be inspected” is a product inspected by a foreign matter inspection system, and may preferably refer to packaging materials and food within the packaging materials.
- the product to be inspected is described as packaging materials and food within the packaging materials, but is not necessarily limited thereto.
- FIG. 1 is a block diagram of a foreign substance inspection system according to one embodiment of the present disclosure.
- FIG. 2 is a block diagram of a PC according to one embodiment of the present disclosure.
- the foreign matter inspection system (1) is configured to detect foreign matters introduced into a product to be inspected and to distinguish between normal products and defective products.
- the foreign matter inspection system (1) includes all or part of a PLC (10), a defective product sensor (11), a rejector (12), a conveyor belt (13), a warning light (14), a camera device (15), a product detection sensor (16), a PC (17), a server (18), and a database (19).
- the PLC (programmable logic controller, 10) is configured to generate control signals and transmit them to the camera device (15), the warning light (14), the rejector (12), and the conveyor belt (13).
- the defective product sensor (11) is configured to sense at least a portion of the inspection target product placed on the conveyor belt (13), specifically, the inspection target product determined to be a defective product.
- the conveyor belt (13) is formed to transport the product to be inspected.
- the product to be inspected is fed into one end of the conveyor belt (13), and the product to be inspected is discharged from the other end.
- a driving unit is provided on the conveyor belt (13), so that the conveyor belt (13) can transport the product to be inspected in one direction.
- the inspection target product transported on the conveyor belt (13) is determined as a normal product or a defective product.
- the inspection target product determined as a defective product can be taken out of the conveyor belt (13) by the rejector (12).
- the rejector (12) may be, for example, a take out robot (not shown), but is not necessarily limited thereto.
- the warning light (14) lights up for a preset period of time when a defective product extracted by the rejector (12) is sensed by the defective product sensor (11).
- the camera device (15) is configured to photograph a product to be inspected being transported by a conveyor belt (13) and to process the photographed image.
- the camera device (15) includes a MWIR camera (150) and an image processing unit (152).
- the MWIR camera (150) is configured to photograph the product to be inspected.
- the MWIR camera (150) photographs the product to be inspected using mid-wave infrared (MWIR).
- MWIR refers to wavelengths of 3 to 5 μm.
- a conventional SWIR-band camera generally cannot penetrate packaging materials formed of materials including nylon (NY), polyethylene (PE), and low-density polyethylene (LDPE).
- the MWIR camera (150) using the wavelength of the MWIR band can penetrate packaging materials formed of materials including NY, PE, and LDPE, thereby enabling detection of foreign substances introduced into the packaging material after packaging is completed.
- foreign substances introduced during packaging can be detected, thereby increasing the probability of detecting foreign substances.
- the MWIR camera (150) can detect radiant heat of the product to be inspected.
- the image processing unit (152) is configured to process a raw image captured by the MWIR camera (150).
- the image processing unit (152) generates a black and white image by binarizing a raw image and converting it into gray scale. Thereafter, the image processing unit (152) generates a processed image by converting the generated black and white image into an image of a preset size.
- the preset size may be, for example, 223 pixels.
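The binarize-then-resize step above can be illustrated with a minimal NumPy sketch; the threshold value, the nearest-neighbor resampling, and the function name are assumptions for illustration, not the patent's actual implementation:

```python
import numpy as np

def preprocess(raw: np.ndarray, threshold: float = 128.0, size: int = 223) -> np.ndarray:
    """Binarize a raw thermal image, then resize it to size x size
    using nearest-neighbor sampling (a stand-in for the resizing step)."""
    binary = (raw >= threshold).astype(np.uint8) * 255   # black-and-white image
    rows = np.arange(size) * raw.shape[0] // size        # source row per output row
    cols = np.arange(size) * raw.shape[1] // size        # source column per output column
    return binary[rows][:, cols]
```

The result is a 223 × 223 two-level image ready to be sent to the PC.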
- the image processed by the image processing unit (152) is transmitted to the PC (17).
- the product detection sensor (16) is arranged at one end of the conveyor belt (13), i.e., at the location where the product to be inspected is introduced, and is configured to detect the introduction of the product.
- when the product detection sensor (16) detects a product, it can generate a detection signal and transmit it to the PLC (10).
- the PC (17) receives a deep learning model learned by the server (18) and is configured to input a processed image transmitted by the camera device (15) into the deep learning model.
- the PC (17) is configured to output whether or not a foreign substance is included corresponding to the input processed image.
- the PC (17) includes all or part of an input unit (170), an output unit (171), a control unit (172), a processing unit (173), and a communication unit (174) (see FIG. 2).
- the components of the PC (17) described in the present disclosure only describe the configurations necessary for inspection according to the present disclosure, and descriptions of additional components typically included in the PC (17) are omitted.
- the input unit (170) is configured to receive commands or data from the outside.
- a command or data input through the input unit (170) can be transmitted to another component of the PC (17).
- an image processed by the image processing unit (152) can be input through the input unit (170).
- the output unit (171) is configured to output a command or data processed or generated by the PC (17).
- the output unit (171) may include a display and a speaker.
- the control unit (172) is configured to control one or more other components of the PC (17).
- the one or more other components are concepts including both hardware and software. That is, the control unit (172) can control the input unit (170), the output unit (171), the processing unit (173), and the communication unit (174) of the PC (17).
- the processing unit (173) can perform a data processing or calculation function, and as part of the data processing or calculation function, is configured to store commands or data received from other components in volatile memory, process the commands or data stored in the volatile memory, and store the results in non-volatile memory.
- the processing unit (173) can input the processed image input through the input unit (170) into a deep learning model downloaded in advance to calculate whether or not a foreign substance is included. Meanwhile, a detailed description of the deep learning model is described in FIGS. 5 and 6.
- the communication unit (174) is configured for communication with external devices, and through the communication unit (174), the PC (17) can send and receive commands and data with the PLC (10), camera device (15), server (18), and database (19).
- the PC (17) classifies the result value as "1" when the product to be inspected contains foreign substances, i.e., when it is judged to be a defective product, and classifies the result value as "0" when the product does not contain foreign substances, i.e., when it is a normal product.
- the PC (17) transmits a predetermined signal to the PLC (10).
- the PC (17) and the PLC (10) can send and receive signals using the SSR or Ethernet/IP Interface.
- the PC (17) can provide the user with an image captured by the camera device (15) or a processed image and an OK display through the output unit (171).
- the PC (17) can store the image of the inspection target product judged as 0, the metadata of the image, and the 0 value in the database (19).
- the server (18) is configured to process, store, and transmit/receive data.
- the server (18) is configured to train a deep learning model. In this regard, it is described in detail in FIGS. 5 and 6.
- the server (18) may be a separately provided device such as an IDC server, but is not necessarily limited thereto, and may be a cloud service.
- the database (19) is configured to temporarily or permanently store data received from the PC (17).
- the database (19) may be a separate storage device provided outside the PC (17), but is not necessarily limited thereto, and may be a storage device provided in the PC (17) or PLC (10).
- FIG. 3 is a flowchart of a foreign substance inspection method according to one embodiment of the present disclosure.
- the introduction of the product to be inspected is detected by the product detection sensor (16).
- the product detection sensor (16) generates a product introduction signal and transmits it to the PLC (10).
- when the PLC (10) receives a product introduction signal, it waits for a first delay time, then generates a trigger signal for shooting and transmits it to the camera device (15).
- the first delay time is preferably the same time as the time it takes for the introduced inspection target product to be transported to the location where the camera device (15) is placed.
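Since the first delay time equals the transport time from the sensor to the camera, it follows directly from the belt geometry; a trivial sketch (the function name and example numbers are illustrative only):

```python
def transport_delay_s(distance_m: float, belt_speed_m_per_s: float) -> float:
    """Time for the product to travel from the detection sensor to the camera."""
    return distance_m / belt_speed_m_per_s

# e.g. a camera 0.6 m downstream on a belt moving 0.3 m/s needs a 2.0 s delay
```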
- when the camera device (15) receives a trigger signal, it can detect the radiant heat of the product to be inspected by photographing the product using the MWIR camera (150).
- the raw image photographed by the MWIR camera (150) is processed by the image processing unit (152) of the camera device (15).
- the image processing unit (152) binarizes the raw image and converts it to gray scale to generate a black and white image. Thereafter, the image processing unit (152) generates a processed image by converting the generated black and white image into an image of a preset size.
- the preset size may be, for example, 223 pixels.
- the image processed through the above process is transmitted to the PC (17).
- the PC (17) can download a deep learning model from the server (18) in advance and store it in internal memory (not shown) before receiving the processed image.
- the PC (17) can input the processed image into the aforementioned deep learning model and output whether or not a foreign substance is included. At this time, the PC (17) outputs the result value 1 when a foreign substance is included and 0 when it is not.
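The 0/1 prediction step might look like the following sketch; the model is stubbed as any callable returning a probability, and the scaling and 0.5 threshold are assumptions, not values stated in the disclosure:

```python
import numpy as np

def predict_inclusion(model, processed: np.ndarray, threshold: float = 0.5) -> int:
    """Run the downloaded model on one processed image and return
    1 (foreign substance included) or 0 (normal product)."""
    batch = processed.astype('float32')[None, ..., None] / 255.0  # batch and channel axes
    prob = float(model(batch))
    return 1 if prob >= threshold else 0
```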
- FIG. 4 is a flowchart of a foreign substance inspection method according to one embodiment of the present disclosure.
- if the PC (17) determines that there is no foreign substance, i.e., if the result value is 0, the PC (17) outputs a thermal image together with an OK indication on the output unit (171).
- the thermal image may be an image captured by the camera device (15) or a processed image.
- the PC (17) can store the image of the inspection target product judged as 0, the metadata of the image, and the 0 value in the database (19). At this time, the data stored in the database (19) can be used for learning or updating the deep learning model in the future.
- the PC (17) transmits the result value to the PLC (10).
- the PC (17) and the PLC (10) can transmit and receive signals or data using the SSR or Ethernet/IP Interface.
- when the PLC (10) receives the result value 1 from the PC (17), it waits for a second delay time.
- the second delay time may be the same as the time taken for the inspection target product determined to be defective to be transported from the location where the camera device (15) is located to the location where the rejector (12) is located.
- the PLC (10) generates a lighting signal and transmits it to the warning light (14).
- when the warning light (14) receives the lighting signal, it lights up for a preset period of time, letting the user know that there is a defective product among the products to be inspected.
- the PLC (10) sets counter 1 in the memory (not shown), generates a rejector operation signal, and transmits it to the rejector (12).
- the rejector (12) operates when it receives a rejector operation signal.
- the rejector (12) can move a product to be inspected that has been determined to be defective from the conveyor belt (13) to a space provided separately.
- the rejector (12) may be a take-out robot, but is not necessarily limited thereto.
- a defective product sensor (11) is provided in a separately prepared space. At this time, when a product to be inspected is detected by the defective product sensor (11), the defective product sensor (11) generates a product detection signal and transmits it to the PLC (10).
- the PLC (10) checks whether a product detection signal is received within a preset time from the time counter 1 is set in the memory.
- if the product detection signal is received in time, the PLC (10) clears the memory. That is, when the rejected product is detected by the defective product sensor (11) within the preset time, the PLC (10) determines that the defective product has been removed from the conveyor belt (13) and can continue performing the existing work.
- if the product detection signal is not received in time, the PLC (10) generates a conveyor belt stop signal and transmits it to the conveyor belt (13).
- when the conveyor belt (13) receives a conveyor belt stop signal, it stops operating. That is, when a defective product is detected but is not sensed by the defective product sensor (11) within the preset time, the PLC (10) determines that the defective product has not been properly removed from the conveyor belt (13) and stops all operations so that the user is notified of the need to remove it.
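The confirm-or-stop logic described above (clear the memory if the defective-product sensor fires in time, otherwise stop the belt) can be sketched as follows; the names and polling scheme are illustrative and do not represent the PLC's actual ladder logic:

```python
import time

def confirm_reject(sensor_triggered, timeout_s: float = 2.0, poll_s: float = 0.01) -> str:
    """After counter 1 is set and the rejector fires, poll the defective-product
    sensor until the preset time elapses."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if sensor_triggered():
            return 'clear_memory'    # removal of the defective product confirmed
        time.sleep(poll_s)
    return 'stop_conveyor'           # removal not confirmed within the preset time
```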
- FIG. 5 is a block diagram of a server according to one embodiment of the present disclosure.
- the server (18) includes all or part of an input unit (180), an image processing unit (183), a learning information storage unit (184), and a learning unit (185).
- the input unit (180) is configured to receive data for learning.
- the input unit (180) includes a thermal image input module (181) and a foreign substance inclusion input module (182).
- the thermal image input module (181) is configured to receive a raw image as a thermal image captured by the MWIR camera (150).
- the foreign substance inclusion input module (182) is configured to receive input on whether a foreign substance is contained inside the product to be inspected that was photographed by the MWIR camera (150).
- the image processing unit (183) is configured to preprocess the image input by the thermal image input module (181). Specifically, the image processing unit (183) binarizes a raw image and converts it to gray scale to generate a black and white image.
- the image processing unit (183) generates a processed image by converting the generated black and white image into an image of a preset size.
- the preset size may be, for example, 223 pixels.
- information used for learning can be transformed into a shape and size appropriate for learning.
- the learning information storage unit (184) stores the processed image and whether or not it contains foreign substances as a dataset. When converting the processed image into a dataset, the learning information storage unit (184) can use the Ray or NumPy library to build the dataset by mapping the processed image and the foreign substance inclusion status to X and Y suitable for TensorFlow Keras training.
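Assembling the X/Y arrays described above might look like this NumPy sketch; the [0, 1] scaling, channel axis, and dtype choices are assumptions rather than details given in the disclosure:

```python
import numpy as np

def build_dataset(images, labels):
    """Stack processed images into X (scaled to [0, 1], with a channel axis)
    and the foreign-substance flags into Y."""
    X = np.stack(images).astype('float32')[..., None] / 255.0
    Y = np.asarray(labels, dtype='int64')
    return X, Y
```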
- the learning unit (185) trains a deep learning model that uses the processed image as input data and the presence or absence of foreign substances as output data.
- the learning unit (185) classifies the learning information data set into training, validation, and test data sets.
- the learning information dataset is randomly split into training, validation, and test sets at a ratio of 0.65 : 0.2 : 0.15.
- RandomSplitter can be used for the random split, train_test_split can be used to adjust the training and test datasets, and validation_pct can be used to adjust the validation dataset.
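The 0.65 : 0.2 : 0.15 split can be produced with the utilities named above; a dependency-free sketch of the same idea (the function name and fixed seed are illustrative assumptions):

```python
import random

def split_dataset(items, ratios=(0.65, 0.20, 0.15), seed=42):
    """Randomly split items into training/validation/test at the given ratios."""
    items = list(items)
    random.Random(seed).shuffle(items)
    n_train = int(len(items) * ratios[0])
    n_val = int(len(items) * ratios[1])
    return items[:n_train], items[n_train:n_train + n_val], items[n_train + n_val:]
```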
- the learning unit (185) can learn the deep learning model by compiling layers of the deep learning model after augmenting the training data set and the validation data set.
- the learning unit (185) can expand the diversity of the learning data and prevent overfitting by using a data augmentation technique when augmenting the data.
- Data augmentation can include image enlargement, cropping, rotation, random flip, random resizing, and brightness adjustment.
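Two of the listed augmentations (random flip, brightness adjustment) in a minimal NumPy sketch; the ±0.1 brightness range is an assumed hyperparameter, and enlargement, cropping, rotation, and resizing are omitted for brevity:

```python
import numpy as np

def augment(img: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Apply random horizontal/vertical flips and a brightness shift
    to an image with values in [0, 1]."""
    if rng.random() < 0.5:
        img = np.fliplr(img)
    if rng.random() < 0.5:
        img = np.flipud(img)
    return np.clip(img + rng.uniform(-0.1, 0.1), 0.0, 1.0)
```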
- the learning unit (185) can transfer-learn a deep learning model with a ResNet or MobileNet network structure using the PyTorch-based fastai library. As a result, the deep learning model can be configured and compiled with the appropriate final layers.
- the learning unit (185) can evaluate the model using a test data set, and save the deep learning model in pkl format and the evaluation values (accuracy, precision, recall, f-score, support) in csv format.
- the deep learning model learned by the learning unit (185) can be transmitted to the PC (17) when requested by the PC (17).
- FIG. 6 is a flowchart of a method for learning a deep learning model according to one embodiment of the present disclosure.
- the processed image and the corresponding presence or absence of foreign substances are created and stored as a learning information dataset (S610).
- the learning unit (185) trains a deep learning model (S620).
- the deep learning model can use the processed image as input data and the presence or absence of foreign substances as output data.
- the deep learning model is trained on the server (18) and downloaded to the PC (17).
- the PC (17) can provide the user with the predicted presence of foreign substances through the output unit (171), and the foreign substance inspection system (1) can control all or part of the components of the foreign substance inspection system (1) using the predicted presence of foreign substances.
- FIG. 7 compares a foreign substance inspection system according to one embodiment of the present disclosure with a conventional inspection system. The inspection target product in FIG. 7 is packaged in a packaging material formed of a material including NY, PE, and LDPE.
- FIG. 7(a) shows a comparison of the results of photographing the same inspection target product using a camera device (15) according to one embodiment of the present disclosure, that is, a MWIR camera (150), and the results of photographing the same product using a conventional camera device, specifically, a SWIR and X-ray camera.
- the MWIR camera (150) clearly distinguishes between food and low-density foreign substances in the product to be inspected.
- the image taken with the SWIR camera barely penetrates the packaging material, and no region that could be identified as a low-density foreign substance is distinguishable.
- the image taken with the X-ray camera shows no part that could be judged to be a low-density foreign substance.
- the foreign substance inspection system (1) has a superior ability to detect low-density foreign substances compared to conventional methods.
- FIG. 7(b) shows a comparison of the results of photographing the same inspection target product using a camera device (15) according to one embodiment of the present disclosure, that is, a MWIR camera (150), and the results of photographing the same product using a conventional camera device, specifically, a LWIR and X-ray camera.
- the MWIR camera (150) can capture detailed and clear images of the internal contents through the packaging material.
- the images taken with the LWIR camera barely penetrate the packaging material.
- the images taken with the X-ray camera show excellent penetration of the packaging material, but low-density foreign substances are difficult to distinguish from the food.
- accordingly, the foreign substance inspection system (1) according to one embodiment of the present disclosure offers excellent packaging material penetration and, at the same time, excellent low-density foreign substance detection.
Description
The present disclosure relates to an inspection system and an inspection method for inspecting low-density foreign substances, and more particularly, to an inspection system and an inspection method capable of distinguishing normal products from foreign substances with high accuracy by combining thermal imaging and machine vision.
The material described in this section merely provides background information for the present disclosure and does not constitute prior art.
Recently, as the number of single-person households grows and preferences for convenience foods rise, the production of easy-to-cook foods is increasing.
Meanwhile, despite thorough hygiene management, various foreign substances can be introduced during the manufacturing stage of easy-to-cook foods. Since consumers are more averse to foreign substances in food than in other product groups, thorough foreign substance inspection is required for food products before shipment. Various technologies have been proposed to inspect for the introduction of foreign substances.
For example, according to document KR 10-2507591 B1, a product can be inspected for foreign matter using an X-ray image. In an X-ray image, the product to be inspected and the foreign matter appear differently depending on their difference in density.
However, X-ray images have the problem that low-density foreign substances that have entered the product being inspected are difficult to identify. Here, low-density foreign substances include plastic film (vinyl), plastic, and the like.
Meanwhile, cameras that use the SWIR (Short Wavelength InfraRed) or LWIR (Long Wavelength InfraRed) wavelength bands are also used to detect low-density foreign substances, but waves in these bands cannot penetrate the packaging material.
In addition, conventional foreign substance inspection systems have the problem that, even if they detect a product containing a foreign substance, i.e., a defective product, it is difficult to notice when that defective product is mixed in with normal products.
Accordingly, the present disclosure aims to provide a foreign substance inspection system and inspection method capable of identifying low-density foreign substances while photographing through packaging materials.
The present disclosure also aims to provide a foreign substance inspection system and inspection method capable of preventing defective products from being mixed among normal products.
The problems to be solved by the present invention are not limited to those mentioned above, and other problems not mentioned will be clearly understood by those skilled in the art from the description below.
According to one embodiment of the present disclosure, a foreign matter inspection system is provided, including: a conveyor belt configured to transport a product to be inspected; a product detection sensor configured to detect the introduction of the product to be inspected onto the conveyor belt; a PLC configured to generate a camera device trigger signal and transmit the signal to the camera device when the introduction of the product to be inspected is detected by the product detection sensor; the camera device, configured to photograph the product to be inspected and process the photographed image; and a PC configured to receive a deep learning model trained by a server and to input the image processed by the camera device into the deep learning model, thereby outputting a predicted presence or absence of a foreign matter; wherein the camera device includes an MWIR camera that photographs the product to be inspected using a wavelength of 3 ㎛ to 5 ㎛, and an image processing unit configured to process a raw image photographed by the MWIR camera.
In addition, preferably, the server of the foreign matter inspection system according to one embodiment of the present disclosure includes: a thermal image input module configured to receive a raw image captured by an MWIR camera; a foreign matter inclusion input module configured to receive information on whether the product to be inspected contains a foreign matter; an image processing unit configured to process the input raw image to generate a processed image; a learning information storage unit configured to generate and store a learning information dataset using the processed image and the foreign matter inclusion; and a learning unit configured to train a deep learning model using the processed image as input data and the foreign matter inclusion as output data.
In addition, preferably, the image processing unit of the foreign matter inspection system according to one embodiment of the present disclosure binarizes the raw image and generates a processed image by converting the binarized image to a preset size, and when the PC receives the image processed by the image processing unit, it outputs a predicted foreign matter inclusion of 0 or 1.
In addition, preferably, in the foreign matter inspection system according to one embodiment of the present disclosure, if the predicted foreign matter inclusion is 0, the PC displays the raw image or the processed image through the output unit, and stores the raw image or the processed image of the product to be inspected, metadata of the raw image, and a value of 0 as a dataset in a database.
In addition, preferably, in the foreign matter inspection system according to one embodiment of the present disclosure, if the predicted foreign matter inclusion is 1, the PC transmits 1 to the PLC; the PLC waits for a predetermined time, generates a lighting signal, and transmits the lighting signal to a warning light; the PLC sets counter 1 in its memory, generates a rejector operation signal, and transmits the rejector operation signal to a rejector; and when the rejector receives the rejector operation signal, it moves the inspection target product corresponding to the predicted foreign matter inclusion to a defective product loading space.
In addition, preferably, the defective product loading space of the foreign substance inspection system according to one embodiment of the present disclosure is provided with a defective product sensor configured to detect the introduction of the moved inspection target product; if the PLC receives a detection signal from the defective product sensor within a preset time from the time counter 1 is set in the memory, the PLC clears the memory, and if the PLC does not receive a detection signal from the defective product sensor within the preset time from the time memory counter 1 is set, the PLC generates a conveyor belt stop signal and transmits the conveyor belt stop signal to the conveyor belt.
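The counter-and-timeout behavior described above amounts to a watchdog: once counter 1 is set, the PLC either sees the defective product sensor fire within a preset window, or it stops the belt. A minimal sketch, assuming a polling loop; `sensor_fired` and the return strings are hypothetical names, not part of the disclosure:

```python
import time

def await_rejected_product(sensor_fired, timeout_s: float, poll_s: float = 0.01) -> str:
    """Watchdog sketch: after counter 1 is set, wait up to timeout_s for the
    defective product sensor; clear the memory on success, otherwise stop
    the conveyor belt. `sensor_fired` is a callable returning True once the
    sensor detects the rejected product."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if sensor_fired():
            return "memory_cleared"    # sensor seen in time: clear counter 1
        time.sleep(poll_s)
    return "conveyor_stop"             # timeout: generate the belt stop signal
```

This captures the safety intent of the claim: a defective product that never reaches the loading space halts the line instead of silently rejoining the normal products.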
In addition, according to one embodiment of the present disclosure, a method for inspecting foreign substances is provided, including: (a1) a step in which a deep learning model, which uses a processed image of a product to be inspected as input data and whether or not a foreign substance is included as output data, is trained by a server; (a2) a step in which the deep learning model is transmitted to a PC; (b1) a step in which the introduction of a product to be inspected is detected by a product detection sensor, and the product detection sensor generates a product introduction signal and transmits the signal to a PLC; (b2) a step in which the PLC receives the product introduction signal, waits for a predetermined time, generates a trigger signal, and transmits the signal to a camera device; (b3) a step in which the camera device captures a raw image of the product to be inspected using a MWIR camera and processes the image using an image processing unit; (b4) a step in which the processed image is transmitted to the PC; and (c) a step in which, when the processed image is input to the deep learning model, a predicted foreign substance inclusion or not is output; wherein the MWIR camera photographs the product to be inspected using a wavelength of 3 ㎛ to 5 ㎛.
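Steps (b1) through (c) above form an event-driven pipeline. As a rough illustration only (every name below is hypothetical and not part of the disclosure), one inspection cycle can be sketched as:

```python
import time

def inspect_one_product(sensor, camera, model, trigger_delay_s: float) -> int:
    """One inspection cycle; returns the predicted foreign substance flag (0 or 1)."""
    sensor.wait_for_product()         # (b1) product detection sensor fires
    time.sleep(trigger_delay_s)       # (b2) PLC waits a predetermined time, then triggers
    raw = camera.capture()            # (b3) MWIR camera takes a raw image
    processed = camera.process(raw)   # (b3) image processing unit processes it
    return model.predict(processed)   # (b4)+(c) PC feeds the deep learning model
```

The delay in (b2) exists because the product keeps moving after the entry sensor fires; the trigger must arrive only once the product sits under the camera.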
In addition, preferably, the step (a1) of the inspection method according to one embodiment of the present disclosure includes: (a10) a step in which the thermal image input module of the server receives a raw image captured using a MWIR camera; (a11) a step in which the foreign matter inclusion input module of the server receives an input of whether a foreign matter is included corresponding to the raw image captured in the step (a10); (a12) a step in which the image processing unit of the server processes the raw image input in the step (a10) to generate a processed image; and (a13) a step in which the learning unit of the server trains a deep learning model that uses the processed image generated in the step (a12) as input data and the foreign matter inclusion input in the step (a11) as output data.
In addition, preferably, the step (c) of the inspection method according to one embodiment of the present disclosure includes: (c10) a step of outputting the predicted presence or absence of foreign matter as 0; (c11) a step of the PC displaying the raw image or the processed image through an output unit; and (c12) a step of the PC transmitting and storing the raw image or the processed image of the product to be inspected, metadata of the raw image, and a value of 0 as a dataset in a database.
In addition, preferably, the step (c) of the inspection method according to one embodiment of the present disclosure further includes: (c20) a step in which the predicted foreign substance inclusion is output as 1; (c21) a step in which the PC transmits 1 to the PLC; (c22) a step in which the PLC waits for a predetermined time, then generates a lighting signal and transmits the lighting signal to a warning light; (c23) a step in which the PLC sets counter 1 in its memory, generates a rejector operation signal, and transmits the rejector operation signal to the rejector; and (c24) a step in which, when the rejector receives the rejector operation signal, the rejector moves the inspection target product corresponding to the predicted foreign substance inclusion to a defective product loading space.
In addition, preferably, in the inspection method according to one embodiment of the present disclosure, the defective product loading space is provided with a defective product sensor configured to detect the introduction of the moved inspection target product, and the step (c) further includes: (c25) a step in which, if the PLC receives a detection signal from the defective product sensor within a preset time from the time counter 1 is set in the memory, the PLC clears the memory; and (c26) a step in which, if the PLC does not receive a detection signal from the defective product sensor within the preset time from the time memory counter 1 is set, the PLC generates a conveyor belt stop signal and transmits the conveyor belt stop signal to the conveyor belt.
As described above, the foreign substance inspection system and inspection method according to one embodiment of the present disclosure photograph the product to be inspected using a wavelength of an energy band that can penetrate packaging materials while still detecting low-density foreign substances; this makes it possible to detect foreign substances that have entered a product after packaging is completed, thereby increasing the probability of detecting foreign substances.
In addition, the foreign substance inspection system according to one embodiment of the present disclosure has the effect of properly notifying the user of the need to remove defective products by stopping all operations when a defective product is not properly removed.
FIG. 1 is a block diagram of a foreign substance inspection system according to one embodiment of the present disclosure.
FIG. 2 is a block diagram of a PC according to one embodiment of the present disclosure.
FIG. 3 is a flowchart of a foreign substance inspection method according to one embodiment of the present disclosure.
FIG. 4 is a flowchart of a foreign substance inspection method according to one embodiment of the present disclosure.
FIG. 5 is a block diagram of a server according to one embodiment of the present disclosure.
FIG. 6 is a flowchart of a method for training a deep learning model according to one embodiment of the present disclosure.
FIG. 7 compares a foreign substance inspection system according to one embodiment of the present disclosure with a conventional inspection system.
Hereinafter, some embodiments of the present disclosure will be described in detail with reference to exemplary drawings. When adding reference numerals to components of each drawing, it should be noted that the same components are given the same numerals as much as possible even if they are shown in different drawings. In addition, when describing the present disclosure, if it is determined that a specific description of a related known configuration or function may obscure the gist of the present disclosure, the detailed description thereof will be omitted.
In describing components of embodiments according to the present disclosure, symbols such as first, second, i), ii), a), b), etc. may be used. These symbols are only for distinguishing the components from other components, and the nature, order, or sequence of the components is not limited by the symbols. When a part in the specification is said to "include" or "provide" a component, this does not mean that other components are excluded, but rather that other components can be further included, unless explicitly stated otherwise.
In the present disclosure, the "product to be inspected" is a product inspected by a foreign matter inspection system, and may preferably refer to a packaging material and the food within the packaging material. In the present disclosure, the product to be inspected is described as a packaging material and the food within it, but is not necessarily limited thereto.
1. Description of the foreign substance inspection system
FIG. 1 is a block diagram of a foreign substance inspection system according to one embodiment of the present disclosure. FIG. 2 is a block diagram of a PC according to one embodiment of the present disclosure.
Referring to FIGS. 1 and 2, a foreign matter inspection system (1) according to one embodiment of the present disclosure is described. The foreign matter inspection system (1) is configured to detect foreign matter introduced into a product to be inspected and to distinguish normal products from defective products. To this end, the foreign matter inspection system (1) includes all or part of a PLC (10), a defective product sensor (11), a rejector (12), a conveyor belt (13), a warning light (14), a camera device (15), a product detection sensor (16), a PC (17), a server (18), and a database (19).
The PLC (programmable logic controller, 10) is configured to communicate with the other components included in the foreign substance inspection system (1) by wire or wirelessly, and to control the system (1) overall.
The defective product sensor (11) is configured to sense at least a portion of the inspection target products placed on the conveyor belt (13), specifically, an inspection target product determined to be defective.
The conveyor belt (13) is formed to transport the product to be inspected. The product to be inspected is fed in at one end of the conveyor belt (13) and discharged at the other end. A driving unit is provided on the conveyor belt (13), so that the conveyor belt (13) can transport the product to be inspected in one direction.
An inspection target product transported on the conveyor belt (13) is determined to be either a normal product or a defective product. A product determined to be defective can be taken off the conveyor belt (13) by the rejector (12). For this purpose, the rejector (12) may be, for example, a take-out robot (not shown), but is not necessarily limited thereto.
The warning light (14) lights up for a preset period of time when a defective product extracted by the rejector (12) is sensed by the defective product sensor (11).
The camera device (15) is configured to photograph a product to be inspected being transported by the conveyor belt (13) and to process the photographed image. To this end, the camera device (15) includes a MWIR camera (150) and an image processing unit (152).
The MWIR camera (150) is configured to photograph the product to be inspected using mid-wave infrared (MWIR), meaning a wavelength of 3 to 5 ㎛. While a conventional SWIR band camera generally cannot penetrate packaging materials formed of materials including NY (nylon), PE (polyethylene), and LDPE (low-density polyethylene), the MWIR camera (150), using the MWIR wavelength band, can penetrate such packaging materials, enabling detection of foreign substances introduced into a package after packaging is completed. As a result, foreign substances introduced during packaging can be detected, which increases the probability of detecting foreign substances. In other words, the MWIR camera (150) can detect the radiant heat of the product to be inspected.
The image processing unit (152) is configured to process a raw image captured by the MWIR camera (150).
Specifically, the image processing unit (152) generates a black-and-white image by binarizing the raw image and converting it to gray scale. The image processing unit (152) then generates a processed image by converting the black-and-white image into an image of a preset size. Here, the preset size may be, for example, 223 pixels.
The image processed by the image processing unit (152) is transmitted to the PC (17).
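The binarize-then-resize processing above can be sketched as follows. The threshold value and the pure-NumPy nearest-neighbor resize are illustrative assumptions; the disclosure only specifies binarization and conversion to a preset size (e.g., 223 pixels):

```python
import numpy as np

def preprocess(raw: np.ndarray, size: int = 223, threshold: float = 127.0) -> np.ndarray:
    """Binarize a raw thermal image, then resize it to size x size pixels
    by nearest-neighbor sampling (a minimal stand-in for a real resize)."""
    gray = raw.mean(axis=2) if raw.ndim == 3 else raw             # collapse to gray scale
    binary = np.where(gray > threshold, 255, 0).astype(np.uint8)  # binarization
    rows = np.arange(size) * binary.shape[0] // size              # source row per output row
    cols = np.arange(size) * binary.shape[1] // size              # source col per output col
    return binary[rows][:, cols]                                  # nearest-neighbor resize
```

In production such a step would more likely use an image library's threshold and resize routines; the point here is only the order of operations fed to the deep learning model.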
The product detection sensor (16) is arranged at one end of the conveyor belt (13), i.e., at the location where the product to be inspected is introduced, and is configured to detect the introduction of the product. When the product detection sensor (16) detects a product, it can generate a detection signal and transmit it to the PLC (10).
The PC (17) receives a deep learning model trained by the server (18) and is configured to input the processed image transmitted by the camera device (15) into that deep learning model. The PC (17) is configured to output whether or not a foreign substance is included corresponding to the input processed image.
To this end, the PC (17) includes all or part of an input unit (170), an output unit (171), a control unit (172), a processing unit (173), and a communication unit (174) (see FIG. 2). The components of the PC (17) described in the present disclosure are only those necessary for the inspection according to the present disclosure; descriptions of additional components typically included in a PC (17) are omitted.
The input unit (170) is configured to receive commands or data from the outside.
A command or data input through the input unit (170) can be transmitted to other components of the PC (17). In the present disclosure, an image processed by the image processing unit (152) can be input through the input unit (170).
The output unit (171) is configured to output commands or data processed or generated by the PC (17). In the present disclosure, the output unit (171) may include a display and a speaker.
The control unit (172) is configured to control one or more other components of the PC (17). Here, the one or more other components include both hardware and software. That is, the control unit (172) can control the input unit (170), the output unit (171), the processing unit (173), and the communication unit (174) of the PC (17).
The processing unit (173) can perform data processing or computation functions and, as part of those functions, is configured to store commands or data received from other components in volatile memory, process the commands or data stored in the volatile memory, and store the results in non-volatile memory.
The processing unit (173) can input the processed image received through the input unit (170) into a previously downloaded deep learning model to compute whether or not a foreign substance is included. A detailed description of the deep learning model is given with reference to FIGS. 5 and 6.
The communication unit (174) is configured for communication with external devices; through the communication unit (174), the PC (17) can send and receive commands and data with the PLC (10), the camera device (15), the server (18), and the database (19).
The PC (17) classifies the result value as "1" when the product to be inspected contains a foreign substance, i.e., when it is judged to be a defective product. The PC (17) classifies the result value as "0" when the product to be inspected does not contain a foreign substance, i.e., when it is a normal product.
If the result is 1, the PC (17) transmits a predetermined signal to the PLC (10). Here, the PC (17) and the PLC (10) can exchange signals using an SSR or an Ethernet/IP interface.
If the result is 0, the PC (17) can provide the user, through the output unit (171), with the image captured by the camera device (15) or the processed image together with an OK indication. The PC (17) can also store the image of the product judged to be 0, the image's metadata, and the value 0 in the database (19).
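The 0/1 classification and the branching it drives can be sketched as below; `notify_plc`, `show_ok`, and `save_record` are hypothetical stand-ins for the SSR/Ethernet-IP link to the PLC (10), the output unit (171), and the database (19):

```python
def classify_and_route(processed_image, model, notify_plc, show_ok, save_record) -> int:
    """Return the model's 0/1 prediction and route the result as the PC (17) does."""
    result = model(processed_image)   # deep learning model outputs 0 or 1
    if result == 1:                   # defective: signal the PLC to react
        notify_plc(1)
    else:                             # normal: display OK and archive the record
        show_ok(processed_image)
        save_record(processed_image, metadata=None, value=0)
    return result
```

Keeping the routing outside the model makes either side replaceable: a different model, or a different downstream controller, without touching the other.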
The server (18) is configured to process, store, and transmit/receive data, and to train the deep learning model. This is described in detail with reference to FIGS. 5 and 6. In the present disclosure, the server (18) may be a separately provided device such as an IDC, but is not necessarily limited thereto and may be a cloud.
The database (19) is configured to temporarily or permanently store data received from the PC (17). The database (19) may be a separate storage device provided outside the PC (17), but is not necessarily limited thereto, and may be a storage device provided in the PC (17) or the PLC (10).
2. Description of the foreign substance inspection method
2.1. Method up to the determination of whether a foreign substance is included
FIG. 3 is a flowchart of a foreign substance inspection method according to one embodiment of the present disclosure.
Referring to FIG. 3, the process from the introduction of the product to be inspected into the foreign substance inspection system (1) up to the determination of whether it contains a foreign substance is described.
When a product to be inspected enters the foreign substance inspection system (1), its entry is detected by the product detection sensor (16), which generates a product entry signal and transmits it to the PLC (10).
Upon receiving the product entry signal, the PLC (10) waits for a first delay time and then generates a trigger signal for image capture and transmits it to the camera device (15). The first delay time is preferably equal to the time it takes for the introduced product to be transported to the position where the camera device (15) is placed.
Upon receiving the trigger signal, the camera device (15) photographs the product using the MWIR camera (150), thereby detecting the product's radiant heat. The raw image captured by the MWIR camera (150) is processed by the image processing unit (152) of the camera device (15).
The image processing unit (152) binarizes the raw image and converts it to gray scale to generate a black-and-white image, then converts that image to a preset size to generate the processed image. The preset size may be, for example, 223 pixels.
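The preprocessing steps above (gray-scale conversion, binarization, resizing to the preset size) can be sketched as follows. This is a minimal NumPy sketch under stated assumptions, not the patent's implementation: the min-max normalization, the fixed threshold of 128, and nearest-neighbor resizing are all assumptions not specified in the text.

```python
import numpy as np

def preprocess(raw: np.ndarray, size: int = 223, thresh: int = 128) -> np.ndarray:
    """Sketch of the described preprocessing: normalize the raw radiometric
    frame to 8-bit gray scale, binarize it, and resize to size x size."""
    # Min-max normalize to [0, 255] gray scale (assumed normalization scheme)
    lo, hi = raw.min(), raw.max()
    gray = ((raw - lo) / (hi - lo + 1e-9) * 255).astype(np.uint8)
    # Binarize against a fixed threshold (Otsu or adaptive thresholds could
    # equally be used; the patent does not specify)
    binary = np.where(gray >= thresh, 255, 0).astype(np.uint8)
    # Nearest-neighbor resize to the preset size (e.g. 223)
    ys = (np.arange(size) * raw.shape[0] // size).clip(0, raw.shape[0] - 1)
    xs = (np.arange(size) * raw.shape[1] // size).clip(0, raw.shape[1] - 1)
    return binary[np.ix_(ys, xs)]
```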
The processed image is then transmitted to the PC (17).
Before receiving the processed image, the PC (17) may download the deep learning model from the server (18) in advance and store it in internal memory (not shown).
The PC (17) inputs the processed image into the deep learning model and outputs whether a foreign substance is included, outputting a result value of 1 when a foreign substance is included and 0 when it is not.
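The inference step can be sketched as below. The callable `model` and the 0.5 decision threshold are assumptions for illustration; the patent only specifies that the model maps a processed image to the binary result value.

```python
def classify(model, processed_image) -> int:
    """Feed the processed image to the downloaded model and map the
    prediction to 1 (foreign substance present) or 0 (normal product).
    `model` is assumed to be any callable returning a defect probability."""
    prob = model(processed_image)
    return 1 if prob >= 0.5 else 0
```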
2.2. Method after the determination of whether a foreign substance is included
FIG. 4 is a flowchart of a foreign substance inspection method according to one embodiment of the present disclosure.
Referring to FIG. 4, the steps after the foreign substance inspection system (1) has determined whether a foreign substance is included are described.
If the PC (17) determines that no foreign substance is included, i.e., the result value is 0, the PC (17) outputs a thermal image together with an OK indication on the output unit (171). The thermal image may be the image captured by the camera device (15) or the processed image.
The PC (17) may also store the image of the product judged as 0, the image's metadata, and the 0 value in the database (19); this stored data may later be used to train or update the deep learning model.
If the PC (17) determines that a foreign substance is included, i.e., the result value is 1, the PC (17) transmits the result value to the PLC (10). The PC (17) and the PLC (10) may exchange signals or data over an SSR or an Ethernet/IP interface.
Upon receiving the result value 1 from the PC (17), the PLC (10) waits for a second delay time. The second delay time may be equal to the time it takes for the product judged defective to be transported from the position of the camera device (15) to the position of the rejector (12).
The PLC (10) then generates a lighting signal and transmits it to the warning light (14). Upon receiving the lighting signal, the warning light (14) lights up for a preset period of time, letting the user know that a defective product is among the products being inspected.
Simultaneously with, or following, the generation of the lighting signal, the PLC (10) sets a counter to 1 in its memory (not shown), generates a rejector operation signal, and transmits it to the rejector (12).
Upon receiving the rejector operation signal, the rejector (12) operates, moving the product judged defective from the conveyor belt (13) to a separately provided space. For this purpose, the rejector (12) may be, but is not limited to, a take-out robot.
A defective product sensor (11) is provided in the separate space. When the defective product sensor (11) detects the product, it generates a product detection signal and transmits it to the PLC (10).
The PLC (10) checks whether the product detection signal is received within a preset time from the moment the counter was set to 1 in memory.
If the product detection signal is received within the preset time, the PLC (10) clears the memory. That is, once the rejected product has been detected by the defective product sensor (11) in time, the PLC (10) judges that it has been removed from the conveyor belt (13) and continues normal operation.
If the product detection signal is not received within the preset time, the PLC (10) generates a conveyor belt stop signal and transmits it to the conveyor belt (13).
Upon receiving the stop signal, the conveyor belt (13) stops. That is, if the defective product is not detected by the defective product sensor (11) within the preset time, the PLC (10) judges that the defective product was not properly removed from the conveyor belt (13) and stops all operations, so that the user is notified of the need to remove it.
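The reject-and-verify flow above (light, reject, then confirm removal within a timeout or stop the belt) can be sketched as a small control loop. The callback names, polling interval, and the 2-second default timeout are assumptions for illustration; a real PLC would implement this in ladder logic or a function block, not Python.

```python
import time

def handle_defect(sensor_fired, reject, stop_belt, light, timeout_s: float = 2.0) -> bool:
    """Hedged sketch of the PLC flow after a result value of 1.
    Returns True if the defective product was confirmed removed in time,
    False if the conveyor belt had to be stopped."""
    light()                          # turn on the warning light
    reject()                         # actuate the rejector (set counter to 1)
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if sensor_fired():           # defective-product sensor detected the item
            return True              # clear memory and continue normal operation
        time.sleep(0.01)
    stop_belt()                      # not detected in time: stop all operations
    return False
```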
3. Description of the training method of the deep learning model
FIG. 5 is a block diagram of a server according to one embodiment of the present disclosure.
Referring to FIG. 5, the server (18) includes all or some of an input unit (180), an image processing unit (183), a training information storage unit (184), and a training unit (185).
The input unit (180) is configured to receive data for training and includes a thermal image input module (181) and a foreign substance inclusion input module (182).
The thermal image input module (181) is configured to receive a raw thermal image captured by the MWIR camera (150).
The foreign substance inclusion input module (182) is configured to receive, for each product photographed by the MWIR camera (150), whether a foreign substance is included inside it.
The image processing unit (183) is configured to preprocess the image received by the thermal image input module (181). Specifically, it binarizes the raw image and converts it to gray scale to generate a black-and-white image.
It then converts the black-and-white image to a preset size, e.g., 223 pixels, to generate the processed image.
Through this image processing, the information used for training is transformed into a shape and size appropriate for training.
The training information storage unit (184) stores each processed image paired with its foreign-substance label as a dataset. When building the dataset, the Ray or NumPy library can be used to associate the processed images and labels as the X and Y inputs appropriate for TensorFlow Keras training.
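The dataset-building step can be sketched with NumPy as below. The array layout (channel-last, float32 in [0, 1]) and the function name are assumptions; the patent only states that images and labels are associated as X and Y for TensorFlow Keras training.

```python
import numpy as np

def build_dataset(images, labels):
    """Stack processed 223x223 gray images into X and the binary
    foreign-substance labels into y (0 = normal, 1 = defective)."""
    X = np.stack([img.astype(np.float32) / 255.0 for img in images])  # (N, 223, 223)
    X = X[..., np.newaxis]                                            # channel axis for Keras
    y = np.asarray(labels, dtype=np.int64)
    assert len(X) == len(y), "each image needs exactly one label"
    return X, y
```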
The training unit (185) trains a deep learning model that takes the processed image as input and the presence or absence of a foreign substance as output.
The training unit (185) splits the training information dataset into training, validation, and test datasets. Preferably, the data is randomly split at a train : validation : test ratio of 0.65 : 0.2 : 0.15; RandomSplitter may be used for the random split.
train_test_split may be used to adjust the training and test datasets, and validation_pct may be used to adjust the validation dataset.
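The 0.65 : 0.2 : 0.15 random split can be sketched with a plain NumPy shuffle standing in for RandomSplitter / train_test_split (the seeded generator and index-based interface are assumptions for illustration):

```python
import numpy as np

def random_split(n: int, ratios=(0.65, 0.20, 0.15), seed: int = 0):
    """Randomly partition n sample indices into train/validation/test
    index arrays at the given ratios."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n)
    n_train = int(n * ratios[0])
    n_val = int(n * ratios[1])
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]
```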
The training unit (185) augments the training and validation datasets, then compiles the layers of the deep learning model and trains it. Preferably, data augmentation techniques are used to increase the diversity of the training data and prevent overfitting; augmentation may include image enlargement, cropping, rotation, random flips, random resizing, and brightness adjustment.
The training unit (185) can use the PyTorch-based fastai library to apply transfer learning to a deep learning model built on the ResNet or MobileNet network architecture, allowing the model to be configured with refined layers and compiled.
The training unit (185) evaluates the model on the test dataset, and can save the deep learning model in pkl format and the evaluation metrics (accuracy, precision, recall, f-score, support) in csv format.
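The evaluation-and-export step can be sketched as below: the binary metrics are computed from the test-set predictions and serialized in the CSV layout the text names. The function name and exact CSV layout are assumptions; only the metric names and the csv format come from the source.

```python
import csv
import io

def metrics_csv(y_true, y_pred) -> str:
    """Compute accuracy/precision/recall/F1/support for the binary
    defect label (1 = defective) and return them as CSV text."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    acc = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["accuracy", "precision", "recall", "f-score", "support"])
    writer.writerow([acc, prec, rec, f1, len(y_true)])
    return buf.getvalue()
```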
The deep learning model trained by the training unit (185) can be transmitted to the PC (17) upon request from the PC (17).
FIG. 6 is a flowchart of a training method for a deep learning model according to one embodiment of the present disclosure.
Referring to FIG. 6, when an image captured by the MWIR camera (150) is received by the input unit (180), it is processed by the image processing unit (183) provided in the server (18) (S600).
The processed image and its corresponding foreign-substance label are generated and stored as a training information dataset (S610).
Using the stored training information dataset, the training unit (185) trains the deep learning model (S620). The model takes the processed image as input and the presence or absence of a foreign substance as output; it is trained on the server (18) and downloaded to the PC (17).
When an image captured and processed by the camera device (15) is transmitted to the PC (17), the PC (17) inputs the received processed image into the stored deep learning model and outputs the predicted presence or absence of a foreign substance (S630).
The PC (17) can present the predicted result to the user through the output unit (171), and the foreign substance inspection system (1) can use the prediction to control all or some of its components.
FIG. 7 compares a foreign substance inspection system according to one embodiment of the present disclosure with conventional inspection systems. In FIG. 7, the photographed products are packaged in a material including NY, PE, and LDPE.
FIG. 7(a) compares images of the same product captured with the camera device (15) according to one embodiment of the present disclosure, i.e., the MWIR camera (150), and with conventional camera devices, specifically SWIR and X-ray cameras.
Referring to FIG. 7(a), the MWIR camera (150) clearly distinguishes the food inside the product from low-density foreign substances. By contrast, the SWIR camera barely penetrates the packaging material, and even in the partially penetrated regions no feature identifiable as a low-density foreign substance can be distinguished. The X-ray image shows no region at all that could be judged a low-density foreign substance.
These results show that the foreign substance inspection system (1) according to one embodiment of the present disclosure detects low-density foreign substances better than the conventional methods.
FIG. 7(b) compares images of the same product captured with the MWIR camera (150) and with conventional camera devices, specifically LWIR and X-ray cameras.
Referring to FIG. 7(b), the MWIR camera (150) penetrates the packaging material and captures the internal contents in clear detail. By contrast, the LWIR camera barely penetrates the packaging material, and while the X-ray camera penetrates the packaging well, it has difficulty distinguishing low-density foreign substances from the food.
These results show that the foreign substance inspection system (1) according to one embodiment of the present disclosure has excellent packaging penetration and, at the same time, excellent low-density foreign substance detection.
The foregoing description merely illustrates the technical idea of the present embodiments, and those of ordinary skill in the art to which the embodiments belong may make various modifications and variations without departing from their essential characteristics. Accordingly, the present embodiments are intended to explain, not to limit, the technical idea, and the scope of that idea is not limited by these embodiments. The scope of protection should be construed according to the claims below, and all technical ideas within a scope equivalent thereto should be construed as falling within the scope of rights of the present embodiments.
[Explanation of reference numerals]
1: Foreign substance inspection system
10: PLC
11: Defective product sensor
12: Rejector
13: Conveyor belt
14: Warning light
15: Camera device
16: Product detection sensor
17: PC
18: Server
19: Database
Claims (11)
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR20230060012 | 2023-05-09 | ||
| KR10-2023-0060012 | 2023-05-09 | ||
| KR10-2023-0154493 | 2023-11-09 | ||
| KR1020230154493A KR20240162974A (en) | 2023-05-09 | 2023-11-09 | Inspection System and Inspection Method for Inspecting Low Density Pollutant |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024232680A1 true WO2024232680A1 (en) | 2024-11-14 |
Family
ID=93430662
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2024/006249 Pending WO2024232680A1 (en) | 2023-05-09 | 2024-05-09 | Inspection system and inspection method for inspecting low density foreign substance |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2024232680A1 (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR200160426Y1 (en) * | 1997-06-26 | 1999-11-01 | 김용성 | A device for the fault detection and supplement of chopsticks in the automatic packaging machine |
| KR102045079B1 (en) * | 2019-07-23 | 2019-11-14 | 주식회사 모든 | Inspection apparatus using terahertz wave |
| JP2020509364A (en) * | 2017-02-20 | 2020-03-26 | ヨラン イメージング リミテッドYoran Imaging Ltd. | Method and system for determining package integrity |
| KR20220164124A (en) * | 2021-06-03 | 2022-12-13 | 신신화학공업 주식회사 | System for inspecting product defects by type based on a deep learning model |
| JP2023051527A (en) * | 2021-09-30 | 2023-04-11 | 株式会社リコー | Inspection device and inspection method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24803734; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |