KR102840845B1 - Non-contact fingerprint authentication method capable of distinguishing counterfeit fingerprints using AI - Google Patents
Non-contact fingerprint authentication method capable of distinguishing counterfeit fingerprints using AI
Info
- Publication number
- KR102840845B1 KR1020220102361A KR20220102361A
- Authority
- KR
- South Korea
- Prior art keywords
- fingerprint
- fake
- value
- neural network
- fingerprints
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/1382—Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/50—Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/13—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/13—Sensors therefor
- G06V40/1312—Sensors therefor direct reading, e.g. contactless acquisition
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/1365—Matching; Classification
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/0202—Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
- H04M1/026—Details of the structure or mounting of specific components
- H04M1/0264—Details of the structure or mounting of specific components for a camera module assembly
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Evolutionary Computation (AREA)
- Software Systems (AREA)
- Computing Systems (AREA)
- Artificial Intelligence (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Data Mining & Analysis (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Mathematical Physics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Biophysics (AREA)
- Molecular Biology (AREA)
- Computational Linguistics (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Databases & Information Systems (AREA)
- Signal Processing (AREA)
- Collating Specific Patterns (AREA)
Abstract
The present invention relates to a non-contact fingerprint authentication method with an AI-based fake-fingerprint detection function. More specifically, when a fingerprint is recognized contactlessly using a camera-equipped mobile device, the method detects the use of fake fingerprints, such as fingerprints printed on paper or replayed on a screen, and trains an AI on the resulting detection data so that fake fingerprints can be identified quickly and accurately.
According to a preferred embodiment of the present invention, fake fingerprints can be detected even in a non-contact recognition method that relies on camera-captured fingerprint images; fakes such as covertly photographed fingerprints of another person can be identified from the noise pattern of the captured image; and storing and analyzing fake-fingerprint data with AI makes detection faster and more accurate.
Description
The present invention relates to a non-contact fingerprint authentication method with an AI-based fake-fingerprint detection function. More specifically, when a fingerprint is recognized contactlessly using a camera-equipped mobile device, the method detects the use of fake fingerprints, such as fingerprints printed on paper or replayed on a screen, and trains an AI on the resulting detection data so that fake fingerprints can be identified quickly and accurately.
Biometrics refers to authentication technology that identifies a user from human physical or behavioral characteristics; the Korean biometrics forum defines it as authentication based on the observation of behavioral and biological (anatomical and physiological) traits.
Methods that recognize physical characteristics include fingerprint recognition, iris scanning, retina scanning, hand geometry, and facial recognition; methods that recognize behavioral characteristics include voice recognition, signature scanning, and gait recognition.
The most widely used and widespread of these is fingerprint recognition. Every human fingerprint is different, not only between individuals but also in that it remains unchanged from birth throughout life, so it offers high reliability and stability for identification, and fingerprint data is easy to store and compare.
A conventional fingerprint verification device has a glass surface (sensor) on top with a camera and light inside; when the user touches the glass surface and operates the device, the camera photographs the finger and acquires a fingerprint image.
Such a device also typically provides Live Finger Detection (LFD), which distinguishes a real human fingerprint from a fake made of silicone or similar material, and a function that extracts and stores the minutiae of the fingerprint ridges (an optimal-recognition-data generation function: image enhancement, feature extraction, and matching).
This type of device is called an optical (or contact) fingerprint reader; the image it captures is adjusted for sharpness and contrast to obtain a clean, flat fingerprint image.
From that image, the minutiae of each ridge (the position and orientation of ridge endings and bifurcations) are extracted and stored as recognition fingerprint data.
For compatibility between different products, the optimal recognition data is stored in the format specified by the international standard ISO 19794-2/ANSI 378 (hereinafter, the "international standard").
However, conventional optical readers have problems: users may find direct finger contact on the sensor unpleasant; the fingerprint can be distorted by the pressure applied to the glass surface (sensor) or by the finger slipping on its contact surface; and a clear fingerprint image may be unobtainable depending on ambient temperature and humidity or on the condition of the user's skin (e.g., how dry or moist it is).
To solve these problems, non-contact fingerprint readers have recently been developed that acquire the fingerprint image by photographing the finger at a short distance from the camera, without the user directly touching a glass surface (sensor).
Conventional non-contact fingerprint readers and recognition methods are disclosed in Korean Intellectual Property Office Registered Patent Publications Nos. 0604267, 1274260, and 1596298.
However, because some people attempt to authenticate with fake fingerprints for various purposes, conventional fingerprint recognition and authentication methods need a function that can detect them.
Such fakes include fingerprints printed on paper (printed fakes), fingerprints displayed on a screen (screen fakes), molded copies of another person's fingerprint worn as a silicone overlay (silicone fakes), and fingerprints of a non-consenting person captured covertly in a non-contact manner (covert-photo fakes).
To counter these fakes, Korean Patent Publication No. 2017-0112302 and Patents Nos. 874688 and 1179559 provide fake-fingerprint detection based on neural-network learning.
However, conventional fake-fingerprint detection technology has the following problems.
(1) Various anti-spoofing techniques exist for contact fingerprint devices, but none are provided for non-contact recognition methods that use the camera of a camera-equipped mobile device.
(2) They cannot handle special fakes that may be misrecognized during non-contact capture, or the covert acquisition of another person's fingerprints by photography.
(3) No method is provided for turning fake fingerprints into data and analyzing that data so that fakes can be identified quickly and accurately.
To solve the above problems, the present invention provides a non-contact fingerprint recognition method that acquires a fingerprint by photographing the finger with a camera-equipped mobile device, wherein a fingerprint recognition program is installed on the mobile device and a fake-fingerprint detection program is formed within it.
The detection program simultaneously employs a noise-pattern method, which detects fakes by recognizing noise patterns; a histogram method, which detects fakes from the RGB histogram; and a contour method, which detects fakes by recognizing the outline of the finger.
For each fake fingerprint that is found, the corresponding noise pattern, RGB histogram pattern, and contour error are stored as fake-fingerprint data, and this data is used for deep learning by an AI.
A non-contact fingerprint authentication method with an AI-based fake-fingerprint detection function, formed according to a preferred embodiment of the present invention, produces the following effects.
(1) Fake fingerprints can be detected even in a non-contact recognition method that uses camera-captured fingerprint images.
(2) Fakes such as covertly photographed fingerprints of another person can be identified from the noise pattern of the captured image.
(3) Storing and analyzing fake-fingerprint data with AI makes detection faster and more accurate.
Figure 1 is a photograph showing examples of fake fingerprints.
Figure 2 is a flowchart of the non-contact fingerprint authentication method with an AI-based fake-fingerprint detection function according to a preferred embodiment of the present invention.
Figure 3 is a photograph showing an example of the histogram method used in that authentication method.
Before describing specific embodiments of the present invention, note that the drawings in this specification may somewhat exaggerate or simplify the size or shape of components in order to explain the invention more clearly.
Because the terms and symbols used herein are defined or chosen by users, operators, and authors, they should be interpreted according to the technical idea of the invention in light of the specification as a whole, not limited to the literal meaning of the terms themselves.
The present invention is a non-contact fingerprint recognition method that acquires a fingerprint by photographing the finger with a camera-equipped mobile device. A fingerprint recognition program is installed on the mobile device and a fake-fingerprint detection program is formed within it.
The detection program simultaneously employs a noise-pattern method, which detects fakes by recognizing noise patterns; a histogram method, which detects fakes from the RGB histogram; and a contour method, which detects fakes by recognizing the outline of the finger.
For each fake fingerprint that is found, the corresponding noise pattern, RGB histogram pattern, and contour error are stored as fake-fingerprint data, and deep learning on this data with an AI allows fakes to be identified quickly and accurately.
The non-contact fingerprint recognition method of the present invention runs on a camera-equipped mobile device: a fingerprint recognition program is installed on the device, and a fake-fingerprint detection program is formed within it.
The detection program applies the noise-pattern, histogram, and contour methods simultaneously.
The noise-pattern method exploits the noise that arises when the camera photographs a finger: a finger image captured covertly from a distance contains more noise and is therefore of poorer quality.
That is, a genuine fingerprint is photographed sharply under strong illumination, whereas a covert capture from a distance produces a noise pattern and degraded image quality; when such a noise pattern appears, the fingerprint can be judged a fake.
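Purely as an illustration (the patent specifies no code), the noise-pattern cue above can be sketched as a local-residual estimate over a grayscale image; the function names and the threshold value are assumptions, not values from the patent:

```python
# Hypothetical sketch of the noise-pattern check. A grayscale image is
# a list of rows of 0-255 ints. Noise is estimated as the mean absolute
# difference between each pixel and the average of its four neighbours:
# a sharp, well-lit close-up yields a low residual, while a distant,
# low-quality capture yields a high one.

def estimate_noise(img):
    h, w = len(img), len(img[0])
    total, count = 0.0, 0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neigh = (img[y - 1][x] + img[y + 1][x]
                     + img[y][x - 1] + img[y][x + 1]) / 4
            total += abs(img[y][x] - neigh)
            count += 1
    return total / count

def looks_fake_by_noise(img, threshold=12.0):  # threshold is illustrative
    return estimate_noise(img) > threshold
```

A smooth, evenly lit capture scores near zero, while a speckled capture scores far above any reasonable threshold, which is the quality gap the method relies on.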
The histogram method is used alongside the noise-pattern method. When a genuine fingerprint is photographed with a camera-equipped mobile device, it is captured very sharply under strong illumination and in color, so an RGB histogram can be examined; the difference between the RGB histograms of genuine and fake fingerprints distinguishes the two.
That is, when a genuine fingerprint is photographed its RGB histogram shows clearly, whereas for a fake that is printed on paper or displayed on a screen the histogram is indistinct, with channels overlapping and colors poorly represented.
Figure 3 compares the RGB histograms of a genuine fingerprint (left) and a fake fingerprint (right, a photograph of a printed copy).
In Figure 3, the histogram on the left has sharp peaks and clearly separated colors, while the histogram on the right has smeared peaks and does not show all the colors.
In the histogram method, selecting a region of only two or three ridges and analyzing the RGB histogram of those ridges is sufficient for the determination.
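A hypothetical sketch of this histogram cue (the function names, bin count, and cutoff are assumptions for illustration, not values from the patent): each RGB channel of the sampled ridge region is binned, and a channel counts as "peaked" when a large share of its pixels falls in the tallest bin.

```python
# Pixels are (r, g, b) tuples with 0-255 components. A genuine capture
# is assumed to show a sharp dominant peak in each channel, while a
# printed/screen copy shows flatter, overlapping distributions.

from collections import Counter

def channel_peakedness(pixels, channel, bins=16):
    counts = Counter(p[channel] * bins // 256 for p in pixels)
    return max(counts.values()) / len(pixels)

def looks_fake_by_histogram(pixels, cutoff=0.25):
    # Flag as fake only if every channel's histogram is flat
    # (no channel has a dominant peak).
    return all(channel_peakedness(pixels, c) < cutoff for c in range(3))
```

A uniform skin-toned ridge sample concentrates each channel in one bin, whereas a washed-out reproduction spreads values across many bins, which is what the cutoff tests.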
Also, when a genuine fingerprint is photographed, the ridges reflect more light than the valleys between them, and because of the finger's curvature the center of the fingerprint region reflects more light than its periphery, so a difference in illumination can be confirmed. A fake fingerprint shows no such difference, which allows it to be identified.
The noise-pattern and histogram methods alone are sufficient to detect fake fingerprints, but the contour method can also be employed for greater accuracy.
The contour method checks the photographed fingerprint image for shadowed regions that deviate from, or appear in addition to, the ridges and finger outline expected of a genuine fingerprint, and determines from those regions whether the fingerprint is fake.
When the detection program identifies a fake fingerprint in this way, the degree of noise, the shape of the RGB histogram distribution, and the anomalous contour errors are stored as fake-fingerprint data.
This data is classified as printed fakes, screen fakes, silicone fakes, or covert-photo fakes.
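As a minimal sketch of how such classified records might be laid out, one could keep the three cues alongside the assigned category; the field and category names below are assumptions for illustration, not identifiers from the patent:

```python
# Hypothetical record layout for stored fake-fingerprint data.

from dataclasses import dataclass
from enum import Enum

class FakeType(Enum):
    PRINTED = "printed"        # printed on paper
    SCREEN = "screen"          # replayed on a display
    SILICONE = "silicone"      # molded silicone overlay
    COVERT_PHOTO = "covert"    # covertly photographed finger

@dataclass
class FakeFingerprintRecord:
    noise_level: float          # noise-pattern cue
    histogram_peakedness: float # RGB-histogram cue
    contour_error: float        # outline/shadow cue
    fake_type: FakeType
```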
Deep learning on the stored fake-fingerprint data with an AI then allows fakes to be identified quickly and accurately.
The AI's artificial neural network for analyzing the fake-fingerprint data comprises: a state-value transmission step, in which the fingerprint state value (s) is sent to a deep reinforcement learning network, a discriminant prediction network, and a discriminant network;
an additional-discriminant-value transmission step, in which the discriminant prediction network, composed of a genuine-fingerprint prediction network and a fake-fingerprint prediction network, learns from the fingerprint state value (s) and then determines and transmits an additional discriminant value (γ);
an optimal-discriminant-value calculation step, in which the deep reinforcement learning network, composed of an Actor network and a Critic network, operates as follows: the Critic network computes a value (Q) through a state-value function from the transmitted state value (s) and the expected discriminant value (a), together with the additional discriminant value (γ), and passes it to the Actor network, which uses the value (Q) and the state value (s) to compute the optimal discriminant value (A); and
a fingerprint-discriminant-value calculation step, in which the discriminant network stores the discriminant value (a) as training data, computes the optimal fingerprint discriminant value (K) using machine learning, and transmits it to the detection means of the fake-fingerprint detection program.
The neural network uses a supervised learning algorithm and operates with its base algorithm optimized on a training set.
The supervised learning algorithm uses the fingerprint state values of many kinds of fake-fingerprint data so that the base algorithm is optimized to find the optimal fingerprint discriminant value (K).
As an example of the discriminant function used in the discriminant prediction network, a function can be used that, given the state value (s) and the discriminant value (a) at a specific step (t), learns and outputs the discriminant value (γ) as an expected value.
The discriminant prediction network consists of a genuine-fingerprint prediction network and a fake-fingerprint prediction network; after both learn from the fingerprint state value (s), the average of the evaluated discriminant values is taken as the final additional discriminant value (γ) and transmitted to the deep reinforcement learning network.
The deep reinforcement learning network consists of an Actor network and a Critic network; the Critic network uses the state value (s) and the discriminant value (γ) to learn and output the value (Q).
As an example of the equation used for the value (Q), the Bellman expectation equation can be used.
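For reference, the Bellman expectation equation in its conventional action-value form is written as below; note that in this standard notation γ denotes the reinforcement-learning discount factor, whereas the patent text separately reuses the symbol γ for the additional discrimination value.

```latex
Q^{\pi}(s_t, a_t) = \mathbb{E}_{\pi}\!\left[\, r_{t+1} + \gamma \, Q^{\pi}(s_{t+1}, a_{t+1}) \,\right]
```

Here $Q^{\pi}$ is the action-value function under policy $\pi$, $r_{t+1}$ the reward, and $(s_{t+1}, a_{t+1})$ the next state-action pair.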
The Actor network uses the state value (s) and the value (Q) to learn and produce the optimal discrimination value (A).
The optimal discrimination value (A) generated by the Actor network is transmitted back to the Critic network and stored as the new previous state value.
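The Critic/Actor data flow described above can be sketched as a toy loop. This is only an illustration under assumed details (linear "networks", scalar state, a made-up regression target, and a placeholder learning rate), not the patent's actual training procedure.

```python
import numpy as np

# Toy linear stand-ins: the Critic maps [s, gamma] -> Q,
# the Actor maps [s, Q] -> A.
w_critic = np.zeros(2)
w_actor = np.zeros(2)

def critic_q(s: float, gamma: float) -> float:
    """Critic: learn/produce the value Q from state value s and gamma."""
    return float(w_critic @ np.array([s, gamma]))

def actor_a(s: float, q: float) -> float:
    """Actor: produce the optimal discrimination value A from s and Q."""
    return float(w_actor @ np.array([s, q]))

# One illustrative Critic update toward a made-up target (TD-style regression)
s, gamma, target, lr = 0.4, 0.7, 1.0, 0.1
q = critic_q(s, gamma)
w_critic += lr * (target - q) * np.array([s, gamma])

# Re-evaluate Q with the updated Critic, then let the Actor produce A;
# per the text, A would be fed back to the Critic as the new previous state.
q = critic_q(s, gamma)
A = actor_a(s, q)
```

The sketch only shows the direction of data movement (s and γ into the Critic, Q and s into the Actor, A fed back), which is the structural claim the passage makes.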
By updating the discrimination value through AI-based deep learning as described above, fake fingerprints predicted from captured fingerprint images can be identified accurately, and this learning allows subsequent fake fingerprints to be identified quickly and accurately.
According to the non-contact fingerprint authentication method with an AI-based fake-fingerprint discrimination function formed as a preferred embodiment of the present invention, fake fingerprints can be detected even in a non-contact fingerprint recognition method that uses fingerprint images captured by a camera; fake fingerprints such as surreptitiously photographed fingerprints of another person can be identified from the noise pattern of the captured fingerprint image; and fake-fingerprint data can be stored and analyzed with AI so that fake fingerprints are identified more quickly and accurately.
Although the present invention has been described with reference to the attached drawings and with a focus on preferred embodiments, it will be apparent to those skilled in the art that various modifications are possible without departing from the scope of the invention as defined by the claims below.
Claims (2)
1. A non-contact fingerprint authentication method with an AI-based fake-fingerprint discrimination function, in which, in a general non-contact fingerprint recognition method, a fingerprint is acquired by photographing a subject's finger with a camera-equipped mobile device and learned with an artificial intelligence neural network, the method characterized by comprising:
a state value transmission step in which a fingerprint state value (s) is transmitted to a deep reinforcement learning network, a discriminant prediction network, and a discriminant network to analyze fake-fingerprint data;
an additional discrimination value transmission step in which the discriminant prediction network, consisting of a normal-fingerprint prediction network and a fake-fingerprint prediction network, learns on the fingerprint state value (s) and then determines and transmits an additional discrimination value (γ);
an optimal discrimination value calculation step in which the deep reinforcement learning network, consisting of an Actor network and a Critic network, has the Critic network compute a value (Q) from the transmitted fingerprint state value (s), the expected discrimination value (a), and the additional discrimination value (γ) through a state-value function and transmit it to the Actor network, and has the Actor network compute an optimal discrimination value (A) using the value (Q) and the state value (s); and
a fingerprint discrimination value calculation step in which the discriminant network stores the discrimination value (a) as training data, computes the optimal fingerprint discrimination value (K) using a machine-learning technique, and transmits K to the discrimination means of the fake-fingerprint discrimination program.
2. The non-contact fingerprint authentication method with an AI-based fake-fingerprint discrimination function of claim 1, wherein the fake-fingerprint data is classified into printed fake fingerprints, screen fake fingerprints, silicone fake fingerprints, and surreptitiously photographed fake fingerprints.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020220102361A KR102840845B1 (en) | 2021-12-02 | 2022-08-16 | Non-contact type fingerprint certification emthod including distinguable counterfeit fingerfrient using ai |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020210170995 | 2021-12-02 | ||
KR1020220102361A KR102840845B1 (en) | 2021-12-02 | 2022-08-16 | Non-contact type fingerprint certification emthod including distinguable counterfeit fingerfrient using ai |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020210170995 Division | 2021-12-02 | 2021-12-02 |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20230083208A KR20230083208A (en) | 2023-06-09 |
KR102840845B1 true KR102840845B1 (en) | 2025-08-01 |
Family
ID=86612527
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020220102361A Active KR102840845B1 (en) | 2021-12-02 | 2022-08-16 | Non-contact type fingerprint certification emthod including distinguable counterfeit fingerfrient using ai |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR102840845B1 (en) |
WO (1) | WO2023101200A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20250059894A (en) | 2023-10-25 | 2025-05-07 | 주식회사 슈프리마 | device and method for identifying fake fingerprints, and method of learning neural network for identifying fake fingerprints |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210209327A1 (en) | 2020-01-02 | 2021-07-08 | Egis Technology Inc. | Touch display device with fingerprint anti-spoofing function and associated fingerprint anti-spoofing method |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018106987A1 (en) * | 2016-12-08 | 2018-06-14 | Veridium Ip Limited | Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices |
KR20190097706A (en) * | 2018-02-13 | 2019-08-21 | 위드로봇 주식회사 | Apparatus and method for contactless fingerprint recognition |
KR20200051903A (en) * | 2018-11-05 | 2020-05-14 | 주식회사 비젼인 | Fake fingerprint detection method and system |
KR102756879B1 (en) * | 2019-10-31 | 2025-01-17 | 엘지전자 주식회사 | Anti-spoofing method and apparatus for biometric recognition |
WO2021226709A1 (en) * | 2020-05-11 | 2021-11-18 | Fluent.Ai Inc. | Neural architecture search with imitation learning |
2022
- 2022-08-16 KR KR1020220102361A patent/KR102840845B1/en active Active
- 2022-10-13 WO PCT/KR2022/015488 patent/WO2023101200A1/en not_active Ceased
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210209327A1 (en) | 2020-01-02 | 2021-07-08 | Egis Technology Inc. | Touch display device with fingerprint anti-spoofing function and associated fingerprint anti-spoofing method |
Also Published As
Publication number | Publication date |
---|---|
WO2023101200A1 (en) | 2023-06-08 |
KR20230083208A (en) | 2023-06-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12288414B2 (en) | Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices | |
US12223760B2 (en) | Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices | |
US10339362B2 (en) | Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices | |
CN110326001B (en) | Systems and methods for performing fingerprint-based user authentication using images captured with a mobile device | |
KR102840845B1 (en) | Non-contact type fingerprint certification emthod including distinguable counterfeit fingerfrient using ai | |
CN112232152B (en) | Non-contact fingerprint identification method and device, terminal and storage medium | |
HK40069201A (en) | Methods and systems for performing fingerprint identification | |
HK40010111A (en) | Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices | |
HK40010111B (en) | Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PA0107 | Divisional application | St.27 status event codes: A-0-1-A10-A18-div-PA0107; A-0-1-A10-A16-div-PA0107 |
PA0201 | Request for examination | St.27 status event code: A-1-2-D10-D11-exm-PA0201 |
P22-X000 | Classification modified | St.27 status event code: A-2-2-P10-P22-nap-X000 |
PG1501 | Laying open of application | St.27 status event code: A-1-1-Q10-Q12-nap-PG1501 |
R18-X000 | Changes to party contact information recorded | St.27 status event code: A-3-3-R10-R18-oth-X000 |
P22-X000 | Classification modified | St.27 status event code: A-2-2-P10-P22-nap-X000 |
R18-X000 | Changes to party contact information recorded | St.27 status event code: A-3-3-R10-R18-oth-X000 |
E701 | Decision to grant or registration of patent right | |
PE0701 | Decision of registration | St.27 status event code: A-1-2-D10-D22-exm-PE0701 |
PR0701 | Registration of establishment | St.27 status event code: A-2-4-F10-F11-exm-PR0701 |
PR1002 | Payment of registration fee | St.27 status event code: A-2-2-U10-U11-oth-PR1002; fee payment year number: 1 |
PG1601 | Publication of registration | St.27 status event code: A-4-4-Q10-Q13-nap-PG1601 |