TW201232428A - Palm biometric identification method - Google Patents

Palm biometric identification method

Info

Publication number
TW201232428A
Authority
TW
Taiwan
Prior art keywords
palm
point
processor
points
tested
Prior art date
Application number
TW100103407A
Other languages
Chinese (zh)
Other versions
TWI531985B (en)
Inventor
Zhen-Sen Ouyang
Jing-Tai Jiang
Rong-Qing Wu
Zhong-Yan Cai
Original Assignee
Univ Ishou
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Univ Ishou filed Critical Univ Ishou
Priority to TW100103407A priority Critical patent/TWI531985B/en
Publication of TW201232428A publication Critical patent/TW201232428A/en
Application granted granted Critical
Publication of TWI531985B publication Critical patent/TWI531985B/en

Landscapes

  • Collating Specific Patterns (AREA)
  • Image Analysis (AREA)

Abstract

Disclosed is a palm biometric identification method in which a processor receives a palm image to be tested corresponding to a user's palm. The method includes the steps of: configuring the processor to process the palm image to be tested; configuring the processor to detect the rotation angle of the palm to be tested; configuring the processor to search for all features of the palm to be tested; configuring the processor to obtain eigenvectors of the palm to be tested from those features; and configuring the processor to obtain and store, from those eigenvectors, a biological representative prototype corresponding to the user. The biological representative prototype is then used to identify whether a user is a legitimate login user.
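The verification decision sketched in the abstract, comparing an unknown user's palm feature vectors against each stored biological representative prototype, might look like the following. This is a minimal illustration only: the dictionary database layout, the Euclidean nearest-prototype comparison, and the acceptance threshold are assumptions, not the patent's actual matching rule (the patent's prototypes are produced by LVQ2 training, which is omitted here).

```python
import numpy as np

def verify(unknown_vectors, database, threshold=0.5):
    """database maps user_id -> list of prototype feature vectors.
    Accept the login when, for some registered user, every submitted
    vector lies within `threshold` (Euclidean distance) of one of that
    user's prototypes; otherwise reject (return None)."""
    best_user, best_dist = None, float("inf")
    for user_id, prototypes in database.items():
        # Worst-case distance over the submitted vectors, each matched
        # to its nearest prototype of this user.
        d = max(min(np.linalg.norm(u - p) for p in prototypes)
                for u in unknown_vectors)
        if d < best_dist:
            best_user, best_dist = user_id, d
    return best_user if best_dist <= threshold else None
```

A user whose vectors sit close to a registered prototype is accepted under that prototype's identity; anyone else is refused, matching the accept/reject behaviour the abstract describes.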

Description

201232428

VI. Description of the Invention:

[Technical Field]

The present invention relates to a palm biometric identification method, and particularly to a palm biometric identification method that can effectively recognize a palm image to be tested that is input in any direction and at any rotation angle.

[Prior Art]

Because of advantages such as low cost and a very high recognition rate, palm recognition has gradually become one of the mainstream technologies for biometric identity verification.

Ribaric, S. et al. proposed "A biometric identification system based on eigenpalm and eigenfinger features" in IEEE Transactions on Pattern Analysis and Machine Intelligence (2009). That prior art obtains feature values from a number of sub-images of a palm and its fingers and performs biometric identification with those values; however, it places specific requirements on the detected angle of the palm to be tested and on variations in background brightness, so it tends to produce misjudgments in practical use.

Wu, X., Zhang et al. proposed "Palm line extraction and matching for personal authentication" in IEEE Transactions on Systems, Man, and Cybernetics (2006). That prior art uses a chaining-coding technique to obtain a number of palm lines for biometric identification; however, the chaining-coding algorithm is so sensitive that appropriate palm lines are often difficult to obtain.

In addition, Li, F., Leung et al. proposed "Hierarchical identification of palmprint using line-based Hough transform" in the Proceedings of the 18th Conference on Pattern Recognition (2006), and Wu, J., Qiu et al. proposed "A hierarchical palmprint identification method using hand geometry and grayscale distribution features" in the same proceedings. Both perform biometric identification with palm geometry and its gray-scale distribution; however, for a palm image that is not placed vertically (that is, a palm image with a rotation angle), neither can correct the rotation angle to obtain the correct relative positions on the palm.

Ouyang, C.-S. et al. therefore proposed "An adaptive biometric system based on palm texture feature and LVQ neural network" in the Proceedings of the 21st International Conference on Industrial, Engineering & Other Applications of Applied Intelligent Systems (2006). It converts each sub-image of the palm to be tested into a corresponding feature vector according to the texture descriptor of a local fuzzy pattern (LFP), then uses those feature vectors to detect the rotation angle and position of the palm. Although, compared with the prior art described above, this overcomes the need for a specific angle and background brightness, the excessive sensitivity of the algorithm, rotated input, and so on, it may still misjudge when the rotation angle of the scanned palm exceeds 50 degrees, so a legitimate user may be unable to log in because the palm recognition system cannot interpret the palm image to be tested.

Furthermore, Ouyang, C.-S. et al. proposed "An improved neural-network based system with rotation detection mechanism" at an international conference on machine learning (2010). Although it accepts a palm image to be tested that is scanned at an arbitrary rotation angle, it lowers the resolution of that image to reduce the complexity of the recognition algorithm and normalizes the resulting sub-images into histograms, so information such as the length and width of every finger sub-image is lost. Hence, although that prior art can increase the accuracy of locating the palm to be tested, users with similar palm features may raise the misjudgment rate of the recognition system because part of the sub-image information is missing, and the system may judge a legitimate user to be illegitimate, or an illegitimate user to be legitimate.
Therefore, how to find a method that both raises the recognition rate for the palm image to be tested and effectively reduces the time spent on recognition, so that the accuracy and the efficiency of a palm recognition system can both be improved, remains a goal that those concerned continue to research.

[Summary of the Invention]

The object of the present invention is therefore to provide a palm biometric identification method, suitable for a processor that receives a palm image to be tested corresponding to a user's palm, comprising the following steps:

configuring the processor to process the palm image to be tested, so as to obtain a set of boundary pixels;

configuring the processor to detect the rotation angle of the palm to be tested;

configuring the processor to search, in order and according to the direction of the palm to be tested, how the y-value of each boundary pixel coordinate changes relative to the preceding and following boundary pixels, so as to obtain a number of feature points, the feature points including a number of valley points, fingertip points, and target points;

configuring the processor to obtain a number of sub-images from the feature points, which includes the following sub-steps:

configuring the processor to compute, for each fingertip point, the corresponding midpoint from the fingertip point and its two corresponding valley points, or from a corresponding target point and a valley point; to obtain five interval points on the line connecting each midpoint to the corresponding fingertip point; to obtain, for each interval point, two corresponding crossing points in the set of boundary pixels; and to obtain a corresponding first sub-image from those crossing points together with the fingertip point, the midpoint, and the valley points, or the valley point and the target point;

configuring the processor to form a first set line segment from two target points, to form a square with the first set line segment as one side so as to obtain a specific point, and then to obtain a second sub-image from the specific point, the target points, and the corresponding valley points;

configuring the processor to normalize the brightness of the sub-images; and

configuring the processor to convert each sub-image into a corresponding feature vector; and

configuring the processor to obtain and store, from the feature vectors, the biological representative model corresponding to the user.

[Embodiments]

The foregoing and other technical contents, features, and effects of the present invention will be clearly presented in the following detailed description of a preferred embodiment with reference to the drawings.

In the related art of "An adaptive biometric system based on palm texture feature and LVQ neural network" (Ouyang, C.-S. et al., 2006), a palm biometric identification method comprises the following procedures: a registration procedure and an identification-and-verification procedure.

The registration procedure includes receiving a set of palm feature vectors from a legitimate user and, after a learning vector quantization (LVQ2) computation, obtaining the representative prototype corresponding to that legitimate user; the representative prototypes are stored in a registration user database, each prototype having six feature vectors.

The verification procedure includes receiving a set of palm feature vectors from an unknown user and comparing them with the six feature vectors of every representative prototype stored in the legitimate-user database. If there exists a prototype whose six feature vectors match the unknown user's palm feature vectors, the unknown user is regarded as a legitimate user and the login request is accepted; otherwise, the unknown user is regarded as illegitimate and the login request is refused.

The present invention therefore focuses on how to effectively obtain a legitimate user's palm feature vectors, convert them into the corresponding representative prototype, and store it; or how to obtain an unknown user's palm feature vectors, compare them with the six feature vectors of each representative prototype in the registered-user database, and decide whether the unknown user is a legitimate user.

Referring to Fig. 1, the preferred embodiment of the palm biometric identification method of the present invention is suitable for a processor that receives a palm image to be tested corresponding to a user's palm, and comprises the following steps:

Step 91: configure the processor to process the palm image to be tested.
Step 92: configure the processor to detect the rotation angle of the palm to be tested.

Step 93: configure the processor to search for the feature points of the palm to be tested.

Step 94: configure the processor to obtain the feature vectors of the palm to be tested from those feature points.

Steps 91 to 94 are described in turn below.

Referring to Fig. 2, step 91 processes the palm image to be tested and has the following sub-steps:

Sub-step 911: configure the processor to set a threshold of gray level T and, according to that threshold, convert the palm image to be tested into a binary image, T being set as in equation (1):

T = (1 / (M · N)) · Σ_{x=0}^{M−1} Σ_{y=0}^{N−1} f(x, y) ............ (1)

where M and N mean that the resolution of the palm image to be tested is M × N, and f(x, y) is the pixel value at coordinates (x, y) of that image. The conversion proceeds as follows: when the processor judges that the pixel value f(x, y) at coordinates (x, y) is greater than the threshold T, the processor sets that pixel value to 255; conversely, when the pixel value f(x, y) is less than the threshold T, the processor sets that pixel value to 0.

Sub-step 912: configure the processor to eliminate the noise points in the binary image with a median filter.

Sub-step 913: configure the processor to complete edge detection in the binary image with a Laplacian filter, thereby obtaining a set of boundary pixels, every boundary pixel in the set being a pixel of the palm contour in the palm image to be tested.

Sub-step 914: configure the processor to label the set of boundary pixels obtained in sub-step 913 in sequence, following a fixed order over the image coordinates from one corner of the image to the other.

Referring to Figs. 3 and 4, step 92 detects the rotation angle of the palm and has the following sub-steps:

Sub-step 921: configure the processor to set a number of horizontal line segments crossing the palm image to be tested and to judge that, when a coordinate value on a horizontal segment equals the coordinate value of one of the boundary pixels, the segment intersects the palm contour of the palm image at that point.

Sub-step 922: configure the processor to select, as a first reference line segment, a horizontal segment having eight intersection points P1 to P8 with the palm contour.

Sub-step 923: configure the processor to take the midpoint PM1 of the intersection points P2 and P3 and, according to PM1, select a corresponding reference point Pref from the set of boundary pixels.

Sub-step 924: configure the processor to judge the direction of the palm to be tested from the coordinates of the reference point Pref and the intersection point P2: when the y-coordinate of Pref is greater than the y-coordinate of P2, the fingertips of the palm point in the −y direction; conversely, when the y-coordinate of Pref is smaller, the fingertips point in the +y direction.

Sub-step 925: configure the processor to set a first mark point and a second mark point from the intersection points. When the fingertips point in the +y direction, the first mark point is set by taking the midpoint PM2 of P3 and P4, computing the distance to PM2 of every boundary pixel whose x-coordinate lies between P3 and P4 and whose y-coordinate is greater than that of PM2, and taking the pixel with the greatest distance; if the fingertips point in the −y direction, the pixel with the greatest distance to PM2 is instead taken among the boundary pixels whose x-coordinate lies between P3 and P4 and whose y-coordinate is smaller than that of PM2. In the same way, the processor sets a second mark point from the intersection points P5 and P6 and their midpoint PM3.

Sub-step 926: configure the processor to take the slope between the first mark point and the midpoint PM2 as a first slope, the slope between the second mark point and the midpoint PM3 as a second slope, and the average of the first and second slopes as an average slope.

Sub-step 927: configure the processor to obtain the corresponding rotation angle from the average slope; for example, when the average slope is 1, the corresponding angle is 45 degrees, and so on.

Referring to Figs. 5 and 6, step 93 searches for the palm feature points and has the following sub-steps:

Sub-step 931: configure the processor to examine, in order and according to the direction of the palm to be tested, how the y-value of each boundary pixel compares with the y-values of the preceding and following boundary pixels. When the fingertips point in the +y direction, a target boundary pixel s_m whose y-value is smaller than the y-values of both the preceding boundary pixel s_{m−1} and the following boundary pixel s_{m+1} is defined as a valley point, while a target boundary pixel whose y-value is greater than both is defined as a fingertip point; conversely, when the fingertips point in the −y direction, a pixel whose y-value is smaller than both neighbours is defined as a fingertip point and a pixel whose y-value is greater than both is defined as a valley point. In the end, five fingertip points T1 to T5 and four valley points B1 to B4 are obtained; among them, the second fingertip point T2 is the first mark point and the third fingertip point T3 is the second mark point.
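Parts of the processing above can be sketched in numpy as follows: the mean-gray-level binarization of equation (1), a 4-neighbour Laplacian that keeps the contour pixels as in sub-step 913, and the neighbour comparison of sub-step 931 that classifies boundary pixels into fingertip and valley candidates. The function names, the hard-coded +y orientation, and the absence of tie handling are simplifying assumptions for illustration, not the patent's implementation.

```python
import numpy as np

def binarize(palm: np.ndarray) -> np.ndarray:
    # Equation (1): T = (1/(M*N)) * sum over x, y of f(x, y);
    # pixels above T become 255, the rest 0.
    T = palm.mean()
    return np.where(palm > T, 255, 0).astype(np.uint8)

def laplacian_boundary(binary: np.ndarray) -> np.ndarray:
    # Non-zero 4-neighbour Laplacian responses mark contour pixels;
    # returns their (row, col) coordinates.
    f = binary.astype(np.int64)
    lap = (-4 * f[1:-1, 1:-1] + f[:-2, 1:-1] + f[2:, 1:-1]
           + f[1:-1, :-2] + f[1:-1, 2:])
    return np.argwhere(lap != 0) + 1  # +1 offsets the trimmed border

def find_extrema(ys):
    # Fingers toward +y: y above both neighbours -> fingertip candidate,
    # y below both neighbours -> valley candidate.
    tips = [i for i in range(1, len(ys) - 1) if ys[i-1] < ys[i] > ys[i+1]]
    valleys = [i for i in range(1, len(ys) - 1) if ys[i-1] > ys[i] < ys[i+1]]
    return tips, valleys
```

On a real boundary, the extremum scan would run over the ordered contour labelled in sub-step 914, and the five largest fingertip candidates would give T1 to T5.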
Sub-step 932: configure the processor to find, among the set of boundary pixels, the boundary pixel equidistant from the first fingertip point T1 and the first valley point B1, and define it as a first target point K1; similarly, to find the boundary pixel equidistant from the fourth fingertip point T4 and the third valley point B3 and define it as a second target point K2, and the boundary pixel equidistant from the fifth fingertip point T5 and the fourth valley point B4 and define it as a third target point K3.

After this step is completed, therefore, twelve palm feature points are available: the five fingertip points T1 to T5, the four valley points B1 to B4, and the target points K1 to K3.

Referring to Fig. 7, step 94 obtains the feature vectors of the palm to be tested and has the following sub-steps:

Sub-step 941: configure the processor to obtain the sub-images, which comprise five finger sub-images and one palm sub-image. The processor obtains a first finger sub-image from the first target point K1, the first valley point B1, and the first fingertip point T1; a second finger sub-image from the first and second valley points B1 and B2 and the second fingertip point T2; a third finger sub-image from the second and third valley points B2 and B3 and the third fingertip point T3; a fourth finger sub-image from the second target point K2, the third valley point B3, and the fourth fingertip point T4; and a fifth finger sub-image from the third target point K3, the fourth valley point B4, and the fifth fingertip point T5.

Referring to Fig. 8, the first finger sub-image is set as follows. The processor computes the midpoint m1 of the first target point K1 and the first valley point B1 and, on the line segment whose endpoints are m1 and the first fingertip point T1, sets interval points m2, m3, m4, m5, and m6 at one-sixth, two-sixths, three-sixths, four-sixths, and five-sixths of the segment length. From each of the interval points m2 to m6 it then extends a line crossing the contour of the first finger at a pair of crossing points: (F1, F2), (F3, F4), (F5, F6), (F7, F8), and (F9, F10). Following, in order, the first target point K1, the crossing points F1, F3, F5, F7, F9, the first fingertip point T1, the crossing points F10, F8, F6, F4, F2, the first valley point B1, and the midpoint m1 yields a closed first finger sub-image. The remaining finger sub-images are obtained in the same way as the first finger sub-image and are not described again here.

Referring to Fig. 9, the palm sub-image is set as follows. The processor obtains a first set line segment from the first target point K1 and the second target point K2, then forms a square with the first set line segment as one side, so as to obtain a specific point S1 connected to the first target point. Finally, the processor follows, in order, the specific point S1, the first target point K1, the first valley point B1, the second valley point B2, the third valley point B3, the second target point K2, and the third target point K3 to obtain a closed palm sub-image.

Referring to Fig. 10, the corresponding five finger sub-images and one palm sub-image are thus obtained.

Returning to Fig. 7, sub-step 942: configure the processor to normalize the brightness of the sub-images by histogram equalization, so that the brightness of each sub-image is more uniform.

Sub-step 943: configure the processor to convert each sub-image of the palm to be tested into a corresponding feature vector according to the texture descriptor of a local fuzzy pattern (LFP). The conversion is as follows. Suppose a pixel of a sub-image at (x, y) has brightness f(x, y), and the brightness values of the P reference pixels in its neighbourhood form a set. First, the following vector is computed:
D(x, y) = (μ_1(1), μ_1(0), μ_2(1), μ_2(0), …, μ_P(1), μ_P(0)) ............ (2)

where μ_i(1) = 1 / (1 + exp{−α(f(x_i, y_i) − f(x, y))}) and μ_i(0) = 1 − μ_i(1) are fuzzy membership functions expressing the degree to which the brightness difference between a reference pixel P(x_i, y_i) in the neighbourhood and the pixel P(x, y) is coded as 1 and as 0, respectively.

Next, a vector H(x, y) is computed over the binary digits of the P neighbours; there are 2^P possible binary combinations in total, and each entry of H(x, y) expresses the degree to which the neighbourhood is coded as one of those combinations. The vector is then normalized. Finally, the vectors of all pixels in the sub-image are summed and normalized, which yields the feature vector corresponding to that sub-image.

In summary, the present invention can effectively recognize a palm image to be tested that is input at any position and any angle, and, because the sub-images are set precisely, the corresponding feature vectors are more accurate, lowering the misjudgment rate of a palm recognition system; the object of the present invention is therefore indeed achieved.

The foregoing is merely a preferred embodiment of the present invention and cannot limit the scope of its implementation; all simple equivalent changes and modifications made according to the claims and the description of the invention remain within the scope covered by this patent.

[Brief Description of the Drawings]

Fig. 1 is a flow chart of the preferred embodiment of the present invention;
Fig. 2 is a flow chart of the sub-steps of processing the palm image to be tested;
Fig. 3 is a flow chart of the sub-steps of detecting the rotation angle of the palm;
Fig. 4 is a schematic diagram of an example of detecting the rotation angle of the palm;
Fig. 5 is a flow chart of the sub-steps of searching for the palm feature points;
Fig. 6 is a schematic diagram of the feature points of the palm to be tested;
Fig. 7 is a flow chart of the sub-steps of obtaining the feature vectors of the palm to be tested;
Fig. 8 is a schematic diagram of setting a finger sub-image;
Fig. 9 is a schematic diagram of setting a palm sub-image; and
Fig. 10 is a schematic diagram of the sub-images obtained by the preferred embodiment.
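The fuzzy memberships of equation (2) can be sketched as below. The parameter `alpha` and the neighbour ordering are free choices here, and this illustrates only the per-pixel membership vector, not the full accumulation over the 2^P binary combinations.

```python
import math

def lfp_memberships(center: float, neighbours, alpha: float = 1.0):
    # mu_i(1) = 1 / (1 + exp(-alpha * (f(x_i, y_i) - f(x, y))));
    # mu_i(0) = 1 - mu_i(1): the degree to which each neighbour is
    # coded brighter (1) or darker (0) than the centre pixel.
    mu1 = [1.0 / (1.0 + math.exp(-alpha * (n - center))) for n in neighbours]
    mu0 = [1.0 - m for m in mu1]
    return mu1, mu0
```

A neighbour equal to the centre gives μ(1) = μ(0) = 0.5, and a larger `alpha` sharpens the comparison toward a hard binary code, which is the fuzzy counterpart of an ordinary local binary pattern.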

16 201232428 [Description of Main Reference Numerals]
9 — palm biometric identification method
91–94 — steps
911–914 — sub-steps
921–927 — sub-steps
931–932 — sub-steps
941–943 — sub-steps


Claims (1)

201232428 VII. Scope of the patent application:

1. A palm biometric identification method, suitable for a processor that receives a palm image to be tested corresponding to a user's palm, the method comprising the steps of: configuring the processor to process the palm image to be tested so as to obtain a set of boundary pixels; configuring the processor to detect a rotation angle of the palm to be tested; configuring the processor to search the set of boundary pixels sequentially, according to the direction of the palm to be tested, for a plurality of feature points based on how the y value of each boundary pixel's coordinates varies relative to the y values of the preceding and following boundary pixels, the feature points including a plurality of finger-valley points, fingertip points, and target points; configuring the processor to obtain a plurality of sub-images from the feature points, including the sub-steps of: configuring the processor to compute, for each fingertip point, the midpoint corresponding to its two finger-valley points, or to a target point and a finger-valley point, and then to obtain five interval points along the line connecting the midpoint and the fingertip point, each interval point corresponding to two crossover points in the set of boundary pixels, whereby the crossover points, together with the fingertip point, the midpoint, and the finger-valley points, or the finger-valley point and the target point, delimit a corresponding first sub-image; configuring the processor to form a first set line segment from two target points, to form a square with the first set line segment as one side so as to obtain a specific point, and to obtain a second sub-image from the specific point, the target points, and the corresponding finger-valley points; configuring the processor to normalize the brightness of the sub-images; and configuring the processor to convert each sub-image into a corresponding feature vector; and configuring the processor to obtain and store, from the feature vectors, a biometric representative model corresponding to the user.

2. The palm biometric identification method of claim 1, wherein the step of detecting the rotation angle of the palm to be tested includes the sub-steps of: configuring the processor to select a corresponding reference point from the set of boundary pixels according to the center point of two crossing points on a first reference line segment; configuring the processor to judge the direction of the palm to be tested from the coordinates of the reference point and the coordinates of the crossing points; and configuring the processor to set a first marker point and a second marker point from the crossing points and to derive an average slope therefrom, and configuring the processor to obtain a corresponding rotation angle from the average slope.

3. The palm biometric identification method of claim 2, wherein the step of detecting the rotation angle further includes: configuring the processor to set the first marker point and the second marker point from respective pairs of adjacent crossing points, to compute the corresponding midpoints, to compute a first slope and a second slope from the midpoints and the first and second marker points, and to obtain the average slope as the mean of the first and second slopes.

4. The palm biometric identification method of claim 2, wherein, in the step of detecting the rotation angle, when the y coordinate of the reference point is greater than the y coordinates of the crossing points, the palm to be tested is judged to be oriented with the fingertips pointing in the −y direction, and when the y coordinate of the reference point is less than the y coordinates of the crossing points, the palm to be tested is judged to be oriented with the fingertips pointing in the +y direction.

5. The palm biometric identification method of claim 2, wherein, in the step of detecting the rotation angle, the first reference line segment is a horizontal line segment of the palm image to be tested that has eight crossing points.

6. The palm biometric identification method of claim 1, wherein the step of searching all feature points of the palm to be tested further includes: configuring the processor to set, according to the distance between a fingertip point and its corresponding finger-valley point, a boundary pixel in the set of boundary pixels lying at an equal distance as a target point.
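The rotation-angle procedure of claims 2 and 3 (marker points from adjacent crossing-point pairs, midpoints, two slopes, their average, then an angle) can be sketched as follows. The exact pairing of the marker points and the slope-to-angle conversion via arctangent are assumptions, since the claims do not spell them out:

```python
import math


def rotation_angle(pairs):
    """Estimate the palm's rotation angle, in degrees, from crossing points.

    `pairs` is a sequence of pairs of adjacent crossing points; the midpoint
    of each pair plays the role of a marker point.  Slopes are taken between
    consecutive marker points, averaged, and converted to an angle with
    arctangent.  Vertical segments (undefined slope) are skipped.
    """
    midpoints = [((x1 + x2) / 2.0, (y1 + y2) / 2.0)
                 for (x1, y1), (x2, y2) in pairs]
    slopes = []
    for (x1, y1), (x2, y2) in zip(midpoints, midpoints[1:]):
        if x2 != x1:  # skip vertical pairs, whose slope is undefined
            slopes.append((y2 - y1) / (x2 - x1))
    average_slope = sum(slopes) / len(slopes)
    return math.degrees(math.atan(average_slope))
```

For example, crossing-point pairs whose midpoints fall at (0, 1) and (2, 3) give a single slope of 1 and therefore a 45° rotation estimate, which would then be used to de-rotate the palm image before feature-point search.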
TW100103407A 2011-01-28 2011-01-28 Palm biometric method TWI531985B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW100103407A TWI531985B (en) 2011-01-28 2011-01-28 Palm biometric method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW100103407A TWI531985B (en) 2011-01-28 2011-01-28 Palm biometric method

Publications (2)

Publication Number Publication Date
TW201232428A true TW201232428A (en) 2012-08-01
TWI531985B TWI531985B (en) 2016-05-01

Family

ID=47069601

Family Applications (1)

Application Number Title Priority Date Filing Date
TW100103407A TWI531985B (en) 2011-01-28 2011-01-28 Palm biometric method

Country Status (1)

Country Link
TW (1) TWI531985B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI658411B (en) * 2018-02-26 2019-05-01 關鍵禾芯科技股份有限公司 Non-directional finger palm print recognition method and non-directional finger palm print data establishment method
TWI737040B (en) * 2019-09-12 2021-08-21 大陸商敦泰電子(深圳)有限公司 Fingerprint recognition method, chip and electronic device


Also Published As

Publication number Publication date
TWI531985B (en) 2016-05-01


Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees