
TW201137766A - Image based motion gesture recognition method and system thereof - Google Patents


Info

Publication number
TW201137766A
TW201137766A (application TW99114005A)
Authority
TW
Taiwan
Prior art keywords
image
hand
gesture
processing unit
movement
Prior art date
Application number
TW99114005A
Other languages
Chinese (zh)
Other versions
TWI431538B (en)
Inventor
Jing-Wei Wang
Chung-Cheng Lou
Original Assignee
Acer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to TW99114005A priority Critical patent/TWI431538B/en
Publication of TW201137766A publication Critical patent/TW201137766A/en
Application granted granted Critical
Publication of TWI431538B publication Critical patent/TWI431538B/en

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

An image based motion gesture recognition method and system thereof are disclosed. In an embodiment, a hand posture detection is performed according to the received image frames to obtain a first hand posture. It is then determined whether the first hand posture matches a predefined starting posture. If the first hand posture matches the predefined starting posture, movement tracking is performed according to hand locations on the image frames to obtain a motion gesture. During the movement tracking, the hand posture detection is performed according to the image frames to obtain a second hand posture, and it is determined whether the second hand posture matches a predefined ending posture. If the second hand posture matches the predefined ending posture, the movement tracking is stopped. Therefore, the complexity of motion gesture recognition can be reduced and the reliability of the interaction can be improved.
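The start/end-posture control flow described in the abstract can be sketched as a small state machine. The posture labels and the `detect_posture`/`track_hand` callbacks below are illustrative assumptions, not the patent's implementation, which derives postures from hand-contour analysis of each frame:

```python
# Minimal sketch of posture-gated movement tracking: track the hand only
# between a frame whose posture matches the starting posture and a frame
# whose posture matches the ending posture. Posture labels ("start",
# "end", "other") and both callbacks are hypothetical.

def recognize(frames, detect_posture, track_hand):
    """Return the tracked hand locations (the motion gesture path).

    frames:         iterable of image frames
    detect_posture: frame -> posture label (hypothetical classifier)
    track_hand:     frame -> hand location (hypothetical tracker)
    """
    tracking = False
    path = []
    for frame in frames:
        posture = detect_posture(frame)
        if not tracking:
            if posture == "start":   # first posture matches starting posture
                tracking = True
        else:
            if posture == "end":     # second posture matches ending posture
                break                # stop the movement tracking
            path.append(track_hand(frame))
    return path
```

Frames seen before the starting posture, and the posture frames themselves, contribute nothing to the path; if the ending posture never appears, tracking simply runs to the end of the stream.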

Description

VI. Description of the Invention

[Technical Field]

The present invention relates to a hand-detection system, and more particularly to an image-based motion gesture recognition method and system that require no sensor to be disposed on the user's hand.

[Prior Art]

For rapidly developing entertainment systems, and game systems in particular, making the interface between the user and the computer friendlier is an increasingly important topic. Having the computer analyze the user's motions to issue commands has become one of the most promising interaction methods for the future. However, conventional solutions often require a sensor to be mounted on the user's finger; although this improves the accuracy of hand detection, it also burdens the user. A better approach is to treat the user's hand itself as the command-issuing instrument and to analyze the user's hand movement by image processing in order to input commands that control the computer's operating system or peripheral devices. Such conventional image analysis methods, however, are overly complex and insufficiently stable.

For example, one known U.S. patent discloses a method for quickly analyzing gestures to control a computer, in which image vector computation determines the position, orientation and size of the user's hand, and image processing then determines the gesture; for instance, if a hole is found in the detected hand image, the user's thumb is touching the index finger, indicating an "OK" gesture. The patent also discloses controlling an on-screen display (OSD) interface of the computer with the hand. This prior art is computationally heavy and its stability degrades easily under lighting changes.

As another example, a second known U.S. patent discloses a gesture recognition system characterized by a plurality of markers placed on the user's hand, divided into a first marker group and a second marker group, one group serving as a reference while a sensor detects the movement of the other group to identify the user's gesture. This prior art requires the user to wear markers and cannot be operated bare-handed, so enabling interaction by bare-hand gestures or movement trajectories remains a problem to be solved.

[Summary of the Invention]

In view of the above prior art, an object of the present invention is to provide an image-based motion gesture recognition method and system that improve ease of use and reduce computational complexity.

According to an object of the present invention, the image-based motion gesture recognition method comprises the following steps: receiving a plurality of image frames; performing a gesture detection according to the image frames to obtain a first gesture; determining whether the first gesture matches a predefined starting gesture; if the first gesture matches the predefined starting gesture, performing a movement tracking according to the hand locations in the image frames to obtain a motion gesture; during the movement tracking, performing the gesture detection according to the image frames to obtain a second gesture; determining whether the second gesture matches a predefined ending gesture; and, if the second gesture matches the predefined ending gesture, stopping the movement tracking.

If the second gesture does not match the predefined ending gesture, the movement tracking continues.

其中,執行此手勢偵測之步驟更包含偵測此複數張 =晝面之任-影像畫面是否存在有—手部影像;如果 此手部影像存在,則根據此手部影像取得—手部^ 像,根據此手部輪廓影像判斷—手部方向及 = 部方向及此手指數目辨識出此第一手= μ,,〜你籾延蹤之步驟 此複數張影像晝面中包含有此手部影像二::在母一 塊;估算此複數影像區塊 7 夕一影像區 尾之間的减個移動向量。 其中,本發明之基於影 含··紀錄此複數個移動向量以取H :勢辨識方法更包 移動軌跡以取得此移動手勢。于移動執跡;辨識此 其中,判斷此手部方向之步 廓影像所碰觸到的此影像金 匕3根據此手部輪 方向。 …面之-邊緣,來判斷此手部 其中,判斷此手指數目之+ 位計算以取得此手部輪廓影 ^包含執行—手掌方 心位置對此手部輪廓影像執心位置;根據此重 切割手部影像;根據此已 ;::割’以取得一已 手〜像判斷手指數目。 201137766 勢辨發:之目的再提出一種基於影像之動作手 -處理單:’匕含一儲存單元、一影像擷取單元、-第 处理早7L、一比對單元及一篦_考 _ 係儲存-預設開始手勢及-預設結束手勢?影::二 張影像晝面。第一處理單元係根此; !〜象畫面執行-手勢積測,以得到-第一手勢。3 I = 斷此第—手勢是否符合此預設開始手勢。如果 第1卢早疋判斷此第-手勢符合此預設開始手勢,則此 二手勢符合該預設 判斷此弟 移動追蹤。 H貝!此第-處理早元停止此 結束手勢時u對此第二手勢不符合此預設 匕第—處理早兀繼續此移動追蹤。 元、處理單元更包含—第—影像處理單 處理單元係偵測此複數張影像晝象=象 -:部影像。第二影像處理單元係根據 影 一手部輪廓影像。手勢 如像取侍 判斷一手部方M 識早7°係根據此手部輪廓影像 ::手褐及—手指數目,並根據此手 手私數目辨識出此第一手勢或此第二手勢。 201137766 移動ίί單此第^單元更包含-區塊偵測單元及-The step of performing the gesture detection further includes detecting whether the plurality of images of the plurality of images are present - the image of the hand is present; if the image of the hand exists, obtaining the hand image according to the hand image - the hand ^ For example, according to the hand contour image judgment - the direction of the hand and the direction of the finger and the number of the fingers identify the first hand = μ,, the step of your 籾, the step of the image contains the hand Image 2:: In the parent block; estimate the reduced motion vector between the end of the image area and the end of the image area. Wherein, the present invention records the plurality of motion vectors based on the image to obtain the H:potential recognition method to further move the trajectory to obtain the moving gesture. In the case of moving, the image is recognized by the step image of the hand direction, and the image 3 is touched according to the direction of the hand wheel. 
...face-edge, to judge the hand, determine the number of the finger + position calculation to obtain the hand contour shadow ^ contains the execution - palm position of the palm of the hand contour image position; according to this heavy cutting hand Partial image; according to this already;:: cut 'to get a hand ~ like to judge the number of fingers. 201137766 Potential Discrimination: The purpose is to propose an image-based action-handling list: '匕 contains a storage unit, an image capture unit, - the first processing 7L, a comparison unit and a 篦 _ test _ storage - Preset start gesture and - preset end gesture? Shadow:: Two images inside. The first processing unit is rooted here; !~ like picture execution - gesture integration to get - first gesture. 3 I = Break this first—whether the gesture meets this preset start gesture. If the first gesture determines that the first gesture meets the preset start gesture, the second gesture conforms to the preset to determine the younger movement tracking. H Bay! This first-handling early element stops this end gesture when u does not conform to this preset for the second gesture. 匕The first step is to continue this movement tracking. The unit and the processing unit further include a - image processing unit. The processing unit detects the plurality of image objects = image -: image. The second image processing unit is based on a hand contour image. Gestures, such as taking a wait, judging a hand, M, and recognizing the 7° according to the hand contour image: hand brown and the number of fingers, and identifying the first gesture or the second gesture according to the number of the hand. 201137766 Mobile ίί single unit also contains - block detection unit and -

Id估含有此手部影像之至少—影像區塊。移‘ 里早疋,係估舁此複數影像區塊之間的複數個移動向量。 跡辨2’-,第二處理器更包含-軌跡辨識單元,此執 跡,此複數個移動向量以取得一移動執 並辨識此移動軌跡以取得此移動手勢。 觸到的:-二::辨識:兀係根據此手部輪廓影像所碰 則的此衫像晝面之一邊緣’來判斷此手部方向。 ~其中,此手勢辨識單元係執行一手掌方位 侍此手部輪廓影像之一重心 # 手部輪廓影像執行一手掌㈣置以=此f心位置對此 像,A扭祕L 〇 U取付一已切割手部影 再根據此已切割手部影像判斷手指數目。 【貪施方式】 勢辨=第^圖’其係為本發明之基於影像之動作手 影像揭取單元12、-第-處理單元… 如記,::二及一第二處理單元15。儲存單元11,例 結束儲存一預設開始手勢111及-預設 末乎勢112。衫像擷取單元 ⑵。影像擷取單元12較隹立一 ^取複數張衫像畫面 影像畫面。第-處理單元機,其可輸出連續 121執#一 +# # 糸根據此稷數張影像畫面 131 ’以得到一第一手勢132。比對 7 201137766 單元14係判斷此第一手勢 111。 132是否符合預設開始手勢 如果比對單元14判斷此第一手勢132符合此預設開 始手勢11卜則此第二處理單幻5根據此複數張影像晝 面121中手部位置,執行一移動追蹤151以取得一移動 手,152。於執行此移動追蹤151之過程中,此第一處 理半兀13根據此複數張影像晝自121,仍繼續或週期= 地執行手勢價測131,以得到一第二手勢133,若比對 ^ 14判斷此第二手勢133#合預設結束手冑112,則此 第二處理單元15停止執行移動追蹤151。當比對單元14 判斷第二手勢133不符合預設結束手勢112時,此 處理單元15持續執行移動追蹤151。 藉此,系統可先對使用者提示預設開始手勢iu及 預設結束手勢U2之樣態。當欲徒手輸人指令或資料, 則使用者可先擺出預設開始手勢⑴表示要開始輸入指 令,待系統辨識成功後,使用者改變手勢或移動手部來 進行操作。在操作期間,系統仍持續進行手勢辨識,一 方面確認欲輸人的齡H㈣確認使用者是否擺 出預設結束手勢112以結束操作。其中,預設開始手勢 111及預設結束手勢112可科為特別且十分明確的手 勢,以確保在使用者進行操作而改變手勢時系統不容易 誤判;此外,由於開始與結束的明確區隔,系統亦可簡 巧令手勢的辨識流程,進—步使徒手操作更為流暢, 知:尚糸統實現即時操作的可能性。 201137766 請參閱第2 ® ’其料本發明之基 勢辨識系統之實施例方塊圖。圖中 二像^動作手 憶體2卜-攝影機22、一第一處理=例包含-記 元U及一第二處理單元25。第一處理單元 ::: 第-影像:理單元231、一第二影像處理單元 二::::33。第一影像處理單元231係偵測複數 張衫像畫面121之任一影像書面 瓦双 •第3圖所示之手部影像-3Γ):21 二 元232根據手部影像236取得 衫像處理單 3圖所示之影像區域33)。例如,第二影如第 可先對手部影像说進行邊緣偵測^像232 廓㈣’接著以手部輪靡線32 =:部輪 圍之影像區域33作為手部輪廓影像237 6邊緣所 手勢辨識單元233根據手邱私托旦 部方向2 3 8及-手指數目輪㈣像2 3 7判斷-手 斷時,例如,可根據手^ 進订手部方向2邛之判 畫…:來像237所碰觸到的影像 =之影像區域33係接觸畫:向;3:右:如第三圖 之上邊緣,則手部面⑵ ㈣】之左邊緣,則手部方向定義為=觸係為影像畫 抑進仃手指數目239之 早元233可執行一手掌 時μ此貝細例之手勢辨識 237之-重心位彳异以取得手部輪廓影像 例如’可根據手掌的常見的二維: is] 9 201137766 狀選擇一力矩函式J(X,y),接著根據此J(X,y)計算一階力 矩以及二階力矩Moo、Μ丨〇、Μ〇ι、Μ&quot;、M20及Λ/02,如 以下列方程式所示: Μ〇° =ΣΣ7(^&gt;) x y Μί〇=ΣΣχ/(^&gt;0 χ y x y =ΣΣΛ^7(^,ν) χ y Μ2〇=ΣΣχ2Κ^ν) χ &gt;· Μα2=ΣΣ·ν%^) x y 对10及M01計算出重心位置(;ce, 接著’可根據财^、 八),如下列公式所示:Id estimates at least the image block containing this hand image. Shifting </ br> is to estimate the multiple motion vectors between the complex image blocks. 
The second processor further includes a track recognition unit, and the plurality of motion vectors obtains a motion and recognizes the motion track to obtain the motion gesture. Touched: - 2:: Identification: The 兀 is based on the edge of the hand contour image of the shirt to determine the direction of the hand. ~ Among them, this gesture recognition unit performs one palm orientation to serve one of the hand contour images. The hand contour image performs a palm (four) set to = this f heart position for this image, A twisted L 〇 U takes one The hand image is cut and the number of fingers is determined based on the cut hand image. [Greed mode] The situation is determined by the image-based action image removal unit 12, the first-processing unit, and the second processing unit 15. The storage unit 11, for example, ends storing a preset start gesture 111 and a preset end potential 112. The shirt is like a capture unit (2). The image capturing unit 12 takes a picture of a plurality of shirt images. The first processing unit can output a continuous number of 121 images # 糸 糸 according to the number of image frames 131 ′ to obtain a first gesture 132. Alignment 7 201137766 Unit 14 determines this first gesture 111. Whether the compliance with the preset start gesture is performed. If the comparison unit 14 determines that the first gesture 132 conforms to the preset start gesture 11 , the second processing single magic 5 performs a movement according to the position of the hand in the plurality of image planes 121 . Track 151 to get a moving hand, 152. During the execution of the movement tracking 151, the first processing half 13 performs the gesture price measurement 131 according to the plurality of image images from 121, to continue to obtain the second gesture 133, if the comparison is performed. ^ 14 determines that the second gesture 133# coincides with the preset end trick 112, and the second processing unit 15 stops executing the motion tracking 151. 
When the comparison unit 14 determines that the second gesture 133 does not conform to the preset end gesture 112, the processing unit 15 continues to execute the movement tracking 151. Thereby, the system can prompt the user to preset the start gesture iu and the preset end gesture U2. When you want to input instructions or data by hand, the user can first put out a preset start gesture (1) to indicate that the command is to be started. After the system is successfully recognized, the user changes the gesture or moves the hand to operate. During the operation, the system continues to perform gesture recognition, and confirms the age of the person to be entered H (4) to confirm whether the user has placed the preset end gesture 112 to end the operation. Wherein, the preset start gesture 111 and the preset end gesture 112 can be a special and very clear gesture to ensure that the system is not easy to misjudge when the user performs the operation and changes the gesture; in addition, due to the clear separation between the start and the end, The system can also make the gesture recognition process simple, and the step-by-step operation is smoother, knowing: the possibility of real-time operation. 201137766 Please refer to the block diagram of the embodiment of the potential identification system of the present invention. In the figure, the two images ^ action hand memory 2 - camera 22, a first process = example includes - symbol U and a second processing unit 25. The first processing unit ::: first image: the processing unit 231, a second image processing unit, two::::33. The first image processing unit 231 detects any image of the plurality of shirt images 121 and writes the hand image of the plurality of shirt images 121. Figure 3: Figure 3: Binary 232 obtains the shirt image processing sheet according to the hand image 236. 3 image area 33). 
For example, the second image processing unit 232 may first perform edge detection on the hand image 236 to obtain a hand contour line 32, and then take the image area 33 enclosed by the hand contour line 32 as the hand contour image 237. The gesture recognition unit 233 determines a hand direction 238 and a number of fingers 239 from the hand contour image 237. When judging the hand direction 238, the decision can be based on which edge of the image frame the hand contour image 237 touches; for example, if the image area 33 shown in Fig. 3 touches the left edge of the frame, the hand direction is defined as extending from that edge, and likewise for the right, upper and lower edges.

When judging the number of fingers 239, the gesture recognition unit 233 of this embodiment can perform a palm orientation computation to obtain a centroid of the hand contour image 237. For example, a moment function I(x, y) can be selected according to the common two-dimensional shape of a palm, and the first-order and second-order moments M00, M10, M01, M11, M20 and M02 are computed from I(x, y) as follows:

M00 = Σx Σy I(x, y)
M10 = Σx Σy x · I(x, y)
M01 = Σx Σy y · I(x, y)
M11 = Σx Σy x · y · I(x, y)
M20 = Σx Σy x² · I(x, y)
M02 = Σx Σy y² · I(x, y)

The centroid position is then computed from M10 and M01, as shown in the following formulas.
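The edge-contact rule for the hand direction can be sketched directly on a binary hand mask. The edge-to-direction naming (west/east/north/south) follows the E/W/S/N labels used for the gesture database later in the text and is an assumption; the priority order when several edges are touched is also an assumption:

```python
# Sketch of judging the hand direction from the frame edge that the
# hand contour touches. `mask` is a 2-D list of 0/1 values (1 = hand).

def hand_direction(mask):
    h, w = len(mask), len(mask[0])
    if any(row[0] for row in mask):
        return "from_west"       # contour touches the left edge
    if any(row[w - 1] for row in mask):
        return "from_east"       # contour touches the right edge
    if any(mask[0]):
        return "from_north"      # contour touches the top edge
    if any(mask[h - 1]):
        return "from_south"      # contour touches the bottom edge
    return "unknown"             # hand not touching any frame edge
```

A hand reaching in from the bottom of the frame, for instance, is classified as extending from the south.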

重心位置(Xc,八)如 ' y〇' Mqq ' λ/] 及寬L,如下列公式所 第4圖所示之位置41。再根據 Λ jyj 2() &amp; M02計算出手部矩型的長 示 a =The position of the center of gravity (Xc, VIII) such as ' y〇' Mqq ' λ/] and the width L are as shown in the fourth figure of Figure 4 at position 41. Then calculate the length of the hand rectangle according to Λ jyj 2() &amp; M02 a =

201137766 « ^ ^ ^ , 接者以重心位置41或冋 ^ 型的寬度L2的一丰作在坐广 1為圓心,手部矩 割出一圓形巴蛣 w 仫於手部輪廓影像43上切 体已切mrr㈣域料—_手部影像 一^手“像44可用以來判斷手指數目2 旱方位。若切割手部影像44 表示使用者的手掌方位為 預设值, 域分佈寬度大於高度,則表示使用:丄手像4 4的區 方向;若㈣手#1冑方位為水平 表示使用者Mi 分佈高度大於寬度,則 衣不使用者的手掌方位為垂直方向。 例干::,5圖’其繪示本發明之判斷手指數目之範 圖中,先從已切割手部影像44上辨識出一離 緣最較w 45,計算线45與重心位置41之 :::45再if d決定一 /值(例如㈣),接著取得- 德t // 的線段PP’,接著再計算已切割手部影 像一線段PP,重疊的次數來決定手指數目239。 接著,手勢辨識單元233再根據手部方向238及 指數目239辨識出第一手勢131或第二手勢132。實施 上’手勢辨識單元233可與一資料庫進行比對。請續灸 閱:6圖’其緣示本發明之用以辨識手勢之資料庫範‘ 不意圖。圖中,此資料庫係紀錄手指數目為0的握拳手 勢、^指數目&amp; 1的單指手勢以及手指數目為5的張掌 手勢等三種預設手勢的比對資料;此外,此資料庫亦將 此些比對貧料分類成從東(E)、從西(W)、從南(S)及從北 (N)延伸的四種手部方向;此外’此資料庫亦將此些比對 r r τ 11201137766 « ^ ^ ^ , Receiver with the center of gravity position 41 or the width of the L ^ type L2 a square in the seat wide 1 , the hand moment cut a round 蛣 w 仫 on the hand contour image 43 cut The body has been cut mrr (four) domain material - _ hand image - ^ hand "like 44 can be used to determine the number of fingers 2 dry orientation. If the hand image 44 is cut to indicate that the user's palm orientation is the default value, the domain distribution width is greater than the height, then Indicates the use: the direction of the area of the hand like 4 4; if the position of the (4) hand #1胄 is horizontal, the height of the user Mi is greater than the width, the orientation of the palm of the user is not vertical. Example::,5 In the exemplary diagram of determining the number of fingers in the present invention, a distance from the cut hand image 44 is first identified as w 45, and the calculated line 45 and the center of gravity position 41 are:::45 and then if d determines one / value (for example, (4)), then take the line segment PP' of -t //, and then calculate the segment PP of the cut hand image, the number of overlaps determines the number of fingers 239. Next, the gesture recognition unit 233 is further based on the hand The direction 238 and the number of fingers 239 identify the first gesture 131 or the second gesture 132. 
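The moment-based centroid, and a finger count in the spirit of the circle cut around the palm, can be sketched as follows. The patent computes the moments and the centroid as above; the particular "count contour runs on a circle around the centroid" step and the choice of radius are illustrative assumptions standing in for the segment-overlap count of Fig. 5:

```python
import math

# Sketch: image moments M00, M10, M01 over a binary mask, the centroid
# (xc, yc) = (M10/M00, M01/M00), and a finger count that samples a
# circle around the centroid and counts contiguous runs of hand pixels.
# The radius is assumed large enough to enclose the palm but still
# cross the extended fingers.

def moments(mask):
    m00 = m10 = m01 = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            m00 += v
            m10 += x * v
            m01 += y * v
    return m00, m10, m01

def centroid(mask):
    m00, m10, m01 = moments(mask)
    return m10 / m00, m01 / m00

def count_fingers(mask, radius):
    """Count contiguous mask runs on a circle around the centroid."""
    xc, yc = centroid(mask)
    h, w = len(mask), len(mask[0])
    samples = []
    for deg in range(360):
        a = math.radians(deg)
        x = int(round(xc + radius * math.cos(a)))
        y = int(round(yc + radius * math.sin(a)))
        samples.append(mask[y][x] if 0 <= x < w and 0 <= y < h else 0)
    # each 0 -> 1 transition around the circle starts one finger run
    return sum(1 for i in range(360) if samples[i] and not samples[i - 1])
```

On a synthetic mask with a square palm and a single finger strip, the circle intersects only the finger, giving a count of one.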
In implementation, the gesture recognition unit 233 can perform the identification by comparison with a database. Please refer to Fig. 6, which illustrates an example of a database for identifying gestures according to the present invention. The database records comparison data for three predefined gestures: a fist gesture with zero fingers, a single-finger gesture with one finger, and an open-palm gesture with five fingers. The database further classifies the comparison data into four hand directions, extending from the east (E), from the west (W), from the south (S) and from the north (N), and into three palm orientations: horizontal (H), vertical (V) and fist (F). The gesture recognition unit 233 can then query the database with the hand direction 238 and the number of fingers 239 to obtain the corresponding gesture. For example, querying the database with a hand direction extending from the south, one finger and a vertical palm yields the gesture 61; querying with a hand direction extending from the west, one finger and a horizontal palm yields the gesture 62, which may represent pointing to the left; and querying with a hand direction 238 extending from the west, five fingers and a horizontal palm yields the gesture 63.

Mi 凡25視需要可包含一區塊偵測單元 單Λ252及一執跡辨識翠元253。區塊 影像236之至二面121中包含有手部 ::异之間的複數個移動向”58。: 孰知,固在LI:估异為此技術領域之通常知識者所 移動向量25 辨識單元253紀錄複數個 乂取侍一移動執跡2 259以取得移動手勢152。 艾辨識移動執跡 晴參閱第7圖,其繪示本發 勢辨識方法之产&quot;^ 基於衫像之動作手 列步驟。^=垃\動作手勢辨識方法包含下Mi 凡 25 can include a block detection unit, single 252 and a trace identification, uiyuan 253. The block image 236 to the two sides 121 includes a hand: a plurality of movements between the different directions to 58.: 孰 ,, 固 LI: Estimate the movement vector 25 of the general knowledge of this technical field The unit 253 records a plurality of captures of a mobile walker 2 259 to obtain a move gesture 152. Ai recognition mobile walks clear, see Fig. 7, which shows the production of the power generation identification method &quot;^ based on the action of the shirt figure Column step. ^= trash\action gesture recognition method includes

In step 71, a plurality of image frames are received. In step 72, a gesture detection is performed according to the image frames to obtain a first gesture. In step 73, it is determined whether the first gesture matches a predefined starting gesture; if it does, then in step 74 a movement tracking is performed according to the hand locations in the image frames to obtain a motion gesture; if not, the method returns to step 72. In step 75, during the movement tracking, the gesture detection is performed according to the image frames to obtain a second gesture. In step 76, it is determined whether the second gesture matches a predefined ending gesture. If the second gesture matches the predefined ending gesture, the movement tracking is stopped in step 77; if not, step 75 continues to be performed. The complexity of tracking and recognizing motion gestures is thereby reduced, and the recognition accuracy is improved.

Please refer to Fig. 8, which is a flowchart of performing the gesture detection of the present invention. In step 81, it is detected whether a hand image exists in any of the image frames. If the hand image exists, then in step 82 a hand contour image is obtained from the hand image, such as the image area 33 shown in Fig. 3. In step 83, the hand direction is judged according to which edge of the image frame the hand contour image touches; for example, the hand direction of the image area 33 is judged from the edge it contacts. In step 84, a palm orientation computation is performed to obtain the centroid position of the hand contour image, such as the centroid position 41 shown in Fig. 4. In step 85, a palm cut is performed on the hand contour image according to the centroid position to obtain a cut hand image. In step 86, the number of fingers is determined from the cut hand image. In step 87, the gesture is identified from the hand direction and the number of fingers; in addition, the palm orientation can also be used as needed in identifying the gesture.

Please refer to Fig. 9, which is a flowchart of performing the movement tracking of the present invention. In step 91, at least one image block containing the hand image is obtained in each of the image frames. In step 92, a plurality of motion vectors between the image blocks are estimated. In step 93, the plurality of motion vectors are recorded to obtain a movement trajectory. In step 94, the movement trajectory is recognized to obtain the motion gesture.

The above description is merely exemplary and not limiting. Any equivalent modification or variation that does not depart from the spirit and scope of the present invention shall be included in the appended claims.

[Brief Description of the Drawings]

Fig. 1 is a block diagram of an embodiment of the image-based motion gesture recognition system of the present invention;
Fig. 2 is a block diagram of an embodiment of the image-based motion gesture recognition system of the present invention;
Fig. 3 is a schematic diagram of an example of a hand contour image of the present invention;
Fig. 4 is a schematic diagram of an example of the palm cut of the present invention;
Fig. 5 is a schematic diagram of an example of judging the number of fingers of the present invention;
Fig. 6 is a schematic diagram of an example of a database for identifying gestures of the present invention;
Fig. 7 is a flowchart of the image-based motion gesture recognition method of the present invention;
Fig. 8 is a flowchart of performing the gesture detection of the present invention; and
Fig. 9 is a flowchart of performing the movement tracking of the present invention.
[Description of Main Reference Numerals]

11: storage unit
111: predefined starting gesture
112: predefined ending gesture
12: image capture unit
121: image frame
13: first processing unit
131: gesture detection
132: first gesture
133: second gesture
14: comparison unit
15: second processing unit
151: movement tracking
152: motion gesture
21: memory
22: camera
23: first processing unit
231: first image processing unit
232: second image processing unit
233: gesture recognition unit
236, 31: hand image
237, 43: hand contour image
238: hand direction
239: number of fingers
25: second processing unit
251: block detection unit
252: motion vector unit
253: trajectory recognition unit
257: image block
258: motion vector
259: movement trajectory
32: hand contour line
33: image area
41: centroid position
44: cut hand image
45: tip
61~63: gestures
71~77: step flow
81~87: step flow
91~94: step flow

Claims (1)

201137766 七、申請專利範固: 1. 一種基於影像之動作手勢辨識方法,包含 接收複數張影像晝面; 根據該複數張影像晝面執行一 到一第一手勢; 手勢偵測,以得 判斷該第—手勢是否符合一預設開始手勢; 該複如數果張該合該預設開始手勢,則根據 取面中手部位置’執行-移動追縱以 於執行該移動追蹤之過程〃 ,以得到一第二手勢; 手勢是否符合一預設結束手勢,· 像晝面執行該手勢偵測 x據稷數張影 判斷該第 以 及 2. 移動二手勢符合該舰結束手勢,停止該 範=含1項所述之基於影像之動作手 執^^勢不符合該預設結束手勢,持續 如申請專利範囹笛 勢辨識方法,二二 之基於影像之動作手 -中執行該手勢偵測之步驟更包含· 在二該二張影像畫面之任-影像畫面是否存 I S} 17 201137766 « 如果該手部影像存在,聽據該 一手部輪廓影像; 〜像取侍 數目根據二手部輪廊影像判斷-手部方向及-手指 勢或根該據第該二手手部勢方向及該手指數目辨識出該第-手 4. 第 ' 項所述之基於影像之動作手 ^ ,、中執仃該移動追蹤之步驟更包含: 取得在每一該複數張影像晝面n 影像之至少一影像區塊;以及 匕…手部 5. 塊之_餘_動向量。 勢^方圍中第執44=之基於影像之動作手 &quot;中執仃該移動追縱之步 及 紀錄該複數個移動向量以取得一移動執::以 6.如申^該移動軌跡以取得該移動手勢。 勢辨二第3項基於影像之動作手 輪廓影像所碰觸到的二= 邊緣,來_該手部方向。㈣知像晝面之 7. ^申請專利範圍第3 勢辨識方法’其”斯該手c作手 執行—手掌 数目之步驟更包含·· 旱方位打·得該手部輪廟影像之 18 201137766 一重心位置; 根據該重心位置斜兮车 切割,以取得-已二執行-手掌 根據該已切割手部影像_該手指數目。 8. -種基於影像之動作手勢辨識系統,包含. 結手勢及一預設 :影像擷取單元’係擷取複數張影像晝面; 第處理單元,係根據該複數張旦,後金 行一手勢價測,以得到一第一手^數張讀畫面執 比對早元,係判斷 設開始手勢;以1 手勢是否符合該預 一第二處理單元,如果 手勢符合該預設開始手勢;第,判斷該第一 該複數張影像晝面中手部二二處理單元根據 取得一移動手勢; 置執仃一移動追蹤以 其申,於執行該移動追縱 理單元係根據該複數張影’該第一處 手勢符合該預設結束:勢該第二 該移動追蹤。 、一處理單元停止 9.如申請專利範圍8項 辨識系統,其中當該比t 2影像之動作手勢 ^對早70判斷該第二手勢不符 SJ 19 V V201137766 合該預狀束㈣時,該第二處理單元 追縱β 々該移動 10·如申請專利範圍第8項所述之基於影像 勢辨識系統’其中該第-處理單元更包含.乍手 面^係編複數張影像晝 心仗,5V像畺面内之一手部影像; 一第二影像處理單元,係根據該 一手部輪廓影像;以及 丨〜像取侍 ::方向及一手心數目,並根據該 手指數目辨識出該第-手勢或該第二手勢。亥 u.=申請專·圍第1G項所述之基於影像之動 勢辨識系統’其中該第二處理單元更包含·· *面一Λ塊=則單元,係取得在每一該複數張影像 旦面中包含有該手部影像之至少—影像區塊;以及 一移動向量單元,係估算該複數 的複數個移動向量。 〜紅塊之間 12.=C第11項所述之基於影像之動作手 統’其中該第二處理器更包含一執跡辨識 ^該執跡辨識單元係紀錄該複數個移動向量以 手移動執跡,並辨識該移動軌跡以取得該移動 如申請專利範圍第10項所述之基於影像之動作手 20 201137766 其中該手勢辨識單元係根據該手部輪 到的該影像晝面之一邊緣,來判斷該 14·ΐ申請專利_帛1G項所述之基於影像之動作丰 勢辨識系統,其中該手勢 手201137766 VII. Application for patents: 1. 
An image-based motion gesture recognition method, comprising receiving a plurality of image planes; performing a one-to-one first gesture according to the plurality of image planes; gesture detection, to determine the Whether the first gesture conforms to a preset start gesture; if the complex start score is the preset start gesture, the process is performed according to the hand position in the face-up to perform the movement tracking process to obtain a second gesture; whether the gesture conforms to a preset end gesture, such as performing the gesture detection x, determining the number according to the number of shadows, and 2. moving the second gesture to match the end gesture of the ship, stopping the fan=including The image-based action hand control method described in item 1 does not conform to the preset end gesture, and continues to perform the gesture detection step in the image-based action hand--such as the patent application fan 囹 flute identification method. More includes · In the two images of the two images - whether the image is stored IS} 17 201137766 « If the hand image exists, listen to the contour image of the hand; ~ image The number of servants is judged according to the image of the second-hand part of the porch - the direction of the hand and the direction of the finger or the root. The image-based action described in the fourth hand is identified according to the direction of the second-hand hand and the number of the finger. The step of performing the mobile tracking in the hand ^ , , the method further comprises: obtaining at least one image block of the n image in each of the plurality of images; and 匕 ... hand 5. _ remaining_moving vector of the block. In the middle of the square, the 44th-based image-based action player performs the move and the record of the plurality of movement vectors to obtain a mobile hold: to obtain the move track by 6. The move gesture. 
The second item of the second dimension is based on the two = edge of the action image of the action of the image, and the direction of the hand. (4) The image of the image is 7. The patent application scope 3rd potential identification method 'the' is the hand to perform the hand--the number of palms is more than ··································································· Position of the center of gravity; according to the position of the center of gravity, the vehicle is cut to obtain - two execution - the palm according to the number of fingers that have been cut. The number of fingers is 8. The image-based gesture recognition system includes: knot gesture and one Preset: the image capturing unit 'takes a plurality of image planes; the first processing unit is based on the plural number of cards, and the following is a gesture price measurement to obtain a first hand reading frame comparison In the early element, determining whether to start the gesture; whether the 1 gesture conforms to the pre-second processing unit, if the gesture conforms to the preset start gesture; and, determining the first two-dimensional processing unit in the first plurality of images According to the acquisition of a movement gesture; the execution of the movement tracking is performed, and the execution of the movement tracking unit is based on the plurality of shadows. The first gesture conforms to the preset end: the second movement tracking 1. A processing unit is stopped. 9. For example, the patent system has eight identification systems, wherein when the action gesture of the t 2 image is determined by the early 70, the second gesture is inconsistent with SJ 19 V V201137766 and the pre-beam (four), The second processing unit tracks β 々 the movement 10. The image-based potential recognition system as described in claim 8 wherein the first processing unit further includes a plurality of images. 
10. The image-based motion gesture recognition system of claim 8, wherein the first processing unit further comprises: an image processing unit, determining whether a hand image exists in each of the plurality of image frames and, if the hand image exists, obtaining a contour image of the hand; and a gesture recognition unit, determining a hand direction and a number of fingers according to the contour image, and identifying the first hand posture or the second hand posture according to the hand direction and the number of fingers. 11. The image-based motion gesture recognition system of claim 10, wherein the second processing unit further comprises: a block unit, obtaining at least one image block of the hand image in each of the plurality of image frames; and a motion vector unit, estimating a plurality of motion vectors between the image blocks. 12. The image-based motion gesture recognition system of claim 11, wherein the second processing unit further comprises a track recognition unit, recording the plurality of motion vectors to obtain a movement track and recognizing the movement track to obtain the motion gesture. 13. The image-based motion gesture recognition system of claim 10, wherein the gesture recognition unit determines the hand direction according to the edge of the image frame touched by the hand contour image. 14. The image-based motion gesture recognition system of claim 10, wherein the gesture recognition unit calculates a center-of-gravity position of the palm in the hand contour image, performs a palm cut according to the center-of-gravity position to obtain a cut hand image, and counts the number of fingers according to the cut hand image.
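One plausible reading of the block unit and motion vector unit in claims 11 and 12 is exhaustive block matching: take an image block around the hand in one frame and find its best-matching displacement in the next frame by sum of absolute differences (SAD). The block size, search radius, and SAD criterion below are illustrative choices, not parameters stated by the patent.

```python
import numpy as np

def motion_vector(prev, cur, top, left, size=8, radius=4):
    """Estimate the (dy, dx) displacement of one hand image block by SAD search."""
    block = prev[top:top + size, left:left + size].astype(int)
    best, best_mv = None, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + size > cur.shape[0] or x + size > cur.shape[1]:
                continue                      # candidate window leaves the frame
            cand = cur[y:y + size, x:x + size].astype(int)
            sad = np.abs(block - cand).sum()  # sum of absolute differences
            if best is None or sad < best:
                best, best_mv = sad, (dy, dx)
    return best_mv
```

Recording the per-frame vectors yields the movement track that the track recognition unit of claim 12 would match against known gesture trajectories.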
TW99114005A 2010-04-30 2010-04-30 Image based motion gesture recognition method and system thereof TWI431538B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW99114005A TWI431538B (en) 2010-04-30 2010-04-30 Image based motion gesture recognition method and system thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW99114005A TWI431538B (en) 2010-04-30 2010-04-30 Image based motion gesture recognition method and system thereof

Publications (2)

Publication Number Publication Date
TW201137766A true TW201137766A (en) 2011-11-01
TWI431538B TWI431538B (en) 2014-03-21

Family

ID=46759663

Family Applications (1)

Application Number Title Priority Date Filing Date
TW99114005A TWI431538B (en) 2010-04-30 2010-04-30 Image based motion gesture recognition method and system thereof

Country Status (1)

Country Link
TW (1) TWI431538B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI471814B (en) * 2012-07-18 2015-02-01 Pixart Imaging Inc Method for determining gesture with improving background influence and apparatus thereof
TWI490755B (en) * 2012-06-20 2015-07-01 Pixart Imaging Inc Input system
US9117138B2 (en) 2012-09-05 2015-08-25 Industrial Technology Research Institute Method and apparatus for object positioning by using depth images
US9189073B2 (en) 2011-12-23 2015-11-17 Intel Corporation Transition mechanism for computing system utilizing user sensing
TWI514195B (en) * 2012-05-10 2015-12-21 Intel Corp Gesture responsive image capture control and/or operation on image
TWI553512B (en) * 2015-01-07 2016-10-11 國立臺灣科技大學 A method for recognizing and tracking gesture
US9678574B2 (en) 2011-12-23 2017-06-13 Intel Corporation Computing system utilizing three-dimensional manipulation command gestures
US9684379B2 (en) 2011-12-23 2017-06-20 Intel Corporation Computing system utilizing coordinated two-hand command gestures
US10324535B2 (en) 2011-12-23 2019-06-18 Intel Corporation Mechanism to provide visual feedback regarding computing system command gestures

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9678574B2 (en) 2011-12-23 2017-06-13 Intel Corporation Computing system utilizing three-dimensional manipulation command gestures
US10345911B2 (en) 2011-12-23 2019-07-09 Intel Corporation Mechanism to provide visual feedback regarding computing system command gestures
US12436621B2 (en) 2011-12-23 2025-10-07 Intel Corporation Mechanism to provide visual feedback regarding computing system command gestures
US9189073B2 (en) 2011-12-23 2015-11-17 Intel Corporation Transition mechanism for computing system utilizing user sensing
US9684379B2 (en) 2011-12-23 2017-06-20 Intel Corporation Computing system utilizing coordinated two-hand command gestures
US11360566B2 (en) 2011-12-23 2022-06-14 Intel Corporation Mechanism to provide visual feedback regarding computing system command gestures
US10324535B2 (en) 2011-12-23 2019-06-18 Intel Corporation Mechanism to provide visual feedback regarding computing system command gestures
US11941181B2 (en) 2011-12-23 2024-03-26 Intel Corporation Mechanism to provide visual feedback regarding computing system command gestures
TWI514195B (en) * 2012-05-10 2015-12-21 Intel Corp Gesture responsive image capture control and/or operation on image
TWI490755B (en) * 2012-06-20 2015-07-01 Pixart Imaging Inc Input system
US9842249B2 (en) 2012-07-18 2017-12-12 Pixart Imaging Inc. Gesture recognition method and apparatus with improved background suppression
TWI471814B (en) * 2012-07-18 2015-02-01 Pixart Imaging Inc Method for determining gesture with improving background influence and apparatus thereof
US9117138B2 (en) 2012-09-05 2015-08-25 Industrial Technology Research Institute Method and apparatus for object positioning by using depth images
TWI553512B (en) * 2015-01-07 2016-10-11 國立臺灣科技大學 A method for recognizing and tracking gesture

Also Published As

Publication number Publication date
TWI431538B (en) 2014-03-21

Similar Documents

Publication Publication Date Title
TW201137766A (en) Image based motion gesture recognition method and system thereof
US8373654B2 (en) Image based motion gesture recognition method and system thereof
CN105518575B (en) Hand Interaction with Natural User Interface
RU2605370C2 (en) System for recognition and tracking of fingers
KR102110811B1 (en) System and method for human computer interaction
CN104541232B (en) Multi-modal touch-screen emulator
US8593402B2 (en) Spatial-input-based cursor projection systems and methods
TW201120681A (en) Method and system for operating electric apparatus
CN103970264B (en) Gesture recognition and control method and device
CN102236409A (en) Image-based gesture recognition method and system
AU2012268589A1 (en) System for finger recognition and tracking
CN103150020A (en) Three-dimensional finger control operation method and system
TW201123031A (en) Robot and method for recognizing human faces and gestures thereof
CA2786852A1 (en) Handles interactions for human-computer interface
EP2718899A2 (en) System for recognizing an open or closed hand
JP6349800B2 (en) Gesture recognition device and method for controlling gesture recognition device
US20150378440A1 (en) Dynamically Directing Interpretation of Input Data Based on Contextual Information
TWI574177B (en) Input device, machine, input method and recording medium
JP6651388B2 (en) Gesture modeling device, gesture modeling method, program for gesture modeling system, and gesture modeling system
CN114821630A (en) Static gesture recognition method and system and electronic equipment
CN105630134A (en) Operation event identification method and apparatus
JP5756762B2 (en) Gesture recognition device and program thereof
CN114510142B (en) Gesture recognition method based on two-dimensional image, gesture recognition system based on two-dimensional image and electronic equipment
CN104951211B (en) A kind of information processing method and electronic equipment
JP2013080433A (en) Gesture recognition device and program for the same