
CN102053702A - Dynamic gesture control system and method - Google Patents

Dynamic gesture control system and method

Info

Publication number
CN102053702A
CN102053702A
Authority
CN
China
Prior art keywords
gesture
dynamic gesture
point
dynamic
tracking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2010105193093A
Other languages
Chinese (zh)
Inventor
潘文平
邢建芳
赵晓庚
岳健
王冬翠
耿征
张赵行
龚华军
沈春林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN2010105193093A priority Critical patent/CN102053702A/en
Publication of CN102053702A publication Critical patent/CN102053702A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a dynamic gesture control system and method. The system comprises a computer and a camera, with the camera connected directly to the computer. The method comprises a static gesture recognition stage, a dynamic gesture tracking stage, and a dynamic gesture recognition stage. The invention is (1) non-contact: the user needs no auxiliary equipment, and dynamic gestures are tracked and recognized through the camera alone; (2) calibration-free, which simplifies use of the system, with gesture control that starts quickly, tracks quickly, and recognizes accurately; and (3) low-cost: data gloves of all kinds are expensive to purchase and maintain, whereas an ordinary camera is cheap and maintenance-free, greatly reducing the purchase and maintenance costs of a gesture control system.

Description

Dynamic Gesture Control System and Method

Technical Field

The dynamic gesture control system and method of the present invention belong to the class of video-based data processing systems and methods. They are suitable for human-computer interaction in virtual reality, in large-scale interactive projection systems, and in display systems, especially volumetric three-dimensional (3D) display systems.

Background Art

Interaction technology is a key technology of virtual reality systems, and also of new display systems such as volumetric 3D displays. It realizes interaction between human and machine, and between the real world and the virtual world. Realizing interaction through dynamic gesture control requires, first, obtaining information such as the position of the user's hand in real time and, second, recognizing the meaning of the gesture trajectory. The former is gesture tracking; the latter is gesture recognition.

At present, gesture tracking falls mainly into data-glove-based tracking and video-based tracking.

See Shi Jiaoying, Fundamentals and Practical Algorithms of Virtual Reality, Science Press, 2002, which systematically describes data gloves: a data glove converts the various postures of the fingers and palm during flexion and extension into digital signals sent to a computer for recognition and execution, thereby realizing human-computer interaction. In the standard configuration, each finger carries two sensors controlling two fiber-optic loops or other measuring elements mounted on the back of the finger, used to measure the bending angles of the finger's main joints. Optionally, the glove can also provide sensors that measure the adduction/abduction and up/down angles of the thumb.

A data glove can obtain finger position and posture information accurately, but its shortcomings are equally obvious: (1) the structure is complex and inconvenient to wear, hindering natural and intuitive interaction between human and machine; (2) it is expensive, since its high-precision sensor components increase the manufacturing cost.

Video-based gesture tracking is mainly realized by using a camera to track special color markers worn on the fingers, or to track a handheld auxiliary device such as Sony's PlayStation Move. See T. Grossman, D. Wigdor, R. Balakrishnan, "Multi-Finger Gestural Interaction with 3D Volumetric Displays," Proceedings of the 17th Annual ACM Symposium on User Interface Software and Technology (UIST '04), Santa Fe, NM, USA, October 24-27, 2004: 61-70: by having users wear special colored markers on their fingers, the true-3D display system Perspecta allows video-based gesture control on its hemispherical display enclosure. Although these video-based gesture tracking methods improve tracking accuracy through auxiliary equipment, they likewise hinder natural and intuitive human-machine interaction.

Summary of the Invention

The purpose of the present invention is to overcome the defects of the prior art by providing a computer-vision-based, non-contact, low-cost, lightweight and practical dynamic gesture control system and method that realizes dynamic gesture control. The technical problems to be solved include: fast startup of gesture tracking, fast tracking of dynamic gestures, and accurate recognition of dynamic gestures.

To achieve the above purpose, the present invention adopts the following technical solutions:

The dynamic gesture control system of the present invention comprises a computer and a camera, with the camera connected directly to the computer.

The control method of the dynamic gesture control system comprises a static gesture recognition stage, a dynamic gesture tracking stage, and a dynamic gesture recognition stage.

(1) Static gesture detection stage

A fully open static hand gesture is detected as follows:

1) From the binary image of the difference between the current frame and the previous frame, extract the external contour of the static hand.

2) Compute the convex hull of the external contour with Sklansky's algorithm; the vertices of the convex hull are the fingertip points.

3) Compute the convexity defects of the external contour and their depths, i.e. the distance from each defect point to the convex hull; record the maximum depth value, and take defect points whose depth exceeds a threshold based on this maximum as inter-finger gap points.

4) If the number of fingertips is 5 and the number of inter-finger gaps is 4, static gesture detection succeeds: the rectangular region determined by the set of convexity defect points is taken as the initial tracking region, and the rectangular region determined by all fingertip points and convexity defect points as the hand boundary region; a hand skin-color distribution model is built and gesture detection exits. Otherwise, a new input frame is read and the method returns to step 1).

(2) Dynamic gesture tracking stage

First, the position of each feature point in the tracking region is updated by the optical flow method; the feature points must neither stray far from the center position nor overlap one another.

(3) Dynamic gesture recognition stage

a) The trajectory contained in the input dynamic gesture I is quantized with the eight-direction Freeman chain code, giving the observation sequence O = {b1, b2, ..., bm}.

b) For each model λi (i = 1, ..., l) in the dynamic-gesture HMM library, compute the probability P(O|λi) of producing the observation sequence O. A discrete HMM (DHMM) is used, denoted λ = (π, A, B), where π is the initial probability vector, N is the number of states, A is the state-transition probability matrix, B is the observation probability matrix, and M is the number of observation symbols; l is the number of models in the gesture HMM library, with gestures numbered 1 to l.

c) Take the maximum P(O|λj) = max over i of P(O|λi); if P(O|λj) > PT, where PT is a fixed threshold, then j is the number of the recognized gesture.

In the dynamic gesture tracking stage of step (2), feature points beyond a certain range from the center position are first adjusted, using the skin-color distribution model, so that they lie close to the center position and on skin-color pixels; then, for any two feature points that are too close together, one of the two is moved out to a distance beyond the threshold so that no feature points overlap; the center of all feature points is then updated.

In the dynamic gesture recognition stage of step (3), features are extracted only from the trajectory curve of the center position of the hand: for two adjacent sampling points on the gesture trajectory, i.e. center positions pi and pi+1, the angle θ between the segment connecting them and the horizontal is computed, and the segment is encoded with the eight-direction Freeman chain code according to θ, giving the code value b.

A current sampling point is set as a valid sampling point only when its distance from the previous valid sampling point exceeds the threshold DT.

The beneficial effects of the present invention are:

Compared with auxiliary equipment such as data gloves, special color markers, and handheld game controllers, the invention has the following clear advantages: (1) it is non-contact: the user needs no auxiliary equipment, and dynamic gestures are tracked and recognized through the camera alone; (2) no calibration is required, which simplifies use of the system; gesture control starts quickly, tracks quickly, and recognizes accurately; (3) it is low-cost: data gloves of all kinds are expensive to purchase and maintain, whereas an ordinary camera is cheap and maintenance-free, greatly reducing the purchase and maintenance costs of a gesture control system.

Description of the Drawings

Figure 1 is a block diagram of the dynamic gesture control system.

Figure 2 is a schematic diagram of static gesture detection.

1 - fingertip

2 - inter-finger gap

3 - wrist concavity

Figure 3 is a schematic diagram of feature point position adjustment in the dynamic gesture tracking stage.

1 - center point

2 - feature point

Detailed Description

First, the system is laid out according to the block diagram in Figure 1; the camera then captures the user's hand motion, and the computer performs static gesture recognition, dynamic gesture tracking, and dynamic gesture recognition.

(1) Static gesture detection stage

A fully open static hand gesture is detected as follows:

1) From the binary image of the difference between the current frame and the previous frame, extract its external contour.

2) Compute the convex hull of the external contour with Sklansky's algorithm; after removing hull vertices located at the wrist (type-3 points in Figure 2), the vertices of the convex hull are the fingertip points (type-1 points in Figure 2).

3) Compute the convexity defects of the external contour and their depths, i.e. the distance from each defect point to the convex hull. Record the maximum depth value, and take defect points whose depth exceeds a threshold based on this maximum as inter-finger gap points (type-2 points in Figure 2).

4) If the number of fingertips is 5 and the number of inter-finger gaps is 4, static gesture detection succeeds. The rectangular region determined by the convexity defect point set (see Figure 2, including the type-1 and type-3 points) is taken as the initial tracking region, and the rectangular region determined by all fingertip points and convexity defect points as the hand boundary region; a hand skin-color distribution model is built and gesture detection exits. Otherwise, a new input frame is read and the method returns to step 1).
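The frame-difference preprocessing in step 1) can be sketched as follows (an illustrative Python sketch, not part of the patent; frames are modeled as 2D grayscale lists, and the threshold value is a hypothetical parameter):

```python
def frame_diff_binary(prev, curr, thresh):
    """Binary image of the absolute difference between two grayscale frames."""
    return [[1 if abs(curr[y][x] - prev[y][x]) > thresh else 0
             for x in range(len(curr[0]))]
            for y in range(len(curr))]

prev = [[10, 10, 10],
        [10, 10, 10]]
curr = [[10, 90, 10],
        [10, 95, 12]]
binary = frame_diff_binary(prev, curr, thresh=30)
# the moving region shows up as 1s in the middle column: [[0,1,0],[0,1,0]]
```

In a real implementation the binary image would then be passed to a contour extractor; here it simply marks pixels that changed by more than the threshold between frames.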

This static gesture detection algorithm makes full use of the convex hull vertices and convexity defect points of the hand region, is orientation-invariant, and relies on the low-complexity Sklansky algorithm, ensuring that gesture control starts accurately and quickly.
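Steps 2) and 3) above can be sketched on a toy contour (illustrative Python, not part of the patent: Andrew's monotone chain stands in for Sklansky's algorithm, the five-point "two-finger" outline is invented for the example, and the depth threshold is a hypothetical value):

```python
import math

def cross(o, a, b):
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def seg_dist(p, a, b):
    """Distance from point p to segment ab."""
    t = ((p[0]-a[0])*(b[0]-a[0]) + (p[1]-a[1])*(b[1]-a[1])) / \
        ((b[0]-a[0])**2 + (b[1]-a[1])**2)
    t = max(0.0, min(1.0, t))
    return math.hypot(p[0] - (a[0] + t*(b[0]-a[0])),
                      p[1] - (a[1] + t*(b[1]-a[1])))

# Toy two-finger contour: (1,3) and (3,3) are fingertip candidates,
# (2,1) is the valley between them.
contour = [(0, 0), (1, 3), (2, 1), (3, 3), (4, 0)]
hull = convex_hull(contour)            # hull vertices = fingertip candidates
# (2,1) lies between hull vertices (1,3) and (3,3); its defect depth is
# its distance to that hull edge:
depth = seg_dist((2, 1), (1, 3), (3, 3))   # 2.0
is_gap = depth > 1.0                        # hypothetical depth threshold
```

The hull vertices play the role of fingertip points and the deep defect point the role of an inter-finger gap point; a full implementation would walk every contour segment between consecutive hull vertices rather than inspect one known point.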

(2) Dynamic gesture tracking stage

In Figure 3, the position of each feature point in the tracking region (type-2 points in Figure 3) is first updated by the optical flow method. In addition, following the "flock of features" idea, each feature point must neither stray far from the center position (point 1 in Figure 3) nor overlap another. Therefore, feature points beyond a certain range from the center are first adjusted, using the skin-color distribution model, so that they lie close to the center and on skin-color pixels; then, for any two feature points that are too close together, one of the two is moved out to a distance beyond the threshold so that no feature points overlap; the center of all feature points is then updated. This processing makes the motion of the feature points' center more stable and smooth, improving the reliability of the tracking process.

This tracking algorithm combines the advantages of the optical flow method and the feature point flock algorithm, making the non-contact gesture tracking process both accurate and fast.
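The two flock constraints and the center update can be sketched as follows (illustrative Python, not part of the patent; the skin-color check is omitted, stray points are simply pulled radially toward the center, and `max_r` and `min_d` are hypothetical thresholds):

```python
import math

def adjust_flock(points, center, max_r, min_d):
    """Enforce the flock constraints on feature points, then recompute the center."""
    pts = []
    # 1) pull points that strayed too far back toward the center
    #    (the patent additionally snaps them onto skin-color pixels)
    for x, y in points:
        d = math.hypot(x - center[0], y - center[1])
        if d > max_r:
            s = max_r / d
            x = center[0] + (x - center[0]) * s
            y = center[1] + (y - center[1]) * s
        pts.append((x, y))
    # 2) push apart any pair closer than min_d so no points overlap
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            dx = pts[j][0] - pts[i][0]
            dy = pts[j][1] - pts[i][1]
            d = math.hypot(dx, dy)
            if d < min_d:
                if d == 0.0:
                    dx, dy, d = min_d, 0.0, min_d
                s = min_d / d
                pts[j] = (pts[i][0] + dx * s, pts[i][1] + dy * s)
    # 3) update the flock center
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    return pts, (cx, cy)

pts, c = adjust_flock([(0, 0), (0.1, 0), (10, 0)], (0, 0), max_r=5, min_d=1)
# the stray point is pulled in to (5, 0); the near-duplicate is pushed out to (1, 0)
```

In the full method these adjustments run every frame, after the pyramidal optical-flow update of the feature point positions.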

(3) Dynamic gesture recognition stage

To improve the real-time performance of gesture recognition, features are extracted only from the trajectory curve of the center position of the hand. For two adjacent sampling points (center positions) pi and pi+1 on the gesture trajectory, the angle θ between the segment connecting them and the horizontal is computed; according to θ, the segment is encoded with the eight-direction Freeman chain code, giving the code value b. Furthermore, to avoid overly dense sampling points and overly long codes, a current sampling point is set as a valid sampling point only when its distance from the previous valid sampling point exceeds the threshold DT.
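The sampling filter and chain-code encoding described above can be sketched as follows (illustrative Python, not part of the patent; the angle is bucketed into the eight Freeman directions 0-7, with 0 = east and codes increasing counter-clockwise):

```python
import math

def freeman8(p, q):
    """Eight-direction Freeman chain code of segment p->q (0=E, 2=N, 4=W, 6=S)."""
    ang = math.atan2(q[1] - p[1], q[0] - p[0])
    return round(ang / (math.pi / 4)) % 8

def valid_samples(track, d_t):
    """Keep a point only if it is more than d_t away from the last kept point."""
    out = [track[0]]
    for p in track[1:]:
        if math.hypot(p[0] - out[-1][0], p[1] - out[-1][1]) > d_t:
            out.append(p)
    return out

def encode_trajectory(track, d_t):
    pts = valid_samples(track, d_t)
    return [freeman8(pts[i], pts[i + 1]) for i in range(len(pts) - 1)]

obs = encode_trajectory([(0, 0), (0.2, 0), (1.5, 0), (1.5, 1.6)], d_t=1.0)
# (0,0)->(1.5,0) is east (code 0); (1.5,0)->(1.5,1.6) is north (code 2)
```

The resulting code list plays the role of the observation sequence O = {b1, b2, ..., bm} fed to the HMMs below.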

The dynamic gesture recognition procedure is:

1) The trajectory contained in the input dynamic gesture I is quantized with the eight-direction Freeman chain code, giving its observation sequence O = {b1, b2, ..., bm}.

2) For each model λi (i = 1, ..., l) in the dynamic-gesture HMM library, compute the probability P(O|λi) of producing O. (Each dynamic-gesture HMM is trained with the Baum-Welch algorithm; the training samples are encoded in the same way as in step 1).)

3) Take the maximum P(O|λj) = max over i of P(O|λi); if P(O|λj) > PT (where PT is a fixed threshold), then j is the number of the recognized gesture.

This dynamic gesture recognition algorithm encodes the gesture trajectory with the eight-direction Freeman chain code rather than the sixteen-direction chain code, a compromise between the system's real-time performance and its encoding precision.
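The recognition decision in steps 2) and 3) can be sketched with the standard forward algorithm for discrete HMMs (illustrative Python, not part of the patent; the two toy single-state models over a 2-symbol alphabet and the threshold value are hypothetical, standing in for trained 8-symbol gesture models):

```python
def forward_prob(pi, A, B, obs):
    """P(O | lambda) for a discrete HMM lambda = (pi, A, B), via the forward algorithm."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return sum(alpha)

def recognize(models, obs, p_t):
    """Return the 1-based gesture number j maximizing P(O|lambda_j), or None if below p_t."""
    probs = [forward_prob(pi, A, B, obs) for pi, A, B in models]
    j = max(range(len(probs)), key=probs.__getitem__)
    return j + 1 if probs[j] > p_t else None

# Hypothetical library of l = 2 single-state DHMMs (pi, A, B):
models = [
    ([1.0], [[1.0]], [[0.9, 0.1]]),   # model 1 strongly favors symbol 0
    ([1.0], [[1.0]], [[0.1, 0.9]]),   # model 2 strongly favors symbol 1
]
gesture = recognize(models, [0, 0, 0], p_t=0.5)   # -> 1 (P = 0.9**3 = 0.729)
```

For long observation sequences a practical implementation would scale the forward variables (or work in log space) to avoid underflow; Baum-Welch training of the models is omitted here.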

Claims (5)

1. A dynamic gesture control system, characterized in that it comprises a computer and a camera, the camera being connected directly to the computer.

2. A control method of the dynamic gesture control system according to claim 1, characterized in that it comprises a static gesture recognition stage, a dynamic gesture tracking stage and a dynamic gesture recognition stage;

(1) the static gesture detection stage detects a fully open static gesture as follows:

1) from the binary image of the difference between the current frame and the previous frame, extract the external contour of the static hand;

2) compute the convex hull of the external contour with Sklansky's algorithm; the vertices of the convex hull are the fingertip points;

3) compute the convexity defects of the external contour and their depths, i.e. the distance from each defect point to the convex hull; record the maximum depth value, and take defect points whose depth exceeds a threshold based on this maximum as inter-finger gap points;

4) if the number of fingertips is 5 and the number of inter-finger gaps is 4, static gesture detection succeeds: the rectangular region determined by the set of convexity defect points is taken as the initial tracking region, and the rectangular region determined by all fingertip points and convexity defect points as the hand boundary region; a hand skin-color distribution model is built and gesture detection exits; otherwise a new input frame is read and the method returns to step 1);

(2) in the dynamic gesture tracking stage, the position of each feature point in the tracking region is first updated by the optical flow method; the feature points must neither stray far from the center position nor overlap one another;

(3) in the dynamic gesture recognition stage:

a) the trajectory contained in the input dynamic gesture I is quantized with the eight-direction Freeman chain code, giving the observation sequence O = {b1, b2, ..., bm};

b) for each model λi (i = 1, ..., l) in the dynamic-gesture HMM library, the probability P(O|λi) of producing the observation sequence O is computed; a discrete HMM (DHMM) is used, denoted λ = (π, A, B), where π is the initial probability vector, N is the number of states, A is the state-transition probability matrix, B is the observation probability matrix, and M is the number of observation symbols; l is the number of models in the gesture HMM library, with gestures numbered 1 to l;

c) the maximum P(O|λj) = max over i of P(O|λi) is taken; if P(O|λj) > PT, where PT is a fixed threshold, then j is the number of the recognized gesture.

3. The control method of the dynamic gesture control system according to claim 2, characterized in that in the dynamic gesture tracking stage of step (2), feature points beyond a certain range from the center position are first adjusted, using the skin-color distribution model, so that they lie close to the center position and on skin-color pixels; then, for any two feature points that are too close together, one of the two is moved out to a distance beyond the threshold so that no feature points overlap; the center of all feature points is then updated.

4. The control method of the dynamic gesture control system according to claim 2, characterized in that in the dynamic gesture recognition stage of step (3), features are extracted only from the trajectory curve of the center position of the hand: for two adjacent sampling points on the gesture trajectory, i.e. center positions pi and pi+1, the angle θ between the segment connecting them and the horizontal is computed, and the segment is encoded with the eight-direction Freeman chain code according to θ, giving the code value b.

5. The control method of the dynamic gesture control system according to claim 3, characterized in that a current sampling point is set as a valid sampling point only when its distance from the previous valid sampling point exceeds the threshold DT.
CN2010105193093A 2010-10-26 2010-10-26 Dynamic gesture control system and method Pending CN102053702A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010105193093A CN102053702A (en) 2010-10-26 2010-10-26 Dynamic gesture control system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2010105193093A CN102053702A (en) 2010-10-26 2010-10-26 Dynamic gesture control system and method

Publications (1)

Publication Number Publication Date
CN102053702A true CN102053702A (en) 2011-05-11

Family

ID=43958102

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010105193093A Pending CN102053702A (en) 2010-10-26 2010-10-26 Dynamic gesture control system and method

Country Status (1)

Country Link
CN (1) CN102053702A (en)

Cited By (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102509079A (en) * 2011-11-04 2012-06-20 康佳集团股份有限公司 Real-time gesture tracking method and tracking system
CN102955567A (en) * 2011-08-31 2013-03-06 德信互动科技(北京)有限公司 Man-machine interaction system and method
CN102981623A (en) * 2012-11-30 2013-03-20 深圳先进技术研究院 Method and system for triggering input instruction
WO2012126426A3 (en) * 2012-05-21 2013-04-25 华为技术有限公司 Method and device for contact-free control by hand gesture
CN103365404A (en) * 2012-04-01 2013-10-23 联想(北京)有限公司 Human-computer interaction method and human-computer interaction device
US8638989B2 (en) 2012-01-17 2014-01-28 Leap Motion, Inc. Systems and methods for capturing motion in three-dimensional space
CN103616954A (en) * 2013-12-06 2014-03-05 Tcl通讯(宁波)有限公司 Virtual keyboard system, implementation method and mobile terminal
CN104102347A (en) * 2014-07-09 2014-10-15 东莞万士达液晶显示器有限公司 Fingertip positioning method and fingertip positioning terminal
CN104298354A (en) * 2014-10-11 2015-01-21 河海大学 Man-machine interaction gesture recognition method
CN104656883A (en) * 2013-11-20 2015-05-27 江南大学 Gesture acquisition system based on multiple acceleration sensors and ZigBee network
CN104766038A (en) * 2014-01-02 2015-07-08 株式会社理光 Palm opening and closing action recognition method and device
CN104915011A (en) * 2015-06-28 2015-09-16 合肥金诺数码科技股份有限公司 Open environment gesture interaction game system
CN105224146A (en) * 2015-09-25 2016-01-06 厦门求实智能网络设备有限公司 Projection-type virtual interacting control panel and control method thereof
US9285893B2 (en) 2012-11-08 2016-03-15 Leap Motion, Inc. Object detection and tracking with variable-field illumination devices
CN105900376A (en) * 2014-01-06 2016-08-24 三星电子株式会社 Home device control apparatus and control method using wearable device
Cited By (104)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102955567A (en) * 2011-08-31 2013-03-06 德信互动科技(北京)有限公司 Man-machine interaction system and method
CN102509079A (en) * 2011-11-04 2012-06-20 康佳集团股份有限公司 Real-time gesture tracking method and tracking system
US11782516B2 (en) 2012-01-17 2023-10-10 Ultrahaptics IP Two Limited Differentiating a detected object from a background using a gaussian brightness falloff pattern
US9436998B2 (en) 2012-01-17 2016-09-06 Leap Motion, Inc. Systems and methods of constructing three-dimensional (3D) model of an object using image cross-sections
US9778752B2 (en) 2012-01-17 2017-10-03 Leap Motion, Inc. Systems and methods for machine control
US8638989B2 (en) 2012-01-17 2014-01-28 Leap Motion, Inc. Systems and methods for capturing motion in three-dimensional space
US11720180B2 (en) 2012-01-17 2023-08-08 Ultrahaptics IP Two Limited Systems and methods for machine control
US11308711B2 (en) 2012-01-17 2022-04-19 Ultrahaptics IP Two Limited Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US9767345B2 (en) 2012-01-17 2017-09-19 Leap Motion, Inc. Systems and methods of constructing three-dimensional (3D) model of an object using image cross-sections
US9934580B2 (en) 2012-01-17 2018-04-03 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US10767982B2 (en) 2012-01-17 2020-09-08 Ultrahaptics IP Two Limited Systems and methods of locating a control object appendage in three dimensional (3D) space
US10699155B2 (en) 2012-01-17 2020-06-30 Ultrahaptics IP Two Limited Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US10691219B2 (en) 2012-01-17 2020-06-23 Ultrahaptics IP Two Limited Systems and methods for machine control
US9153028B2 (en) 2012-01-17 2015-10-06 Leap Motion, Inc. Systems and methods for capturing motion in three-dimensional space
US9741136B2 (en) 2012-01-17 2017-08-22 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US9945660B2 (en) 2012-01-17 2018-04-17 Leap Motion, Inc. Systems and methods of locating a control object appendage in three dimensional (3D) space
US9697643B2 (en) 2012-01-17 2017-07-04 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US10565784B2 (en) 2012-01-17 2020-02-18 Ultrahaptics IP Two Limited Systems and methods for authenticating a user according to a hand of the user moving in a three-dimensional (3D) space
US9679215B2 (en) 2012-01-17 2017-06-13 Leap Motion, Inc. Systems and methods for machine control
US11994377B2 (en) 2012-01-17 2024-05-28 Ultrahaptics IP Two Limited Systems and methods of locating a control object appendage in three dimensional (3D) space
US10410411B2 (en) 2012-01-17 2019-09-10 Leap Motion, Inc. Systems and methods of object shape and position determination in three-dimensional (3D) space
US9672441B2 (en) 2012-01-17 2017-06-06 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US9495613B2 (en) 2012-01-17 2016-11-15 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging using formed difference images
US10366308B2 (en) 2012-01-17 2019-07-30 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
US12086327B2 (en) 2012-01-17 2024-09-10 Ultrahaptics IP Two Limited Differentiating a detected object from a background using a gaussian brightness falloff pattern
US12260023B2 (en) 2012-01-17 2025-03-25 Ultrahaptics IP Two Limited Systems and methods for machine control
US9626591B2 (en) 2012-01-17 2017-04-18 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging
US9652668B2 (en) 2012-01-17 2017-05-16 Leap Motion, Inc. Enhanced contrast for object detection and characterization by optical imaging based on differences between images
CN103365404B (en) * 2012-04-01 2016-07-06 联想(北京)有限公司 Human-computer interaction method and apparatus
CN103365404A (en) * 2012-04-01 2013-10-23 联想(北京)有限公司 Human-computer interaction method and human-computer interaction device
WO2012126426A3 (en) * 2012-05-21 2013-04-25 华为技术有限公司 Method and device for contact-free control by hand gesture
US8866781B2 (en) 2012-05-21 2014-10-21 Huawei Technologies Co., Ltd. Contactless gesture-based control method and apparatus
US9285893B2 (en) 2012-11-08 2016-03-15 Leap Motion, Inc. Object detection and tracking with variable-field illumination devices
CN102981623A (en) * 2012-11-30 2013-03-20 深圳先进技术研究院 Method and system for triggering input instruction
US10609285B2 (en) 2013-01-07 2020-03-31 Ultrahaptics IP Two Limited Power consumption in motion-capture systems
US9465461B2 (en) 2013-01-08 2016-10-11 Leap Motion, Inc. Object detection and tracking with audio and optical signals
US10097754B2 (en) 2013-01-08 2018-10-09 Leap Motion, Inc. Power consumption in motion-capture systems with audio and optical signals
US12405673B2 (en) 2013-01-15 2025-09-02 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US11740705B2 (en) 2013-01-15 2023-08-29 Ultrahaptics IP Two Limited Method and system for controlling a machine according to a characteristic of a control object
US11353962B2 (en) 2013-01-15 2022-06-07 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US12204695B2 (en) 2013-01-15 2025-01-21 Ultrahaptics IP Two Limited Dynamic, free-space user interactions for machine control
US10739862B2 (en) 2013-01-15 2020-08-11 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US11874970B2 (en) 2013-01-15 2024-01-16 Ultrahaptics IP Two Limited Free-space user interface and control using virtual constructs
US11693115B2 (en) 2013-03-15 2023-07-04 Ultrahaptics IP Two Limited Determining positional information of an object in space
US10585193B2 (en) 2013-03-15 2020-03-10 Ultrahaptics IP Two Limited Determining positional information of an object in space
US12306301B2 (en) 2013-03-15 2025-05-20 Ultrahaptics IP Two Limited Determining positional information of an object in space
US11099653B2 (en) 2013-04-26 2021-08-24 Ultrahaptics IP Two Limited Machine responsiveness to dynamic user movements and gestures
US12333081B2 (en) 2013-04-26 2025-06-17 Ultrahaptics IP Two Limited Interacting with a machine using gestures in first and second user-specific virtual planes
US9916009B2 (en) 2013-04-26 2018-03-13 Leap Motion, Inc. Non-tactile interface systems and methods
US10452151B2 (en) 2013-04-26 2019-10-22 Ultrahaptics IP Two Limited Non-tactile interface systems and methods
US12236528B2 (en) 2013-08-29 2025-02-25 Ultrahaptics IP Two Limited Determining spans and span lengths of a control object in a free space gesture control environment
US11776208B2 (en) 2013-08-29 2023-10-03 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US11282273B2 (en) 2013-08-29 2022-03-22 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US11461966B1 (en) 2013-08-29 2022-10-04 Ultrahaptics IP Two Limited Determining spans and span lengths of a control object in a free space gesture control environment
US12086935B2 (en) 2013-08-29 2024-09-10 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US10846942B1 (en) 2013-08-29 2020-11-24 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US12242312B2 (en) 2013-10-03 2025-03-04 Ultrahaptics IP Two Limited Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation
US11775033B2 (en) 2013-10-03 2023-10-03 Ultrahaptics IP Two Limited Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation
US12265761B2 (en) 2013-10-31 2025-04-01 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
US11868687B2 (en) 2013-10-31 2024-01-09 Ultrahaptics IP Two Limited Predictive information for free space gesture control and communication
CN104656883A (en) * 2013-11-20 2015-05-27 江南大学 Gesture acquisition system based on multiple acceleration sensors and ZigBee network
CN103616954A (en) * 2013-12-06 2014-03-05 Tcl通讯(宁波)有限公司 Virtual keyboard system, implementation method and mobile terminal
CN104766038A (en) * 2014-01-02 2015-07-08 株式会社理光 Palm opening and closing action recognition method and device
CN104766038B (en) * 2014-01-02 2018-05-18 株式会社理光 Palm opening and closing movement recognition method and device
CN105900376A (en) * 2014-01-06 2016-08-24 三星电子株式会社 Home device control apparatus and control method using wearable device
US9613262B2 (en) 2014-01-15 2017-04-04 Leap Motion, Inc. Object detection and tracking for providing a virtual device experience
US10296085B2 (en) 2014-03-05 2019-05-21 Markantus Ag Relatively simple and inexpensive finger operated control device including piezoelectric sensors for gesture input, and method thereof
US12314478B2 (en) 2014-05-14 2025-05-27 Ultrahaptics IP Two Limited Systems and methods of tracking moving hands and recognizing gestural interactions
US12154238B2 (en) 2014-05-20 2024-11-26 Ultrahaptics IP Two Limited Wearable augmented reality devices with object detection and tracking
CN104102347A (en) * 2014-07-09 2014-10-15 东莞万士达液晶显示器有限公司 Fingertip positioning method and fingertip positioning terminal
US11778159B2 (en) 2014-08-08 2023-10-03 Ultrahaptics IP Two Limited Augmented reality with motion sensing
US12095969B2 (en) 2014-08-08 2024-09-17 Ultrahaptics IP Two Limited Augmented reality with motion sensing
CN104298354A (en) * 2014-10-11 2015-01-21 河海大学 Man-machine interaction gesture recognition method
US12299207B2 (en) 2015-01-16 2025-05-13 Ultrahaptics IP Two Limited Mode switching for integrated gestural interaction and multi-user collaboration in immersive virtual reality environments
CN105988566A (en) * 2015-02-11 2016-10-05 联想(北京)有限公司 Information processing method and electronic device
CN105988566B (en) * 2015-02-11 2019-05-31 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN107835970A (en) * 2015-06-02 2018-03-23 大众汽车有限公司 Navigator, user interface, and method for assisting a user when interacting with the user interface
CN104915011A (en) * 2015-06-28 2015-09-16 合肥金诺数码科技股份有限公司 Open environment gesture interaction game system
CN105224146A (en) * 2015-09-25 2016-01-06 厦门求实智能网络设备有限公司 Projection-type virtual interactive control panel and control method thereof
CN106610716A (en) * 2015-10-21 2017-05-03 华为技术有限公司 Gesture recognition method and device
US10732724B2 (en) 2015-10-21 2020-08-04 Huawei Technologies Co., Ltd. Gesture recognition method and apparatus
CN106610716B (en) * 2015-10-21 2019-08-27 华为技术有限公司 A gesture recognition method and device
CN105912974A (en) * 2015-12-18 2016-08-31 乐视致新电子科技(天津)有限公司 Gesture identification method and apparatus
CN106934818A (en) * 2015-12-31 2017-07-07 芋头科技(杭州)有限公司 Hand motion tracking method and system
CN106934333B (en) * 2015-12-31 2021-07-20 芋头科技(杭州)有限公司 Gesture recognition method and system
CN106934333A (en) * 2015-12-31 2017-07-07 芋头科技(杭州)有限公司 Gesture recognition method and system
CN106934818B (en) * 2015-12-31 2020-07-28 芋头科技(杭州)有限公司 Hand motion tracking method and system
CN106502379B (en) * 2016-09-12 2019-05-31 深圳奥比中光科技有限公司 Interaction method, interactive system, and relative depth acquisition method
CN106502379A (en) * 2016-09-12 2017-03-15 深圳奥比中光科技有限公司 Interaction method, interactive system, and relative depth acquisition method
CN106557173A (en) * 2016-11-29 2017-04-05 重庆重智机器人研究院有限公司 Dynamic gesture recognition method and device
CN106557173B (en) * 2016-11-29 2019-10-18 重庆重智机器人研究院有限公司 Dynamic gesture recognition method and device
CN108496142B (en) * 2017-04-07 2021-04-27 深圳市柔宇科技股份有限公司 Gesture recognition method and related device
WO2018184233A1 (en) * 2017-04-07 2018-10-11 深圳市柔宇科技有限公司 Hand gesture recognition method and related device
CN108496142A (en) * 2017-04-07 2018-09-04 深圳市柔宇科技有限公司 Gesture recognition method and related apparatus
US11221681B2 (en) 2017-12-22 2022-01-11 Beijing Sensetime Technology Development Co., Ltd Methods and apparatuses for recognizing dynamic gesture, and control methods and apparatuses using gesture interaction
CN108870757A (en) * 2018-06-29 2018-11-23 哈尔滨拓博科技有限公司 Water heater control device and control method based on planar gesture recognition
US11455836B2 (en) 2018-08-24 2022-09-27 Shanghai Sensetime Intelligent Technology Co., Ltd. Dynamic motion detection method and apparatus, and storage medium
CN109144260A (en) * 2018-08-24 2019-01-04 上海商汤智能科技有限公司 Dynamic action detection method, dynamic action control method and device
CN109480903A (en) * 2018-12-25 2019-03-19 无锡祥生医疗科技股份有限公司 Imaging method, apparatus, and system for ultrasonic diagnostic equipment
CN109976338A (en) * 2019-03-14 2019-07-05 山东大学 Multi-modal quadruped robot human-machine interaction system and method
CN110059580A (en) * 2019-03-27 2019-07-26 长春理工大学 Dynamic hand gesture recognition enhancement method based on Leap Motion
CN112115801B (en) * 2020-08-25 2023-11-24 深圳市优必选科技股份有限公司 Dynamic gesture recognition method and device, storage medium and terminal equipment
CN112115801A (en) * 2020-08-25 2020-12-22 深圳市优必选科技股份有限公司 Dynamic gesture recognition method, device, storage medium and terminal device
WO2022041613A1 (en) * 2020-08-25 2022-03-03 深圳市优必选科技股份有限公司 Dynamic gesture recognition method and apparatus, and storage medium and terminal device

Similar Documents

Publication Publication Date Title
CN102053702A (en) Dynamic gesture control system and method
US12141366B2 (en) Gesture recognition system and method of using same
US11093769B2 (en) Stroke extraction in free space
US10261595B1 (en) High resolution tracking and response to hand gestures through three dimensions
Ren et al. Robust part-based hand gesture recognition using kinect sensor
Ren et al. Depth camera based hand gesture recognition and its applications in human-computer-interaction
WO2021129064A9 (en) Posture acquisition method and device, and key point coordinate positioning model training method and device
CN103941866B (en) Three-dimensional gesture recognition method based on Kinect depth images
Prisacariu et al. 3D hand tracking for human computer interaction
CN102999152B (en) Gesture motion recognition method and system
Zhu et al. Vision based hand gesture recognition
Pan et al. A real-time multi-cue hand tracking algorithm based on computer vision
CN107450714A (en) Man-machine interaction support test system based on augmented reality and image recognition
CN102047203B (en) Pose-based control using 3D information extracted within an extended depth of field
US20130120250A1 (en) Gesture recognition system and method
CN111444764A (en) Gesture recognition method based on depth residual error network
Wu et al. Vision-based fingertip tracking utilizing curvature points clustering and hash model representation
CN107256083A (en) Many finger method for real time tracking based on KINECT
CN106502390A (en) Virtual human interaction system and method based on dynamic 3D handwritten digit recognition
Jain et al. Human computer interaction–Hand gesture recognition
Roy et al. Real time hand gesture based user friendly human computer interaction system
Enkhbat et al. Handkey: An efficient hand typing recognition using cnn for virtual keyboard
Schlattmann et al. Markerless 4 gestures 6 DOF real‐time visual tracking of the human hand with automatic initialization
Raees et al. Thumb inclination-based manipulation and exploration, a machine learning based interaction technique for virtual environments
Bellarbi et al. Hand gesture recognition using contour based method for tabletop surfaces

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20110511