CN102814814A - Kinect-based man-machine interaction method for two-arm robot - Google Patents

Kinect-based man-machine interaction method for two-arm robot

Info

Publication number
CN102814814A
CN102814814A, CN2012102673153A, CN201210267315A
Authority
CN
China
Prior art keywords
arm
manipulator
point
mapping point
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012102673153A
Other languages
Chinese (zh)
Other versions
CN102814814B (en)
Inventor
张平
杜广龙
李泽玲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN201210267315.3A
Publication of CN102814814A
Application granted
Publication of CN102814814B
Expired - Fee Related
Anticipated expiration

Landscapes

  • Manipulator (AREA)

Abstract


The invention discloses a non-contact, Kinect-based human-machine interaction method for a dual-arm robot, which allows an operator to control a six-degree-of-freedom virtual robot through natural three-dimensional hand motions to complete operating tasks. The paradigm uses vision-based human-machine interaction to help the operator control the robot and understand the robot's actions and the surrounding environment. The invention realizes vision-guided semi-autonomous shared control, which gives the end effector a more precise position and range for target-grasping tasks, making grasping more accurate.


Description

Kinect-based human-machine interaction method for a dual-arm robot
Technical Field
The invention relates to the technical field of human-computer interaction, and in particular to a Kinect-based human-machine interaction method for a dual-arm robot.
Background
When a site is unsuitable for human presence, an operator must teleoperate a remote robot in an unstructured, dynamic environment.

Mechanical and other contact interfaces used for teleoperation require unnatural human motions to accomplish the target task, and these motions often interfere with the operator's natural body movement.

Existing vision-based human-machine interaction modes either use only a few degrees of freedom of the hands or require the operator to perform large-scale movements, which makes operation very laborious.

Commonly used teleoperation human-machine interfaces include a variety of operator-worn devices, such as exoskeletal mechanical devices, data gloves, electromagnetic or inertial motion-tracking sensors, and electromyographic sensors. However, these worn devices, their sensors, and their wiring can hinder flexible operator action. Other contact interfaces, such as master manipulators, dials, joysticks, computer mice, and computer graphics interfaces, require unnatural actions by the operator or require the operator to learn how to use them. Vision-based interaction techniques use a control mode combining gesture recognition and voice recognition, which allows an operator to give commands more naturally and intuitively, free of physical constraints. Under such control, however, the movements of the controlled robot are composed entirely of simple commands: setting the position and orientation of the end effector, grasping, and a series of smaller tasks such as moving up and down, rotating, pausing, continuing, and changing the operation mode. This technique becomes difficult to use when the required manipulator actions are complex. Single-camera hand-motion tracking has been applied to robotic teleoperation and to teleoperation simulation, but its motion commands cover only a subset of the true end-effector degrees of freedom. Furthermore, gestures unrelated to the specific task are used to change the robot's operation mode, so the communication of operation tasks remains unnatural. Ideally, a natural approach is needed that neither decomposes complex tasks into a limited instruction set nor imposes the limitations of a contact-type physical device.
Disclosure of Invention
The invention aims to provide a Kinect-based human-machine interaction method for a dual-arm robot that addresses the technical defects of existing human-machine interaction methods.
In order to achieve the purpose, the invention adopts the technical scheme that:
A Kinect-based human-machine interaction method for a dual-arm robot is provided, where the Kinect is a 3D somatosensory (motion-sensing) camera; the method comprises the following steps:
S1, establishing a coordinate system for the operator's arm through the 3D motion-sensing camera;
S2, extracting the skeleton from the image of the operator's arm by computer;
S3, determining the positions of the index finger and the thumb through image analysis that includes the arm;
S4, solving the sudden-change (jump) problem with a critically damped vibration model;
S5, suppressing jitter with an over-damped vibration model.
Preferably, the step S1 includes:
S11, first establishing a basic coordinate system and calibrating the positions of the index fingertip, the thumb tip, the region between the index fingertip and the thumb tip, and the upper arm.
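Establishing such a body-attached coordinate system typically means building an orthonormal frame from tracked points. A small sketch under that assumption (the patent does not specify the construction):

    import numpy as np

    def frame_from_points(origin, p_axis, p_plane):
        """Build an orthonormal frame at `origin`: the X axis points toward
        p_axis; p_plane fixes the XY plane (Gram-Schmidt); Z = X x Y."""
        x = p_axis - origin
        x = x / np.linalg.norm(x)
        y = p_plane - origin
        y = y - (y @ x) * x                  # remove the X component
        y = y / np.linalg.norm(y)
        z = np.cross(x, y)
        return np.stack([x, y, z])           # rows are the axis directions

    # usage: express a point in the frame (e.g. a hand point relative to a shoulder)
    R = frame_from_points(np.zeros(3), np.array([1.0, 0, 0]), np.array([0, 1.0, 0]))
    local = R @ (np.array([0.2, 0.1, 0.0]) - np.zeros(3))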
Preferably, the step S2 includes the following step:
S21, acquiring the depth image with the Kinect and performing skeleton extraction on the depth image.
Preferably, the step S3 includes the following steps:

S31, let the operator's arm mapping points be denoted P, with the different parts labeled P1 to PN. First establish the palm bounding box: connect P4 and P5 to obtain line P45; then, taking P5 as the center of the bottom face and P45 as the central axis, construct a palm bounding box B of length L, width W, and height H. It was found statistically that for B to enclose the palm, L, W, and H need only take the values given by Eq. (1) (the equation was rendered as an image in the original and is not recoverable), where |P45| denotes the length of segment P45.

S32, for every point B' in the bounding box B, compute its distance to P5; the point B2 is then the point in B farthest from P5:

$$B_2 = \arg\max_{B' \in B} \left\| B' - P_5 \right\| \qquad (2)$$

S33, connect P5 and B2 to obtain line PB; for every point B' in B, compute its distance to line PB; the point B1 is then the point in B farthest from PB (a code sketch of steps S32-S33 follows this step list):

$$B_1 = \arg\max_{B' \in B} d\!\left(B', PB\right) \qquad (3)$$

S34, select human arm joint points as reference points; establish a left reference frame $(X_l, Y_l, Z_l)$ at the left-arm point and a right reference frame $(X_r, Y_r, Z_r)$ at the right-arm point. The coordinates of P8 relative to the left frame are given by Eq. (4), and the coordinates of P5 relative to the right frame by Eq. (5); both equations were rendered as images in the original and are not recoverable.

S35, compute the mapping point from the left hand to the left robot's motion space and from the right hand to the right robot's motion space. The motion space of the left hand relative to the left shoulder, the motion space of the right hand relative to the right shoulder, and the motion spaces of the left and right robots are each given as per-axis ranges (rendered as images in the original); the mapping point from the left hand into the left robot's motion space is given by Eq. (6), and the mapping point from the right hand into the right robot's motion space by Eq. (7).
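A minimal numpy sketch of steps S32-S33, assuming the depth point cloud has already been cropped to the set of candidate points B_pts inside the palm bounding box (the box dimensions of Eq. (1) are not recoverable, so the crop itself is not shown):

    import numpy as np

    def point_line_distance(pts, a, b):
        """Distance from each point in pts to the infinite line through a and b."""
        ab = b - a
        ab = ab / np.linalg.norm(ab)
        ap = pts - a
        perp = ap - np.outer(ap @ ab, ab)   # component orthogonal to the line
        return np.linalg.norm(perp, axis=1)

    def fingertip_candidates(B_pts, P5):
        """S32/S33: B2 is the box point farthest from P5 (one fingertip);
        B1 is the box point farthest from the line P5-B2 (the other)."""
        B2 = B_pts[np.argmax(np.linalg.norm(B_pts - P5, axis=1))]   # Eq. (2)
        B1 = B_pts[np.argmax(point_line_distance(B_pts, P5, B2))]   # Eq. (3)
        return B1, B2

    # usage with synthetic data: P5 at the origin, random palm points (meters)
    rng = np.random.default_rng(0)
    B_pts = rng.uniform(-0.05, 0.15, size=(500, 3))
    B1, B2 = fingertip_candidates(B_pts, np.zeros(3))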
Preferably, the step S4 includes the following steps:

S41, a virtual elastic force F is simulated between the arm mapping point P and the robot end E, and the manipulator end is given a virtual mass M. The position of the arm mapping point P is not equated directly with the manipulator end E; instead, E is drawn toward P by the elastic force F. The arm mapping point P acts as the driving source, the manipulator end E as the driven object, and the elastic force F acts on E so that E approaches P. The elastic force F satisfies the following relationship:

$$F = \begin{cases} K D, & D \le r \\ 0, & D > r \end{cases} \qquad (8)$$

where K is the Hooke coefficient, r is the radius of the action range of the elastic force F, and D is the distance between the arm mapping point and the robot end E. When the distance between the arm mapping point and the robot end exceeds r, the manipulator end is no longer pulled by the elastic force F. Therefore, when extraction of a human arm joint point fails or is erroneous, the joint-point position jumps, so the arm mapping point P jumps as well; the distance between P and the manipulator end E is then necessarily greater than r, the elastic force F is 0, and the manipulator end E is no longer pulled by the arm mapping point P, which filters out the sudden-change effects of failed or erroneous joint-point extraction.

S42, if the manipulator end E continued to move while the elastic force F is 0, the system would be uncontrollable. To prevent this, a virtual resistance μ is assumed at the manipulator end E, satisfying the following relationship:

$$\mu = \begin{cases} c\, v, & D \le r \\ \infty, & D > r \end{cases} \qquad (9)$$

where c > 0 is the damping coefficient. When the distance between the manipulator end E and the arm mapping point P is less than r, the resistance μ is proportional to the relative velocity v of the two points; when v is 0, the manipulator end E stops approaching the arm mapping point P. When the distance between the manipulator end E and the arm mapping point P is greater than r, the resistance is infinite and the manipulator end stops moving immediately. The equation of motion is then:

$$M \ddot{D} + c \dot{D} + K D = 0 \qquad (10)$$

S43, impose the critical-damping condition $c^2 = 4MK$ and let the sampling time be T. So that the manipulator end E reaches the arm mapping point P within each sampling time (the original parameter choice was rendered as an image and is not recoverable), the distance function is:

$$D(t) = \left(C_1 + C_2\, t\right) e^{-\frac{c}{2M} t} \qquad (10')$$

where $C_1$ and $C_2$ are integration constants; in this case the manipulator end E does not move periodically and returns to the equilibrium position, i.e. the arm mapping point, within one sampling time.
Preferably, in the step S4: when the distance D between the arm mapping point and the manipulator end exceeds r, extraction of the human arm joint point is deemed to have failed and the joint-point position to have jumped; the traction force on the manipulator end E is then 0 and, the resistance being infinite, the manipulator end E stops moving immediately. When D ≤ r, the robot control system behaves as a critically damped vibration (a short derivation follows).
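For completeness, a short derivation of why the critical-damping condition gives non-oscillatory convergence; this is standard second-order ODE theory applied to Eq. (10), not additional disclosure from the patent:

\[
M\ddot{D} + c\dot{D} + KD = 0
\;\Longrightarrow\;
Ms^{2} + cs + K = 0,
\qquad
s_{1,2} = \frac{-c \pm \sqrt{c^{2} - 4MK}}{2M}.
\]

Critical damping ($c^{2} = 4MK$) yields the repeated real root $s = -c/(2M) = -\sqrt{K/M}$, hence the general solution

\[
D(t) = \left(C_{1} + C_{2}\,t\right) e^{-\frac{c}{2M}\,t},
\]

which decays to zero without crossing it more than once: the manipulator end settles on the arm mapping point instead of oscillating around it.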
Preferably, the step S5 uses an over-damped vibration model. The over-damped vibration model is as follows: when the arm mapping point P vibrates around the manipulator end E, the attractive force that P exerts on E is insufficient to overcome the resistance on E, so E cannot be driven to move. Setting $c^2 > 4MK$ (the original condition was rendered as an image; this is the standard over-damping condition), the distance function is:

$$D(t) = C_1 e^{s_1 t} + C_2 e^{s_2 t}, \qquad s_{1,2} = \frac{-c \pm \sqrt{c^2 - 4MK}}{2M} \qquad (11)$$

where $C_1$ and $C_2$ are integration constants determined by the initial conditions. This case is over-damped vibration: no oscillation occurs, thereby eliminating the jitter phenomenon. A code sketch of the complete filter follows.
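The two damping regimes can be combined with the force law of Eqs. (8)-(9) into a single filter update. A minimal sketch, with illustrative values for the mass M, stiffness K, cutoff radius r, and step size (none of these numbers come from the patent):

    import numpy as np

    def filter_step(E, v, P, M=1.0, K=400.0, r=0.3, dt=0.005):
        """One integration step of the virtual spring-damper filter.
        E, v: manipulator-end position and velocity; P: arm mapping point.
        Uses critical damping c = 2*sqrt(M*K), i.e. c^2 = 4MK."""
        c = 2.0 * np.sqrt(M * K)
        d = P - E
        if np.linalg.norm(d) > r:
            # beyond the action radius: no traction (Eq. 8) and infinite
            # resistance (Eq. 9), so the manipulator end stops immediately
            return E, np.zeros_like(v)
        F = K * d - c * v            # spring pull toward P plus damping
        v = v + (F / M) * dt         # explicit Euler integration
        return E + v * dt, v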
Compared with the prior art, the invention has the following beneficial effects. The Kinect is a 3D motion-sensing camera (development codename "Project Natal") that provides real-time motion capture, image recognition, microphone input, voice recognition, community interaction, and other functions. The invention provides a non-contact, Kinect-based human-machine interaction method for a dual-arm robot that allows an operator to control a six-degree-of-freedom virtual robot through natural three-dimensional hand motions to complete operating tasks; the paradigm uses vision-based human-machine interaction to help the operator control the robot and understand the robot's actions and the surrounding environment. The invention realizes vision-guided semi-autonomous shared control, which gives the end effector a more precise position and range for target-grasping tasks, making grasping more accurate.
Drawings
FIG. 1 is a schematic view of the joint points of the human arm;
FIG. 2 is a left-hand coordinate system modeling diagram;
FIG. 3 is a right hand coordinate system modeling diagram.
Detailed Description
The object of the present invention is described in further detail below with reference to the drawings and specific examples; however, the embodiments of the present invention are not limited to the following examples. Unless otherwise specified, the materials and processing methods employed in the present invention are those conventional in the art.
The invention provides a Kinect-based human-machine interaction method for a dual-arm robot, where the Kinect is a 3D motion-sensing camera; the method comprises the following steps:
S1, establishing a coordinate system for the operator's arm through the 3D motion-sensing camera;
S2, extracting the skeleton from the image of the operator's arm by computer;
S3, determining the positions of the index finger and the thumb through image analysis that includes the arm;
S4, solving the sudden-change (jump) problem with a critically damped vibration model;
S5, suppressing jitter with an over-damped vibration model.
In step S1, a basic Kinect coordinate system is first established; the goal is to calibrate the positions of the index fingertip (I), the thumb tip (T), the region between the index fingertip and the thumb tip (B), and the upper arm (U).
Step S2 performs skeleton extraction on the depth image acquired by the Kinect, as shown in FIG. 1; a minimal sketch of this step follows.
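The sketch below assumes a generic depth-frame source; get_depth_frame and extract_arm_joints are hypothetical stand-ins for the Kinect SDK call and the skeletonization routine, neither of which the patent names:

    import numpy as np

    def get_depth_frame():
        """Hypothetical stand-in for the Kinect depth-stream API;
        returns a 480x640 depth image in millimeters."""
        return np.zeros((480, 640), dtype=np.uint16)

    def extract_arm_joints(depth):
        """Hypothetical skeletonization routine: returns the arm joint
        points P1..PN as 3D positions (meters) in the Kinect frame."""
        n_joints = 8   # assumption: the patent labels points P1..PN
        return {f"P{i}": np.zeros(3) for i in range(1, n_joints + 1)}

    depth = get_depth_frame()
    joints = extract_arm_joints(depth)   # e.g. joints["P5"], a palm point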
The step S3 includes the following steps:

S31, first the palm bounding box is established: connect P4 and P5 to obtain line P45 and, taking P5 as the bottom-face center and P45 as the central axis, construct a palm bounding box B of length L, width W, and height H, as shown in FIG. 2 and FIG. 3. This palm bounding box B can enclose the palm; it was found statistically that the values of L, W, and H given by Eq. (1) suffice (the equation was rendered as an image in the original and is not recoverable), where |P45| is the length of segment P45.

S32, for all points B' in the bounding box B, compute the distance to P5; B2 is the point in B farthest from P5:

$$B_2 = \arg\max_{B' \in B} \left\| B' - P_5 \right\| \qquad (2)$$

S33, connect P5 and B2 to obtain line PB; for all points B' in B, compute the distance to line PB; B1 is the point in B farthest from PB:

$$B_1 = \arg\max_{B' \in B} d\!\left(B', PB\right) \qquad (3)$$

S34, select human arm joint points as reference points; establish a left reference frame $(X_l, Y_l, Z_l)$ at the left-arm point and a right reference frame $(X_r, Y_r, Z_r)$ at the right-arm point. The coordinates of P8 relative to the left frame are given by Eq. (4) and the coordinates of P5 relative to the right frame by Eq. (5); both equations were rendered as images in the original and are not recoverable.

S35, let the motion space of the left hand relative to the left shoulder, the motion space of the right hand relative to the right shoulder, and the motion spaces of the left and right robots each be given as per-axis ranges (rendered as images in the original). The mapping point from the left hand into the left robot's motion space is given by Eq. (6), and the mapping point from the right hand into the right robot's motion space by Eq. (7); a code sketch follows.
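A sketch of the hand-to-robot mapping of step S35, under the assumption (Eqs. (6)-(7) are not recoverable from the original) that each axis is rescaled linearly from the hand's range to the robot's range; all range values are illustrative:

    import numpy as np

    def map_to_robot_space(p_hand, hand_min, hand_max, robot_min, robot_max):
        """Per-axis linear rescaling of a hand point (relative to the
        shoulder) into the robot's motion space -- an assumed form of
        Eqs. (6)/(7)."""
        t = (p_hand - hand_min) / (hand_max - hand_min)   # normalize to [0, 1]
        return robot_min + t * (robot_max - robot_min)

    # illustrative ranges in meters, not taken from the patent
    hand_min = np.array([-0.4, -0.4, 0.0]); hand_max = np.array([0.4, 0.4, 0.6])
    robot_min = np.array([-0.6, -0.6, 0.0]); robot_max = np.array([0.6, 0.6, 0.9])
    p_left_hand = np.array([0.1, -0.2, 0.3])   # e.g. P5 relative to the shoulder
    p_robot = map_to_robot_space(p_left_hand, hand_min, hand_max, robot_min, robot_max)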
The step S4 includes the following steps:

S41, a virtual elastic force F is simulated between the arm mapping point P and the robot end E, and the manipulator end is given a virtual mass M. The position of the arm mapping point P is not equated directly with the manipulator end E; instead, E is drawn toward P by the elastic force F. The arm mapping point P acts as the driving source, the manipulator end E as the driven object, and F acts on E so that E approaches P. The elastic force F satisfies the following relationship:

$$F = \begin{cases} K D, & D \le r \\ 0, & D > r \end{cases} \qquad (8)$$

where K is the Hooke coefficient, r is the radius of the force's action range, and D is the distance between the arm mapping point and the robot end. When this distance exceeds r, the manipulator end is no longer pulled by the elastic force, like a spring that breaks when stretched beyond its limit. Therefore, when extraction of a human arm joint point fails or is erroneous, the joint-point position jumps and the arm mapping point P jumps with it; the distance between P and the manipulator end E is then necessarily greater than r, the elastic force is 0, and the manipulator end is no longer pulled by the arm mapping point, which filters out the sudden changes caused by failed or erroneous joint-point extraction.

S42, when the manipulator end E is very close to the arm mapping point, or when their distance is greater than r, the elastic force is essentially 0; the manipulator end nevertheless retains some velocity and, were no other force acting on it, would keep moving in some direction, making the system uncontrollable.

To solve this problem, a virtual resistance μ is applied to the manipulator end while it moves, satisfying the following condition:

$$\mu = \begin{cases} c\, v, & D \le r \\ \infty, & D > r \end{cases} \qquad (9)$$

where c > 0 is the damping coefficient. When the distance between the manipulator end E and the arm mapping point P is less than r, the resistance is proportional to the relative velocity v of the two points, so the manipulator end stops once it has come close enough to the arm mapping point; when the distance between E and P is greater than r, the resistance is infinite and the manipulator end stops moving immediately. The equation of motion is then:

$$M \ddot{D} + c \dot{D} + K D = 0 \qquad (10)$$

S43, impose the critical-damping condition $c^2 = 4MK$. So that the manipulator end E reaches the arm mapping point P within each sampling time (the original parameter choice was rendered as an image and is not recoverable), the distance function is:

$$D(t) = \left(C_1 + C_2\, t\right) e^{-\frac{c}{2M} t} \qquad (10')$$

where $C_1$ and $C_2$ are integration constants; the manipulator end then does not move periodically and returns to the equilibrium position (the arm mapping point) within one sampling time.
S44, let the sampling time of the algorithm be T and the distance between the manipulator end E and the arm mapping point P be D. When D > r, extraction of the human arm joint point is deemed to have failed, causing a jump in the joint-point position; the traction force on the manipulator end E is then 0 and, the resistance being infinite, the manipulator end E stops moving immediately. When D ≤ r, the system is in critically damped vibration. A simulation sketch of this behavior follows.
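A self-contained simulation of this jump-rejection behavior, integrating Eq. (10) with explicit Euler steps; every numeric value (mass, stiffness, radius, step sizes) is an illustrative assumption:

    import numpy as np

    M, K, r = 1.0, 400.0, 0.3            # illustrative mass, stiffness, radius
    c = 2.0 * np.sqrt(M * K)             # critical damping: c^2 = 4MK
    dt, T = 0.001, 0.033                 # integration step and sampling time (s)

    E, v = np.zeros(3), np.zeros(3)      # manipulator-end state
    P = np.array([0.1, 0.0, 0.0])        # arm mapping point from the tracker

    for k in range(5):                   # five sampling periods
        if k == 2:
            P = np.array([5.0, 0.0, 0.0])    # simulated extraction failure
        for _ in range(int(T / dt)):
            if np.linalg.norm(P - E) > r:
                v = np.zeros(3)          # infinite resistance: stop immediately
                break
            a = (K * (P - E) - c * v) / M
            v = v + a * dt
            E = E + v * dt
        print(f"period {k}: |P - E| = {np.linalg.norm(P - E):.4f}")
    # the jump at k == 2 is rejected (E freezes) instead of being chased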
The step S5 uses an over-damped vibration model. When the arm mapping point P vibrates within a small range around the manipulator end E, the attractive force that P exerts on E is insufficient to overcome the resistance on E, so E is not driven to move; this is the over-damped vibration model. Setting $c^2 > 4MK$ (the original condition was rendered as an image; this is the standard over-damping condition), the distance function is:

$$D(t) = C_1 e^{s_1 t} + C_2 e^{s_2 t}, \qquad s_{1,2} = \frac{-c \pm \sqrt{c^2 - 4MK}}{2M} \qquad (11)$$

where $C_1$ and $C_2$ are integration constants determined by the initial conditions. This case is over-damped vibration: no oscillation occurs, and the jitter phenomenon is thereby eliminated.
The above embodiments are merely preferred embodiments of the present invention and are not intended to limit the scope of the present invention. All equivalent changes and modifications made according to the present disclosure are intended to fall within the scope of the claims of the present invention.

Claims (7)

1. A Kinect-based human-machine interaction method for a dual-arm robot, wherein the Kinect is a 3D somatosensory camera, characterized by comprising the following steps:
S1, establishing a coordinate system for the operator's arm through the 3D somatosensory camera;
S2, extracting the skeleton from the image of the operator's arm by computer;
S3, determining the positions of the index finger and the thumb through image analysis that includes the arm;
S4, solving the sudden-change (jump) problem with a critically damped vibration model;
S5, suppressing jitter with an over-damped vibration model.

2. The Kinect-based human-machine interaction method for a dual-arm robot according to claim 1, characterized in that the step S1 comprises:
S11, first establishing a basic coordinate system and calibrating the index fingertip, the thumb tip, the region between the index fingertip and the thumb tip, and the upper arm.

3. The Kinect-based markerless human-machine interaction method for a dual-arm robot according to claim 1, characterized in that the step S2 comprises the following step:
S21, acquiring a depth image with the Kinect and performing skeleton extraction on the depth image.

4. The Kinect-based human-machine interaction method for a dual-arm robot according to claim 1, characterized in that the step S3 comprises the following steps:
S31, letting the operator's arm mapping points be P, with the different parts labeled P1 to PN; first establishing the palm bounding box: connecting P4 and P5 to obtain line P45, then, taking P5 as the bottom-face center and P45 as the central axis, constructing a palm bounding box B of length L, width W, and height H; it was found statistically that for B to enclose the palm, L, W, and H need only take the values given by Eq. (1) (rendered as an image in the original), where |P45| is the length of segment P45;
S32, for all points B' in the palm bounding box B, computing the distance to P5, the point B2 being the point in B farthest from P5:
$$B_2 = \arg\max_{B' \in B} \left\| B' - P_5 \right\| \qquad (2)$$
S33, connecting P5 and B2 to obtain line PB and, for all points B' in B, computing the distance to line PB, the point B1 being the point in B farthest from PB:
$$B_1 = \arg\max_{B' \in B} d\!\left(B', PB\right) \qquad (3)$$
S34, selecting human arm joint points as reference points, establishing a left reference frame $(X_l, Y_l, Z_l)$ at the left-arm point and a right reference frame $(X_r, Y_r, Z_r)$ at the right-arm point, the coordinates of P8 relative to the left frame being given by Eq. (4) and the coordinates of P5 relative to the right frame by Eq. (5) (both rendered as images in the original);
S35, computing the mapping point from the left hand to the left robot's motion space and from the right hand to the right robot's motion space, the motion spaces of the two hands relative to the shoulders and of the two robots being given as per-axis ranges (rendered as images in the original), the left-hand mapping point being given by Eq. (6) and the right-hand mapping point by Eq. (7).

5. The Kinect-based human-machine interaction method for a dual-arm robot according to claim 1, characterized in that the step S4 comprises the following steps:
S41, simulating a virtual elastic force F between the arm mapping point P and the robot end E, the virtual mass of the manipulator end being M; the position of the arm mapping point P is not equated directly with the manipulator end E but draws it in through the elastic force F; the arm mapping point P acts as the driving source, the manipulator end E as the driven object, and F acts on E so that E approaches P; the elastic force F satisfies:
$$F = \begin{cases} K D, & D \le r \\ 0, & D > r \end{cases} \qquad (8)$$
where K is the Hooke coefficient, r is the radius of the action range of the elastic force F, and D is the distance between the arm mapping point and the robot end E; when this distance exceeds r, the manipulator end is no longer pulled by the elastic force F; thus, when extraction of a human arm joint point fails or is erroneous and the joint-point position jumps, the arm mapping point P jumps as well, the distance between P and E necessarily exceeds r, the elastic force F is 0, and the manipulator end E is no longer pulled by P, which filters out the sudden-change effects of failed or erroneous joint-point extraction;
S42, since the system would become uncontrollable if the manipulator end E kept moving while the elastic force F is 0, applying a virtual resistance μ at the manipulator end E that satisfies:
$$\mu = \begin{cases} c\, v, & D \le r \\ \infty, & D > r \end{cases} \qquad (9)$$
where c > 0 is the damping coefficient; when the distance between the manipulator end E and the arm mapping point P is less than r, the resistance μ is proportional to the relative velocity v of the two points, and when v is 0 the manipulator end E stops approaching the arm mapping point P; when the distance between E and P is greater than r, the resistance is infinite and the manipulator end stops moving immediately; the equation of motion is then:
$$M \ddot{D} + c \dot{D} + K D = 0 \qquad (10)$$
S43, imposing the critical-damping condition $c^2 = 4MK$ with sampling time T and choosing the parameters so that the manipulator end E reaches the arm mapping point P within each sampling time (the original parameter choice was rendered as an image), the distance function being:
$$D(t) = \left(C_1 + C_2\, t\right) e^{-\frac{c}{2M} t} \qquad (10')$$
where $C_1$ and $C_2$ are integration constants; the manipulator end E then does not move periodically and returns to the equilibrium position, i.e. the arm mapping point, within one sampling time.

6. The Kinect-based human-machine interaction method for a dual-arm robot according to claim 5, characterized in that, in the step S4, when the distance D exceeds r, extraction of the human arm joint point is deemed to have failed, causing a jump in the joint-point position; the traction force on the manipulator end E is then 0 and, the resistance being infinite, the manipulator end E stops moving immediately; when D ≤ r, the robot control system follows the critically damped vibration model.

7. The Kinect-based human-machine interaction method for a dual-arm robot according to claim 5, characterized in that the step S5 uses an over-damped vibration model, namely: when the arm mapping point P vibrates around the manipulator end E, the attractive force that P exerts on E is insufficient to overcome the resistance on E and cannot drive E to move; setting $c^2 > 4MK$ (the original condition was rendered as an image), the distance function is:
$$D(t) = C_1 e^{s_1 t} + C_2 e^{s_2 t}, \qquad s_{1,2} = \frac{-c \pm \sqrt{c^2 - 4MK}}{2M} \qquad (11)$$
where $C_1$ and $C_2$ are integration constants determined by the initial conditions; this case is over-damped vibration, no vibration occurs, and the jitter phenomenon is thereby eliminated.
CN201210267315.3A 2012-07-31 2012-07-31 Kinect-based man-machine interaction method for two-arm robot Expired - Fee Related CN102814814B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210267315.3A CN102814814B (en) 2012-07-31 2012-07-31 Kinect-based man-machine interaction method for two-arm robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210267315.3A CN102814814B (en) 2012-07-31 2012-07-31 Kinect-based man-machine interaction method for two-arm robot

Publications (2)

Publication Number Publication Date
CN102814814A true CN102814814A (en) 2012-12-12
CN102814814B CN102814814B (en) 2015-07-01

Family

ID=47299436

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210267315.3A Expired - Fee Related CN102814814B (en) 2012-07-31 2012-07-31 Kinect-based man-machine interaction method for two-arm robot

Country Status (1)

Country Link
CN (1) CN102814814B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102470530A (en) * 2009-11-24 2012-05-23 株式会社丰田自动织机 Method of producing teaching data of robot and robot teaching system
US8123622B1 (en) * 2011-06-03 2012-02-28 Nyko Technologies, Inc. Lens accessory for video game sensor device
CN202257985U (en) * 2011-10-27 2012-05-30 温萍萍 Interactive image device suitable for autism children

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
GUANGLONG DU ET AL.: "Markerless Kinect-Based Hand Tracking for Robot Teleoperation", International Journal of Advanced Robotic Systems, vol. 9, no. 36, 13 July 2012 (2012-07-13), pages 3-2 *
YAO ZHENGWEI: "Research on Several Issues in Large-Scale Real-Time Interactive Augmented Reality Systems", China Doctoral Dissertations Full-Text Database, 15 January 2011 (2011-01-15), page 54 *
WANG MINGDONG: "Gesture Control of a PC Based on the Kinect Skeleton-Tracking Function", Journal of Zhangzhou Institute of Technology, vol. 14, no. 2, 30 June 2012 (2012-06-30) *

Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103112007A (en) * 2013-02-06 2013-05-22 华南理工大学 Human-machine interaction method based on mixing sensor
CN103112007B (en) * 2013-02-06 2015-10-28 华南理工大学 Based on the man-machine interaction method of hybrid sensor
US11609631B2 (en) 2013-05-20 2023-03-21 Intel Corporation Natural human-computer interaction for virtual personal assistant systems
CN109584868A (en) * 2013-05-20 2019-04-05 英特尔公司 Natural Human-Computer Interaction for virtual personal assistant system
CN109584868B (en) * 2013-05-20 2022-12-13 英特尔公司 Natural Human-Computer Interaction for Virtual Personal Assistant Systems
US12099651B2 (en) 2013-05-20 2024-09-24 Intel Corporation Natural human-computer interaction for virtual personal assistant systems
US12399560B2 (en) 2013-05-20 2025-08-26 Intel Corporation Natural human-computer interaction for virtual personal assistant systems
CN103279206A (en) * 2013-06-15 2013-09-04 苏州时运机器人有限公司 Robot control system with gesture-sensing teaching machine
CN103386683A (en) * 2013-07-31 2013-11-13 哈尔滨工程大学 Kinect-based motion sensing-control method for manipulator
CN103386683B (en) * 2013-07-31 2015-04-08 哈尔滨工程大学 Kinect-based motion sensing-control method for manipulator
CN103398702A (en) * 2013-08-05 2013-11-20 青岛海通机器人系统有限公司 Mobile-robot remote control apparatus and control technology
CN103398702B (en) * 2013-08-05 2015-08-19 青岛海通机器人系统有限公司 A kind of mobile robot's remote operation controller and manipulation technology thereof
CN103440037B (en) * 2013-08-21 2017-02-08 中国人民解放军第二炮兵工程大学 Real-time interaction virtual human body motion control method based on limited input information
CN103440037A (en) * 2013-08-21 2013-12-11 中国人民解放军第二炮兵工程大学 Real-time interaction virtual human body motion control method based on limited input information
CN104308844A (en) * 2014-08-25 2015-01-28 中国石油大学(华东) Somatosensory control method of five-finger bionic mechanical arm
CN104440926A (en) * 2014-12-09 2015-03-25 重庆邮电大学 Mechanical arm somatic sense remote controlling method and mechanical arm somatic sense remote controlling system based on Kinect
CN104680525B (en) * 2015-02-12 2017-05-10 南通大学 Automatic human body fall-down detection method based on Kinect depth image
CN104680525A (en) * 2015-02-12 2015-06-03 南通大学 Automatic human body fall-down detection method based on Kinect depth image
CN104850120A (en) * 2015-03-19 2015-08-19 武汉科技大学 Wheel type mobile robot navigation method based on IHDR self-learning frame
CN104850120B (en) * 2015-03-19 2017-11-10 武汉科技大学 Wheeled mobile robot air navigation aid based on IHDR autonomous learning frameworks
CN105291138A (en) * 2015-11-26 2016-02-03 华南理工大学 Visual feedback platform improving virtual reality immersion degree
CN105719279A (en) * 2016-01-15 2016-06-29 上海交通大学 Elliptic cylinder-based human trunk modeling, arm area segmentation and arm skeleton extraction method
CN105719279B (en) * 2016-01-15 2018-07-13 上海交通大学 Based on the modeling of cylindroid trunk and arm regions segmentation and arm framework extraction method
CN105904457A (en) * 2016-05-16 2016-08-31 西北工业大学 Heterogeneous redundant mechanical arm control method based on position tracker and data glove
CN105943163A (en) * 2016-06-27 2016-09-21 重庆金山科技(集团)有限公司 Minimally invasive surgery robot and control device thereof
CN106774896B (en) * 2016-12-19 2018-03-13 吉林大学 A kind of sitting posture hand assembly line model is worth evaluating system
CN106774896A (en) * 2016-12-19 2017-05-31 吉林大学 A kind of sitting posture hand assembly line model is worth evaluating system
CN106644090A (en) * 2016-12-29 2017-05-10 中南大学 A kinect-based job search demeanor testing method and system
CN107030692A (en) * 2017-03-28 2017-08-11 浙江大学 A method and system for manipulator teleoperation based on perception enhancement
CN106971050A (en) * 2017-04-18 2017-07-21 华南理工大学 A kind of Darwin joint of robot Mapping Resolution methods based on Kinect
CN106971050B (en) * 2017-04-18 2020-04-28 华南理工大学 A Kinect-based Darwin Robot Joint Mapping Analysis Method
CN107272593A (en) * 2017-05-23 2017-10-20 陕西科技大学 A kind of robot body-sensing programmed method based on Kinect
WO2018219194A1 (en) * 2017-06-02 2018-12-06 东南大学 Cyber arm-based teleoperation system for space station robot
CN107225573A (en) * 2017-07-05 2017-10-03 上海未来伙伴机器人有限公司 The method of controlling operation and device of robot
CN108638069A (en) * 2018-05-18 2018-10-12 南昌大学 A kind of mechanical arm tail end precise motion control method
CN109108970A (en) * 2018-08-22 2019-01-01 南通大学 A kind of reciprocating mechanical arm control method based on bone nodal information
CN109108970B (en) * 2018-08-22 2021-11-09 南通大学 An Interactive Manipulator Control Method Based on Skeletal Node Information
CN109330688A (en) * 2018-12-10 2019-02-15 中山市环能缪特斯医疗器械科技有限公司 Safe self-checking type endoscope auxiliary manipulator and intelligent control system thereof
CN115922692A (en) * 2022-10-27 2023-04-07 西北工业大学 A Human-Robot Multimodal Interaction Method Oriented to Teleoperation
CN115922692B (en) * 2022-10-27 2024-08-30 西北工业大学 A human-robot multimodal interaction method for teleoperation
CN119260709A (en) * 2024-09-27 2025-01-07 泰志达智能科技(苏州)有限公司 A method for controlling a robot

Also Published As

Publication number Publication date
CN102814814B (en) 2015-07-01

Similar Documents

Publication Publication Date Title
CN102814814B (en) Kinect-based man-machine interaction method for two-arm robot
US10919152B1 (en) Teleoperating of robots with tasks by mapping to human operator pose
CN103112007B (en) Based on the man-machine interaction method of hybrid sensor
Naceri et al. Towards a virtual reality interface for remote robotic teleoperation
Liu et al. High-fidelity grasping in virtual reality using a glove-based system
Pan et al. Augmented reality-based robot teleoperation system using RGB-D imaging and attitude teaching device
CN110385694B (en) Robot motion teaching device, robot system, and robot control device
Du et al. Markerless human–robot interface for dual robot manipulators using Kinect sensor
Fritsche et al. First-person tele-operation of a humanoid robot
JP3742879B2 (en) Robot arm / hand operation control method, robot arm / hand operation control system
US20220101477A1 (en) Visual Interface And Communications Techniques For Use With Robots
Zubrycki et al. Using integrated vision systems: three gears and leap motion, to control a 3-finger dexterous gripper
Dwivedi et al. Combining electromyography and fiducial marker based tracking for intuitive telemanipulation with a robot arm hand system
Chen et al. A human–robot interface for mobile manipulator
Falck et al. DE VITO: A dual-arm, high degree-of-freedom, lightweight, inexpensive, passive upper-limb exoskeleton for robot teleoperation
CN114714358A (en) Method and system for teleoperation of mechanical arm based on gesture protocol
Li et al. Neural learning and kalman filtering enhanced teaching by demonstration for a baxter robot
CN1947960A (en) Environment-identification and proceeding work type real-man like robot
Grasshoff et al. 7dof hand and arm tracking for teleoperation of anthropomorphic robots
CN111002295A (en) A teaching glove and teaching system for a two-finger grasping robot
Tanzini et al. New interaction metaphors to control a hydraulic working machine's arm
Sharma et al. Design and implementation of robotic hand control using gesture recognition
Zhou et al. Admittance visuomotor policy learning for general-purpose contact-rich manipulations
Bai et al. Kinect-based hand tracking for first-person-perspective robotic arm teleoperation
CN110815210A (en) Novel remote control method based on natural human-computer interface and augmented reality

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150701
