
CN112704491B - Lower limb gait prediction method based on attitude sensor and dynamic capture template data - Google Patents


Info

Publication number
CN112704491B
CN112704491B (application CN202011583121.5A)
Authority
CN
China
Prior art keywords
data
angle
thigh
foot
euler
Prior art date
Legal status
Active
Application number
CN202011583121.5A
Other languages
Chinese (zh)
Other versions
CN112704491A (en)
Inventor
王念峰
张新浩
张宪民
黄伟聪
Current Assignee
Guangdong Flexwarm Advanced Materials & Technology Co ltd
South China University of Technology SCUT
Original Assignee
Guangdong Flexwarm Advanced Materials & Technology Co ltd
South China University of Technology SCUT
Priority date
Filing date
Publication date
Application filed by Guangdong Flexwarm Advanced Materials & Technology Co ltd, South China University of Technology SCUT filed Critical Guangdong Flexwarm Advanced Materials & Technology Co ltd
Priority to CN202011583121.5A
Publication of CN112704491A
Application granted
Publication of CN112704491B


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/112 Gait analysis
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/285 Selection of pattern recognition techniques, e.g. of classifiers in a multi-classifier system


Abstract



The invention relates to the field of pattern recognition and provides a lower limb gait prediction method based on an attitude sensor and motion capture template data. First, Euler angle data of the right thigh, calf and foot are collected while the subject walks, and the correspondence of these Euler angles within one gait cycle is taken as the motion capture template data. The Euler angle of the right thigh is then obtained in real time through the attitude sensor. Based on the Euler angle of the right thigh and the motion capture template data, the Euler angles of the right calf and foot corresponding to the current right thigh angle are obtained and saved as the right real-time motion capture template data. Based on the phase correspondence between the left and right legs, the Euler angles of the left thigh, calf and foot are derived from the right real-time motion capture template data. Finally, the Euler angle data of the left and right thighs, calves and feet are used to predict the motion of the lower limbs in real time. The method obtains the position information of the main parts of the lower limbs through a single attitude sensor and predicts the entire gait process, thereby reducing the cost of pattern recognition.


Description

Lower limb gait prediction method based on attitude sensor and dynamic capture template data
Technical Field
The invention relates to the field of pattern recognition, in particular to a lower limb gait prediction method based on an attitude sensor and motion capture template data, which can reasonably predict the postures of the other parts of the lower limb while using only a single attitude sensor.
Background
The control of various body aids such as exoskeletons, rehabilitation devices and smart prostheses requires information about lower limb motion. Identifying lower limb actions from the data of various sensors is therefore of great significance for controlling such peripheral devices and for evaluating human motion.
The sole pressure sensor is inexpensive and can easily be integrated into the sole of a shoe; detecting heel strike and toe-off from plantar pressure effectively segments gait cycles. An attitude sensor that integrates an inertial measurement unit and a computing chip is small and can be conveniently built into a gait recognition system: by fixing it to a lower limb segment, the Euler angles of that segment can be obtained in real time.
The thighs, calves and feet are the main components of the lower limb and, for convenience, are collectively referred to as the lower limb rods. Since there are two legs, left and right, there are six lower limb rods in total. If an attitude sensor is attached to each rod, the motion of the lower limbs can be acquired in real time, but so many sensors make the recognition system cumbersome to wear and greatly increase its cost. In many control tasks aimed at model movements, where high precision on each rod angle is not essential, it becomes important to predict the motion of the entire lower limb reasonably well with only a small number of sensors.
Disclosure of Invention
To address these problems in the prior art, the invention provides a lower limb gait prediction method based on an attitude sensor and dynamic capture template data: the position information of the main parts of the lower limbs is obtained through a single attitude sensor, the whole gait process is predicted, the goal of reasonably predicting the whole lower limb movement with a small number of sensors is achieved, and the cost of pattern recognition is reduced.
The invention is realized by the following technical scheme: the lower limb gait prediction method based on the attitude sensor and the dynamic capture template data comprises the following steps:
step S1, collecting Euler angle data of the right thigh, right calf and right foot of the human body during walking, and taking the correspondence of the Euler angle data of the thigh, calf and foot within one gait cycle as the dynamic capture template data;
step S2, fixing the attitude sensor on the front side of the right thigh of the human body and obtaining the Euler angle of the right thigh in real time;
step S3, based on the Euler angle of the right thigh and the dynamic capture template data, obtaining the Euler angles of the right calf and foot corresponding to the current right thigh angle, and storing them as the right real-time dynamic capture template data;
step S4, based on the phase correspondence of the left and right legs, obtaining the Euler angles of the left thigh, left calf and left foot from the stored right real-time dynamic capture template data;
and step S5, predicting the motion of the lower limbs in real time from the Euler angle data of the thighs, calves and feet on the left and right sides.
In a preferred embodiment, in step S3, when the Euler angle data of the calf and foot are updated from the Euler angle of the right thigh and the dynamic capture template data, the data of the right calf and foot are not updated if the angle of the right thigh exceeds the range of thigh angles in the dynamic capture template data.
In a preferred embodiment, in step S4 the angle change data of each right leg model rod over the preceding gait cycle are stored in a buffer and are called the real-time model data; then, according to the gait cycle predicted in real time and based on the real-time model data, the data offset by half of the predicted gait cycle from the current position of the right leg are taken as the data of each model rod of the left leg.
Compared with the prior art, the invention has the following beneficial effects: from the attitude sensor data of the right thigh, the dynamic capture template data measured by the motion capture system, and the phase symmetry of the left and right legs' gait, the position information (Euler angle data) of the right calf and foot and of the left thigh, calf and foot is deduced from the angle of the right thigh. The invention obtains the position information of the main parts of the lower limb through a single attitude sensor, predicts the whole gait process, achieves the goal of reasonably predicting the whole lower limb movement with a small number of sensors, and reduces the cost of pattern recognition.
Drawings
FIG. 1 is an abstract projection of a human body in the sagittal plane;
FIG. 2 is a flow chart of a lower extremity gait prediction method in an embodiment of the invention;
FIG. 3 is a graph of Euler angles of lower limb rods of the right leg in a gait cycle as measured by the motion capture system;
FIG. 4 is a schematic comparison of the knee joint and ankle joint angles before and after buffering at a point where the data change abruptly.
Detailed Description
The following describes in detail embodiments of the present invention with reference to the drawings and examples, but the embodiments of the present invention are not limited thereto.
Examples
FIG. 1 shows an abstract projection of the human body in the sagittal plane, in which seven angles θ1–θ7 represent the pose of the model rods. The model rods of the virtual lower limb include straight rods and a triangular rod. The upper body, thighs and calves are represented by straight rods. For a straight rod, the vertical direction is taken as the starting line, and the rod angle is defined as the included angle between the rod line and this vertical starting line: the angle is positive if going from the vertical starting line to the rod is counterclockwise, and negative otherwise. Since the virtual lower limb only studies the movement of the lower limbs, the entire upper body is assumed to be stationary during walking, and the angle θ7 of the model rod representing the upper body is defined as a constant 0. For the triangular rod, i.e. the abstraction of the foot, the vertical direction is likewise taken as the starting line, and the angle between the normal of the triangle's bottom edge and the starting line is taken as the angle of the triangular rod; again, counterclockwise from the starting line to the rod is positive and clockwise is negative. Apart from the upper-body rod, the angles of the remaining six model rods are θ1–θ6.
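For illustration, the following minimal Python sketch computes the signed angle of a straight model rod from two sagittal-plane end points under this convention; the coordinate orientation (x pointing in the walking direction, y pointing up) and the function name are assumptions made for the example and are not prescribed by the invention.

```python
import math

def rod_angle(upper, lower):
    """Signed angle of a straight model rod, measured from the vertical
    starting line: counter-clockwise is positive, clockwise is negative.

    upper, lower : (x, y) sagittal-plane positions of the rod's upper and
                   lower end points, with x in the walking direction and y up.
    """
    vx = lower[0] - upper[0]
    vy = lower[1] - upper[1]
    # The reference direction is straight down (an upright rod has angle 0);
    # atan2 of the 2-D cross and dot products gives the signed angle.
    return math.atan2(vx, -vy)

print(round(rod_angle((0.0, 1.0), (0.0, 0.5)), 3))   # vertical thigh      -> 0.0
print(round(rod_angle((0.0, 1.0), (0.2, 0.55)), 3))  # thigh swung forward -> ~0.418 rad
```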
The lower limb gait prediction method based on the attitude sensor and the dynamic capture template data in the embodiment as shown in fig. 2 comprises the following steps:
step S1, as shown in fig. 3, the euler angle data of the thigh, the calf, and the foot on the right side are collected by using the motion capture system, and the correspondence relationship between the euler angle data of the thigh, the calf, and the foot in one gait cycle is used as the motion capture template data.
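As an illustration of how the thigh-calf-foot correspondence over one gait cycle could be tabulated, the sketch below resamples a recorded cycle onto a uniform gait-phase grid; the 101-sample grid, the use of NumPy and the function name are assumptions of this example, not details given by the invention.

```python
import numpy as np

def build_template(t, thigh, calf, foot, n_samples=101):
    """Resample one recorded gait cycle onto a uniform 0-100 % phase axis.

    t                 : frame time stamps covering exactly one gait cycle
                        (right heel strike to the next right heel strike)
    thigh, calf, foot : right-side Euler angles per frame
    Returns a dict of equal-length arrays indexed by gait phase.
    """
    t = np.asarray(t, dtype=float)
    phase = (t - t[0]) / (t[-1] - t[0])          # 0 .. 1 over the cycle
    grid = np.linspace(0.0, 1.0, n_samples)      # uniform phase grid
    return {
        "phase": grid,
        "thigh": np.interp(grid, phase, np.asarray(thigh, dtype=float)),
        "calf":  np.interp(grid, phase, np.asarray(calf, dtype=float)),
        "foot":  np.interp(grid, phase, np.asarray(foot, dtype=float)),
    }
```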
And step S2, fixing the attitude sensor on the front side of the thigh of the right leg of the human body, and obtaining the Euler angle of the thigh on the right side in real time.
In this step, an attitude sensor capable of measuring Euler angles, angular velocity and acceleration is fixed to the front of the right thigh of the human body, so the Euler angle of the right thigh can be collected in real time while the body moves. The Euler angles are expressed in a coordinate frame that is static relative to the ground and therefore reflect the spatial motion of the lower limbs well.
And step S3, acquiring Euler angle data of right crus and feet corresponding to the current right thigh angle based on the Euler angle of the right thigh and the dynamic capture template data, and storing the Euler angle data as right real-time dynamic capture template data.
The angular relationships of the right thigh, calf and foot during walking are measured with the motion capture system, giving the angle change relationship of the right thigh, calf and foot over a whole gait cycle, i.e. the dynamic capture template data. Therefore, once the Euler angle of the right thigh is measured by the attitude sensor, the Euler angle data of the calf and foot corresponding to that thigh angle can be looked up in the dynamic capture template data.
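A possible form of this lookup is sketched below: the measured right-thigh angle is matched to the nearest template frame within a small window ahead of the previously matched frame, and the calf and foot angles of that frame are returned. The windowed nearest-neighbour search is an assumption of this example; the invention only states that the calf and foot angles corresponding to the current thigh angle are read from the template.

```python
import numpy as np

def lookup_calf_foot(template, thigh_angle, last_idx, window=15):
    """Match the measured right-thigh angle to a template frame and return the
    calf and foot angles stored for that frame.

    The same thigh angle occurs twice per cycle (leg swinging forward and
    back), so the search is restricted to a window of frames just ahead of the
    previously matched index, which keeps the match moving along the cycle.
    """
    n = len(template["thigh"])
    candidates = (last_idx + np.arange(window)) % n        # frames just ahead
    best = candidates[np.argmin(np.abs(template["thigh"][candidates] - thigh_angle))]
    return template["calf"][best], template["foot"][best], int(best)
```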
In this step, since the walking gait is not always stable, the angle of the thigh occasionally exceeds the thigh angle range in the dynamic capture template data when the Euler angle data of the calf and foot are updated from the Euler angle of the right thigh and the template data; the strategy adopted in the method is to abandon the update for that frame, i.e. the data of the right calf and foot are not updated. In addition, at the end of a gait cycle (around 100% of the cycle) the knee joint and ankle joint angles obtained from the template lookup often change abruptly, producing a visible "leg and foot jump"; at such moments the ankle and knee joint angle data at the point of the sudden change are smoothed with a buffering method. As shown in FIG. 4, let the value before the sudden change be x, the angle data after the sudden change be x'_i, and the buffered value be y_i; linear buffering then gives:

y_i = x'_i + (x − x'_i) × (1 − i / b_max)

where i is the buffer counter and b_max is the total buffer time (in frames).
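A direct implementation of this linear buffering rule is sketched below; the function name is illustrative.

```python
def buffer_angle(x, x_new, i, b_max):
    """Linear buffering of an abrupt joint-angle change.

    x      : angle value just before the sudden change
    x_new  : raw angle of the current frame after the change (x'_i)
    i      : buffer counter, running from 0 to b_max
    b_max  : total buffer time in frames
    Returns y_i = x'_i + (x - x'_i) * (1 - i / b_max): the output starts at
    the pre-change value x and reaches the raw value after b_max frames.
    """
    return x_new + (x - x_new) * (1.0 - i / b_max)
```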
And step S4, based on the phase correspondence of the left and right legs, the Euler angle data of the left thigh, calf and foot are obtained from the stored right real-time dynamic capture template data.
In this step, since the phases of the left and right legs differ by half a cycle and the swing amplitudes of the two legs remain the same during walking, the position prediction of each lower limb rod of the left leg is derived from the prediction data of the right leg according to this symmetry.
In this step, in order to give the virtual lower limb a more natural follow-up effect as the gait changes dynamically, the angle change data of each right leg model rod over the preceding gait cycle are stored in a buffer and are called the real-time model data. Then, according to the gait cycle predicted in real time, the data offset by half of the predicted gait cycle from the current position of the right leg are taken from the real-time model data as the data of each model rod of the left leg. In this way, even when the thigh angle and the gait cycle change significantly, the left and right legs still move symmetrically, making the motion of the virtual lower limb more stable.
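One way to realize this half-cycle offset is sketched below: the right-leg rod angles are pushed into a ring buffer each frame, and the left-leg angles are read from the sample that lies half of the predicted gait cycle earlier. The class name, buffer size and frame-based indexing are assumptions of this example.

```python
from collections import deque

class LeftLegPredictor:
    """Replay the right leg half a gait cycle late to obtain the left leg.

    The most recent right-leg rod angles (thigh, calf, foot) are kept in a
    ring buffer (the "real-time model data"); the left-leg angles are read
    from the sample that lies half of the predicted gait cycle behind the
    right leg's current frame.
    """

    def __init__(self, max_frames=1000):
        self.history = deque(maxlen=max_frames)

    def push_right(self, thigh, calf, foot):
        self.history.append((thigh, calf, foot))

    def left(self, predicted_cycle_frames):
        if not self.history:
            return None
        delay = int(round(predicted_cycle_frames / 2))
        if delay >= len(self.history):
            return self.history[0]          # not enough history yet: hold oldest
        return self.history[-1 - delay]     # right-leg angles, half a cycle ago
```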
And step S5, the motion of the lower limbs is predicted in real time from the Euler angle data of the six lower limb rods, namely the left and right thighs, calves and feet.
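For example, the predicted rod angles can be turned into a sagittal-plane stick figure of the virtual lower limb; the sketch below computes knee and ankle positions for one leg from the hip position and the thigh and calf rod angles. The segment lengths and the x-forward, y-up orientation are assumptions of this example, not values given by the invention.

```python
import math

# Hypothetical segment lengths in metres; the invention does not prescribe them.
THIGH_LEN, CALF_LEN = 0.45, 0.43

def leg_joint_positions(hip, thigh_angle, calf_angle):
    """Sagittal-plane knee and ankle positions for one leg.

    hip                     : (x, y) hip position, x forward and y up
    thigh_angle, calf_angle : rod angles from the vertical starting line,
                              counter-clockwise positive (Fig. 1 convention)
    """
    knee = (hip[0] + THIGH_LEN * math.sin(thigh_angle),
            hip[1] - THIGH_LEN * math.cos(thigh_angle))
    ankle = (knee[0] + CALF_LEN * math.sin(calf_angle),
             knee[1] - CALF_LEN * math.cos(calf_angle))
    return knee, ankle

# Calling this once per leg with the six predicted angles (and drawing the
# foot triangle from the ankle using the foot angle) gives a stick-figure
# virtual lower limb for every frame.
```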
In continuous walking, a complete gait cycle is taken as the interval between two consecutive heel strikes of the right foot. The plantar pressure sensor of the portable experimental system worn by the subject provides plantar pressure information, so the duration of a completed gait cycle is known from two adjacent heel strikes. In a real-time setting, however, this is not enough: a heel strike marks the start of a gait cycle, but the position of the lower limb within the cycle cannot be determined before the next heel strike, so the exact length of the current cycle is only known once it has ended. It is therefore important to estimate the length of the gait cycle at its beginning. When a heel strike is detected, the lengths of the previous five gait cycles are averaged and the average is used as the prediction of the current cycle length. Because walking is a relatively smooth process, this average estimates the current cycle length effectively and also damps the disturbance of occasional abrupt changes in the gait cycle.
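A sketch of this cycle-length prediction is given below; the fallback value used before five cycles have been observed is an assumption of the example.

```python
from collections import deque

class CycleLengthEstimator:
    """Predict the current gait cycle length at each right heel strike."""

    def __init__(self, n_cycles=5, default_length=1.1):
        self.lengths = deque(maxlen=n_cycles)   # last completed cycle lengths
        self.default = default_length           # seconds, used before any data
        self.last_strike = None

    def on_heel_strike(self, t):
        """Register a right heel strike at time t and return the prediction."""
        if self.last_strike is not None:
            self.lengths.append(t - self.last_strike)
        self.last_strike = t
        return self.predicted_length()

    def predicted_length(self):
        if not self.lengths:
            return self.default
        return sum(self.lengths) / len(self.lengths)
```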
However, when the gait changes significantly, both the gait cycle and the range of the thigh angle change markedly, so if the model rod angles of the left leg simply trailed the right leg by half a cycle of the template data, the left and right leg gaits would become uncoordinated. To give the virtual lower limb a more natural follow-up effect under such dynamic changes, the angle change data of each right leg model rod over the preceding cycle are stored in a buffer as the real-time model data; then, according to the cycle predicted in real time, the data offset by half of the predicted cycle from the current position of the right leg are taken from the real-time model data as the data of each model rod of the left leg. In this way, even when the thigh angle and the gait cycle change significantly, the left and right legs still move symmetrically, and the motion of the virtual lower limb is more stable.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (4)

1. The lower limb gait prediction method based on the attitude sensor and the dynamic capture template data is characterized by comprising the following steps of:
step S1, collecting Euler angle data of the right thigh, right calf and right foot of the human body during walking, and taking the correspondence of the Euler angle data of the thigh, calf and foot within one gait cycle as the dynamic capture template data;
step S2, fixing the attitude sensor on the front side of the right thigh of the human body and obtaining the Euler angle of the right thigh in real time;
step S3, based on the Euler angle of the right thigh and the dynamic capture template data, obtaining the Euler angles of the right calf and foot corresponding to the current right thigh angle, and storing them as the right real-time dynamic capture template data;
step S4, based on the phase correspondence of the left and right legs, obtaining the Euler angles of the left thigh, left calf and left foot from the stored right real-time dynamic capture template data;
step S5, predicting the motion of the lower limbs in real time according to the Euler angle data of the thighs, the shanks and the feet on the left and right sides;
wherein step S4 comprises storing the angle change data of each right leg model rod over the preceding gait cycle in a buffer, the stored data being called the real-time model data; and, according to the gait cycle predicted in real time and based on the real-time model data, taking the data offset by half of the predicted gait cycle from the current position of the right leg as the data of each model rod of the left leg.
2. The lower limb gait prediction method according to claim 1, wherein step S1 is performed by using a motion capture system to collect euler angle data of the right thigh, calf and foot.
3. The lower limb gait prediction method according to claim 1, wherein in step S3, when the Euler angle data of the calf and foot are updated based on the Euler angle of the right thigh and the dynamic capture template data, the data of the right calf and foot are not updated when the angle of the right thigh exceeds the thigh angle in the dynamic capture template data.
4. The lower limb gait prediction method according to claim 1 or 3, wherein in step S3, when the angle of the knee joint or the ankle joint changes abruptly, the ankle joint and knee joint angle data at the point of the sudden change are buffered using a buffering method; letting the value before the sudden change be x, the angle data after the sudden change be x'_i, and the buffered value be y_i, linear buffering gives:

y_i = x'_i + (x − x'_i) × (1 − i / b_max)

where i is the buffer counter and b_max is the total buffer time.
CN202011583121.5A 2020-12-28 2020-12-28 Lower limb gait prediction method based on attitude sensor and dynamic capture template data Active CN112704491B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011583121.5A CN112704491B (en) 2020-12-28 2020-12-28 Lower limb gait prediction method based on attitude sensor and dynamic capture template data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011583121.5A CN112704491B (en) 2020-12-28 2020-12-28 Lower limb gait prediction method based on attitude sensor and dynamic capture template data

Publications (2)

Publication Number Publication Date
CN112704491A CN112704491A (en) 2021-04-27
CN112704491B (en) 2022-01-28

Family

ID=75545883

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011583121.5A Active CN112704491B (en) 2020-12-28 2020-12-28 Lower limb gait prediction method based on attitude sensor and dynamic capture template data

Country Status (1)

Country Link
CN (1) CN112704491B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113509174B (en) * 2021-05-27 2023-09-01 中国科学院深圳先进技术研究院 Method, apparatus and storage medium for estimating joint angle
CN113962247B (en) * 2021-09-30 2024-05-03 华南理工大学 Gait prediction method and system based on standard library matching
CN114376566A (en) * 2022-02-16 2022-04-22 常州大学 Symmetry evaluation method for lower limb segments during hand load
CN114587346B (en) * 2022-03-25 2024-07-12 中电海康集团有限公司 Human lower limb movement monitoring method and system based on IMU
CN114558280B (en) * 2022-04-24 2022-08-02 之江实验室 Multi-scene intelligent sports equipment based on double-leg posture prediction and use method thereof
CN115192002A (en) * 2022-06-22 2022-10-18 清华大学 User's limb movement disorder degree measurement system, method and computer device
CN119424158A (en) * 2024-11-20 2025-02-14 华中科技大学 A foot drop assisting method based on flexible ankle exoskeleton

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2500938A1 (en) * 2004-03-24 2005-09-24 Rohm And Haas Company Memory devices based on electric field programmable films
JP2006293985A (en) * 2005-03-15 2006-10-26 Fuji Photo Film Co Ltd Program, apparatus and method for producing album
US9067097B2 (en) * 2009-04-10 2015-06-30 Sovoz, Inc. Virtual locomotion controller apparatus and methods
CN102039024A (en) * 2009-10-19 2011-05-04 上海理工大学 Evaluating and training method for balancing function during crouching and standing
US9330342B2 (en) * 2012-12-10 2016-05-03 The Regents Of The University Of California On-bed monitoring system for range of motion exercises with a pressure sensitive bed sheet
CN103083027B (en) * 2013-01-10 2014-08-27 苏州大学 Gait phase distinguishing method based on lower limb joint movement information
US10258377B1 (en) * 2013-09-27 2019-04-16 Orthex, LLC Point and click alignment method for orthopedic surgeons, and surgical and clinical accessories and devices
US20170225033A1 (en) * 2015-06-23 2017-08-10 Ipcomm Llc Method and Apparatus for Analysis of Gait and to Provide Haptic and Visual Corrective Feedback
CN105456000B (en) * 2015-11-10 2018-09-14 华南理工大学 A kind of ambulation control method of wearable bionic exoskeleton pedipulator convalescence device
US20170243354A1 (en) * 2016-02-19 2017-08-24 Xerox Corporation Automatic frontal-view gait segmentation for abnormal gait quantification
CN108098736A (en) * 2016-11-24 2018-06-01 广州映博智能科技有限公司 A kind of exoskeleton robot auxiliary device and method based on new perception
CN106492896B (en) * 2016-12-27 2019-03-01 安徽理工大学 A kind of liquid-transfering device for volatile liquid
CN109528212B (en) * 2018-12-29 2023-09-19 大连乾函科技有限公司 Abnormal gait recognition equipment and method
CN110013370B (en) * 2019-05-24 2021-08-03 国家康复辅具研究中心 A method and device for determining the alignment effect of a lower limb prosthesis
CN110279419A (en) * 2019-06-24 2019-09-27 山西省信息产业技术研究院有限公司 A kind of device carrying out characteristics of human body's analysis based on gait
CN110558992B (en) * 2019-07-30 2022-04-12 福建省万物智联科技有限公司 Gait detection analysis method and device
CN110755070B (en) * 2019-08-28 2022-07-05 北京精密机电控制设备研究所 Multi-sensor fusion-based lower limb movement pose rapid prediction system and method
CN110509261B (en) * 2019-08-28 2022-05-03 沈阳航空航天大学 A leg exoskeleton device that assists the mobility of the elderly
CN110522458A (en) * 2019-10-15 2019-12-03 北京理工大学 A real-time gait recognition method for knee exoskeleton
CN111027432B (en) * 2019-12-02 2022-10-04 大连理工大学 A Vision-Following Robot Method Based on Gait Features
CN111012358B (en) * 2019-12-26 2023-02-10 浙江福祉科创有限公司 Human ankle joint motion trajectory measurement method and wearable device
CN110916679B (en) * 2019-12-31 2022-07-15 复旦大学 Device and method for detecting posture and gait of human lower limbs
CN111481197B (en) * 2020-04-22 2021-01-26 东北大学 A living-machine multimode information acquisition fuses device for man-machine natural interaction
CN111588597A (en) * 2020-04-22 2020-08-28 百年旭康医疗器械有限公司 Intelligent interactive walking training system and implementation method thereof
CN111714129A (en) * 2020-05-07 2020-09-29 广西科技大学 Human Gait Information Collection System
CN111805511B (en) * 2020-05-25 2021-09-14 浙江大学 Lower limb exoskeleton system with actively adjustable leg rod length and control method thereof

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202604831U (en) * 2012-04-26 2012-12-19 西北大学 Limb motion parameter collecting and processing device
CN102801924A (en) * 2012-07-20 2012-11-28 合肥工业大学 Television program host interaction system based on Kinect
CN107491259A (en) * 2016-06-11 2017-12-19 苹果公司 Activity and body-building renewal
CN107536613A (en) * 2016-06-29 2018-01-05 深圳光启合众科技有限公司 Robot and its human body lower limbs Gait Recognition apparatus and method
CN107180235A (en) * 2017-06-01 2017-09-19 陕西科技大学 Human Action Recognition Algorithm Based on Kinect
CN109343713A (en) * 2018-10-31 2019-02-15 重庆子元科技有限公司 A kind of human action mapping method based on Inertial Measurement Unit
CN109991979A (en) * 2019-03-29 2019-07-09 华中科技大学 A gait planning method for lower limb robots for complex environments
CN110537921A (en) * 2019-08-28 2019-12-06 华南理工大学 A Portable Gait Multi-Sensing Data Acquisition System
CN111248919A (en) * 2020-01-20 2020-06-09 深圳市丞辉威世智能科技有限公司 Gait recognition method, device, equipment and readable storage medium
CN111427697A (en) * 2020-03-18 2020-07-17 深圳市瑞立视多媒体科技有限公司 Motion capture method, device and equipment based on multithreading and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"基于惯性数据的人体姿态实时三维重构关键技术研究";韩梅;《中国优秀硕士学位论文全文数据库信息科技辑》;20180615;第I138-2017页 *
"基于足底压力的坡道步态失稳自适应平衡研究";司莹;《中国优秀硕士学位论文全文数据库医药卫生科技辑》;20170215;第E060-654页 *

Also Published As

Publication number Publication date
CN112704491A (en) 2021-04-27

Similar Documents

Publication Publication Date Title
CN112704491B (en) Lower limb gait prediction method based on attitude sensor and dynamic capture template data
Komura et al. Simulating pathological gait using the enhanced linear inverted pendulum model
Hao et al. Smoother-based 3-D foot trajectory estimation using inertial sensors
CN106767790B (en) A Method for Estimating Pedestrian Movement Tracking by Fusion of Human Lower Limb Motion Model and Kalman Filter
CN113268141B (en) A motion capture method and device based on inertial sensors and fabric electronics
CN107194193A (en) A kind of ankle pump motion monitoring method and device
CN112405504B (en) Exoskeleton robot
CN107389052A (en) A kind of ankle pump motion monitoring system and terminal device
US20240115164A1 (en) Detection device, detection method, and program recording medium
CN118537915B (en) Lower limb exoskeleton gait trajectory planning method and device based on self-oscillation
CN111267071A (en) Multi-joint combined control system and method for exoskeleton robot
JP7398090B2 (en) Information processing device, calculation method and program
CN116206358A (en) A method and system for predicting movement patterns of lower extremity exoskeleton based on VIO system
JP2013208292A (en) Walking assistance device and walking assistance program
Ahmadi et al. Human gait monitoring using body-worn inertial sensors and kinematic modelling
CN117103260B (en) Gait control method and device for wearable auxiliary load robot
KR20190008519A (en) Walking profiler system and method
CN115479601B (en) A three-dimensional space positioning method, system, device and storage medium
JP2013208291A (en) Walking assistance device and walking assistance program
CN113146611B (en) A motion pattern recognition method for rigid-flexible coupled exoskeleton robot
Kanjanapas et al. 7 degrees of freedom passive exoskeleton for human gait analysis: Human joint motion sensing and torque estimation during walking
JP2014161390A (en) Device and method for estimating walking state
Firmani et al. A framework for the analysis and synthesis of 3D dynamic human gait
JP7480852B2 (en) Calculation device, gait measurement system, calculation method, and program
KR102695282B1 (en) Method and apparatus for severity determination of stroke

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant