
CN111096830A - An exoskeleton gait prediction method based on LightGBM - Google Patents


Info

Publication number
CN111096830A
Authority
CN
China
Prior art keywords
prediction
flexion
left lower
extension angle
lower limb
Prior art date
Legal status
Granted
Application number
CN201911384974.3A
Other languages
Chinese (zh)
Other versions
CN111096830B (en)
Inventor
孔万增
王伟富
宋国明
王雪岩
Current Assignee
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date
Filing date
Publication date
Application filed by Hangzhou Dianzi University
Priority claimed from CN201911384974.3A
Publication of CN111096830A
Application granted
Publication of CN111096830B
Legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F 2/50 Prostheses not implantable in the body
    • A61F 2/60 Artificial legs or feet or parts thereof
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B 5/112 Gait analysis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F 2/50 Prostheses not implantable in the body
    • A61F 2/68 Operating or control means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H 3/00 Appliances for aiding patients or disabled persons to walk about
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/30 Noise filtering
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G06V 40/23 Recognition of whole body movements, e.g. for sport training
    • G06V 40/25 Recognition of walking or running movements, e.g. gait recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F 2/50 Prostheses not implantable in the body
    • A61F 2/68 Operating or control means
    • A61F 2/70 Operating or control means electrical
    • A61F 2002/704 Operating or control means electrical computer-controlled, e.g. robotic control

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Physics & Mathematics (AREA)
  • Transplantation (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Cardiology (AREA)
  • Vascular Medicine (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Physiology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Social Psychology (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Psychiatry (AREA)
  • Dentistry (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Epidemiology (AREA)
  • Pain & Pain Management (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Rehabilitation Therapy (AREA)
  • Orthopedic Medicine & Surgery (AREA)

Abstract

The invention discloses an exoskeleton gait prediction method based on LightGBM. Through human-body kinematic analysis, acceleration data of the lower-limb joints are extracted for offline analysis, and the joint control target trajectory is then computed; finally, the control system performs motion control along this trajectory, realizing human gait motion of the lower-limb exoskeleton. The invention proposes a brand-new lower-limb gait trajectory prediction method applicable to lower-limb exoskeleton control: it innovatively feeds LightGBM through a parallel-structure input and predicts continuous target values of the gait trajectory of the lower-limb joints, achieving strong accuracy while reducing training time.

Description

Exoskeleton gait prediction method based on LightGBM
Technical Field
The invention belongs to the field of human-machine cooperative motion control of lower-limb exoskeletons, and relates to a motion gait trajectory prediction method based on the Light Gradient Boosting Machine (LightGBM).
Background
The lower-limb exoskeleton robot is a typical human-machine integrated system worn on the user's lower limbs. It integrates detection, control, information fusion and other robot technologies, combines the user's intelligence with the robot's physical strength, and provides power to assist the user in moving. In the civil field, exoskeleton robots can help the elderly move normally; in the medical field, they can assist the disabled in daily life and greatly reduce the workload of medical staff; in the military field, they can improve battlefield rescue efficiency and help more of the injured. Playing a significant role in so many fields, the exoskeleton robot has a very broad development prospect.
At present, two kinds of input are used to predict the gait trajectory of an exoskeleton robot. The first measures human biological signals such as the electroencephalogram (EEG) or electromyogram (EMG) as the input of the exoskeleton control system; although these biological signals precede the actual movement and can solve the problem of motion lag, EEG and EMG signals are unstable and easily disturbed by static electricity, sweat and the like. The second uses physical sensors for angle/angular velocity, force/moment and so on to acquire human kinematic or dynamic data in real time as the control-system input. For the gait trajectory prediction problem, traditional methods include the Support Vector Machine (SVM), Kalman filtering and the like; they involve excessive computation, occupy much memory and are prone to overfitting, and the prediction of continuous target values of the gait trajectory still needs further improvement.
Therefore, it is necessary to provide a lower-extremity exoskeleton gait prediction method that improves the accuracy of continuous-target-value prediction of gait trajectories, improves trajectory smoothness and reduces the amount of computation.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a gait trajectory prediction method based on an improved LightGBM, which extracts acceleration data of the lower-limb joints for offline analysis through human kinematic analysis, then computes the joint control target trajectory, and finally lets the control system drive the motion along this trajectory, realizing human gait motion of the lower-limb exoskeleton.
In order to achieve the purpose, the technical scheme of the invention comprises the following specific contents:
a LightGBM-based exoskeleton gait prediction method, the method comprising the steps of:
step (1), data acquisition stage
Five IMU sensors are arranged on the user's waist, left thigh, left calf, right thigh and right calf respectively; the x- and y-axis acceleration values of the 5 IMU sensors are acquired in real time;
step (2), data analysis and noise elimination smoothing processing
A matrix of 10×K acceleration components is constructed from the x- and y-axis acceleration values of the 5 IMU sensors of step (1), defined as:

R = [V_1, V_2, …, V_i, …, V_10]^T, where T denotes the matrix transpose

and V_i = [v_i(1), v_i(2), …, v_i(K)], with v_i(t) the value of the i-th acceleration component at time t and K the total number of data sampling points;

moving-average filtering is applied to V_i for noise removal; letting the filtered value of v_i(t) be v̂_i(t), then:

v̂_i(t) = (1/L) · Σ_{j=−M}^{M} v_i(t+j)    (2.1)

where L is the number of original acceleration values taken in the moving-average filtering, L is odd, and M = (L−1)/2;

according to formula (2.1), the noise-removed data of V_i at all times t (t = 1, 2, …, K) are finally obtained:

V̂_i = [v̂_i(1), v̂_i(2), …, v̂_i(K)]
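A minimal sketch of the centered moving-average filter of formula (2.1). Edge handling is an assumption here (the text does not specify it): near the boundaries the average is taken over whatever samples exist.

```python
import numpy as np

def moving_average_filter(v, L):
    """Centered moving average of odd window length L, as in eq. (2.1).

    Edge samples, where the full window of L values is unavailable, are
    averaged over the samples that do exist (an assumption; the source
    does not state its edge handling).
    """
    assert L % 2 == 1, "L must be odd"
    M = (L - 1) // 2
    v = np.asarray(v, dtype=float)
    out = np.empty_like(v)
    for t in range(len(v)):
        lo, hi = max(0, t - M), min(len(v), t + M + 1)
        out[t] = v[lo:hi].mean()  # mean over the available window
    return out

noisy = np.array([1.0, 2.0, 9.0, 2.0, 1.0])
print(moving_average_filter(noisy, 3))  # ≈ [1.5, 4.0, 4.33, 4.0, 1.5]
```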
Step (3), acquiring the real-time flexion-extension angles in the gait data
From the noise-removed data of step (2), taking the left lower limb as an example, the hip flexion-extension angle α_left(t) and the knee flexion-extension angle β_left(t) of the left lower limb at a time t are obtained from formulas (3.1) and (3.2) respectively (the formulas appear in the original only as equation images), where a_x1, a_y1 are the acceleration components of the left-thigh IMU sensor at time t after the processing of step 2, and a_x2, a_y2 are those of the left-calf IMU sensor at time t after the processing of step 2;

the flexion-extension angle vectors α_left and β_left of the left hip and knee joints at all K times are finally obtained:

α_left = [α_left(1), α_left(2), …, α_left(K)]
β_left = [β_left(1), β_left(2), …, β_left(K)]

The flexion-extension angle vectors of the right hip and knee joints are obtained in the same way;
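Formulas (3.1) and (3.2) survive in this text only as images, so the following is a hypothetical reading based on the standard quasi-static accelerometer tilt estimate (segment angle from the arctangent of the x/y gravity components), not the patent's confirmed equations:

```python
import numpy as np

def segment_tilt(ax, ay):
    # Tilt of a limb segment from its accelerometer, assuming the
    # accelerometer mostly senses gravity: theta = atan2(a_x, a_y).
    return np.arctan2(ax, ay)

def left_leg_angles(ax1, ay1, ax2, ay2):
    """One plausible form of eqs. (3.1)/(3.2) -- an assumption, since the
    original equations are only figure images: hip angle = thigh tilt,
    knee angle = shank tilt minus thigh tilt."""
    alpha = segment_tilt(ax1, ay1)         # hip flexion-extension angle
    beta = segment_tilt(ax2, ay2) - alpha  # knee flexion-extension angle
    return alpha, beta
```

With the thigh vertical (a_x1 = 0, a_y1 = 1) the hip angle comes out zero, which matches the expected rest pose under this reading.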
Step (4), real-time prediction of continuous target values with the improved LightGBM method
Taking the left-hip flexion-extension angle α_left as an example:
4.1 Prediction model training
The left-hip flexion-extension angle vector obtained in step 3 is used as the training set for the prediction models of the left-hip flexion-extension angle;
first, a sliding window extracts angle values from α_left to build the left-hip feature matrix A_α,left used to train the prediction models, see formula (4.1):

A_α,left: row j = [α_left(1+(j−1)·pLen), α_left(2+(j−1)·pLen), …, α_left(w+(j−1)·pLen)], j = 1, 2, …, N    (4.1)

where w is the window width of the sliding window, N is the number of sliding windows, and the prediction length pLen (pLen > 1) is taken as the sliding step; to guarantee that the target matrix can be built, K − [(N−1)·pLen + w] ≥ pLen must hold, i.e. α_left must end with enough angle values to construct predicted values over pLen prediction lengths;
then, from the prediction length and the flexion-extension angle values of α_left at the future times of each sliding window, the matrix D_α,left is constructed as the prediction-target matrix for training, see formula (4.2):

D_α,left = [d_1, d_2, …, d_pLen], with d_i = [α_left(w+(j−1)·pLen+i)], j = 1, 2, …, N    (4.2)

where d_i is the target-value vector corresponding to the i-th future time of the left-hip flexion-extension angle;
substituting the left-hip feature matrix and target matrix built by formulas (4.1) and (4.2) into formula (4.3) yields pLen prediction models:

F_i = T(A_α,left, d_i), i = 1, 2, …, pLen    (4.3)

where T(·) is the LightGBM training function and F_i is the trained prediction model (function) corresponding to the i-th future time of the left-hip flexion-extension angle;
4.2 Parallel prediction with the prediction models
Let the feature vector in the current, most recent sliding window of the left hip be x = [α_left(K−w+1), …, α_left(K)]; connecting the pLen trained prediction models in a parallel structure realizes the prediction of the left-hip flexion-extension angle at the pLen future times:

p_i = F_i(x), i = 1, 2, …, pLen    (4.4)

where p_i is the predicted left-hip flexion-extension angle at the i-th future time; from formula (4.4) the left-hip flexion-extension angle prediction vector P_left is obtained:

P_left = [p_1, p_2, …, p_pLen]

Parallel flexion-extension-angle prediction models of the left knee, right hip and right knee are obtained in the same way;
Step (5): the gait prediction trajectory is realized from the prediction vectors of step 4.
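The sliding-window construction of formulas (4.1) and (4.2) and the parallel bank of per-step models can be sketched as follows. `build_windows`, `ParallelPredictor` and `_LastValue` are illustrative names; in the intended setup `make_model` would be `lightgbm.LGBMRegressor`, while `_LastValue` is a trivial stand-in so the sketch runs without LightGBM installed:

```python
import numpy as np

def build_windows(angles, w, plen):
    """Feature matrix A (eq. 4.1) and target matrix D (eq. 4.2): windows
    of width w slide with step pLen; each window's targets are the pLen
    angle values that follow it."""
    K = len(angles)
    N = (K - w - plen) // plen + 1  # number of complete windows
    A = np.array([angles[j * plen : j * plen + w] for j in range(N)])
    D = np.array([angles[j * plen + w : j * plen + w + plen] for j in range(N)])
    return A, D  # shapes (N, w) and (N, pLen)

class ParallelPredictor:
    """pLen independent regressors in parallel, one per future step, as in
    the improved-LightGBM structure of step 4."""
    def __init__(self, make_model, plen):
        self.models = [make_model() for _ in range(plen)]
    def fit(self, A, D):
        for i, m in enumerate(self.models):
            m.fit(A, D[:, i])  # model i learns the i-th step ahead
        return self
    def predict(self, window):
        x = np.asarray(window).reshape(1, -1)
        return np.array([m.predict(x)[0] for m in self.models])

class _LastValue:
    """Trivial stand-in regressor (last window sample plus the mean
    target offset) so the sketch runs without LightGBM."""
    def fit(self, X, y):
        self.offset = float(np.mean(y - X[:, -1]))
        return self
    def predict(self, X):
        return X[:, -1] + self.offset

angles = np.sin(np.linspace(0, 4 * np.pi, 200))  # synthetic gait-like angle
A, D = build_windows(angles, w=20, plen=5)
pp = ParallelPredictor(_LastValue, plen=5).fit(A, D)
pred = pp.predict(angles[-20:])  # pLen future angle values, in parallel
print(pred.shape)  # (5,)
```

Swapping `_LastValue` for `lightgbm.LGBMRegressor` (same `fit`/`predict` interface) would reproduce the trained parallel LightGBM structure described above.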
The invention has the beneficial effects that:
the invention provides a brand-new lower limb movement gait track prediction method which can be applied to lower limb exoskeleton control, innovatively realizes parallel structure input to LightGBM, predicts the gait track of a continuous target value of a lower limb joint, has high accuracy and reduces training time.
Drawings
FIG. 1 compares four raw data series with the corresponding noise-removed data; (a) left-calf acceleration x; (b) left-calf acceleration y; (c) left-thigh acceleration x; (d) left-thigh acceleration y;
FIG. 2 is a diagram of a model of a lower limb structure of a human body;
FIG. 3 is a comparison of the Kalman filtering, XGBoost and LightGBM methods, where (a) is RMSE, (b) is SC, and (c) is training time;
fig. 4(a) and (b) are comparison results of verification of hip joint and knee joint of left lower limb, respectively.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, embodiments of the present invention are described in further detail below with reference to the accompanying drawings.
A LightGBM-based exoskeleton gait prediction method comprises the following steps:
1. data acquisition phase
Five IMU sensors are arranged on the user's waist, left thigh, left calf, right thigh and right calf respectively. According to the structure of the human lower limb, the simplified lower-limb model of FIG. 2 is obtained; the 5 black dots mark the positions of the 5 IMU sensors. During walking, each IMU provides the acceleration components of its leg segment in a two-dimensional plane; the hip joint takes the counterclockwise direction as positive and the knee joint takes the clockwise direction as positive, so the x- and y-axis acceleration values of the 5 IMU sensors are obtained in real time.
The acquisition equipment consists of the 5 IMU sensors arranged on the right calf, right thigh, left calf, left thigh and waist; the sampling rate is 100 Hz, and the subject walks in a straight line at a pace of 2 km/h.
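As a sketch, the 10×K matrix R of step 2 can be assembled from the five IMU streams as below; `build_acceleration_matrix` is an illustrative name and the data are synthetic:

```python
import numpy as np

def build_acceleration_matrix(imu_xy):
    """Stack the x/y acceleration streams of 5 IMUs into the 10 x K
    matrix R of step 2.

    imu_xy: list of 5 (x_series, y_series) pairs, one per IMU
    (waist, left thigh, left calf, right thigh, right calf),
    each series holding K samples at 100 Hz.
    """
    rows = []
    for x_series, y_series in imu_xy:
        rows.append(np.asarray(x_series, dtype=float))
        rows.append(np.asarray(y_series, dtype=float))
    return np.vstack(rows)  # shape (10, K)

# Example with K = 4 synthetic samples per axis:
demo = [(np.arange(4), np.arange(4) + 0.5) for _ in range(5)]
R = build_acceleration_matrix(demo)
print(R.shape)  # (10, 4)
```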
2. Data analysis and noise cancellation smoothing
A matrix of 10×K acceleration components is constructed from the x- and y-axis acceleration values of the 5 IMU sensors of step (1), defined as:

R = [V_1, V_2, …, V_i, …, V_10]^T, where T denotes the matrix transpose

and V_i = [v_i(1), v_i(2), …, v_i(K)], with v_i(t) the value of the i-th acceleration component at time t and K the total number of data sampling points;

moving-average filtering is applied to V_i for noise removal; letting the filtered value of v_i(t) be v̂_i(t), then:

v̂_i(t) = (1/L) · Σ_{j=−M}^{M} v_i(t+j)    (2.1)

where L is the number of original acceleration values taken in the moving-average filtering, L is odd, and M = (L−1)/2;

according to formula (2.1), the noise-removed data of V_i at all times t (t = 1, 2, …, K) are finally obtained:

V̂_i = [v̂_i(1), v̂_i(2), …, v̂_i(K)]
In actual operation, if v̂_i(t) is computed directly from formula (2.1), L summation operations are required at each time, and the time complexity is high. The moving average filter can instead be implemented by a recursive algorithm.
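The recursive implementation alluded to above can be sketched as follows: the window sum is updated with one incoming and one outgoing sample per step, so each interior output costs O(1) instead of O(L):

```python
import numpy as np

def maf_recursive(v, L):
    """Recursive centered moving average of odd window length L.

    Maintains the running window sum and slides it by one sample per
    step. Interior samples only; edge positions are left as NaN here
    for brevity."""
    M = (L - 1) // 2
    v = np.asarray(v, dtype=float)
    K = len(v)
    out = np.full(K, np.nan)
    s = v[:L].sum()        # window centered at t = M
    out[M] = s / L
    for t in range(M + 1, K - M):
        s += v[t + M] - v[t - M - 1]  # slide the window by one sample
        out[t] = s / L
    return out
```

For interior samples this produces exactly the same values as the direct formula (2.1), at a fraction of the cost for large L.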
Applying the MAF algorithm to part of the acceleration-component data, the four raw data series are plotted against the noise-removed data, as shown in fig. 1.
3. Real-time joint angle calculation of gait data
From the noise-removed data of step (2), taking the left lower limb as an example, the hip flexion-extension angle α_left(t) and the knee flexion-extension angle β_left(t) of the left lower limb at a time t are obtained from formulas (3.1) and (3.2) respectively (the formulas appear in the original only as equation images), where a_x1, a_y1 are the acceleration components of the left-thigh IMU sensor at time t after the processing of step 2, and a_x2, a_y2 are those of the left-calf IMU sensor at time t after the processing of step 2;

the flexion-extension angle vectors α_left and β_left of the left hip and knee joints at all K times are finally obtained:

α_left = [α_left(1), α_left(2), …, α_left(K)]
β_left = [β_left(1), β_left(2), …, β_left(K)]

The flexion-extension angle vectors of the right hip and knee joints are obtained in the same way from formulas (3.3) and (3.4) (also given only as equation images), where a_x3, a_y3 are the acceleration components of the right-thigh IMU sensor at time t after the processing of step 2, and a_x4, a_y4 are those of the right-calf IMU sensor at time t after the processing of step 2;

the flexion-extension angle vectors α_right and β_right of the right hip and knee joints at all K times are finally obtained:

α_right = [α_right(1), α_right(2), …, α_right(K)]
β_right = [β_right(1), β_right(2), …, β_right(K)]
4. Continuous target value real-time prediction for improved LightGBM method
The control-signal input of the exoskeleton control system directly influences the fluency of the exoskeleton movement: the higher the input frequency of the control signal and the more continuous the values input at a single time, the more natural the motion of the exoskeleton's actuators. When gait is predicted with the known LightGBM algorithm, only one joint-angle change can be predicted at a time, and the exoskeleton actuator stalls because of the time the prediction algorithm needs to compute its result and the delay of the control signal reaching the actuator. The conventional LightGBM prediction model therefore cannot meet the real-time prediction requirement of an actual exoskeleton, so the LightGBM-based gait prediction algorithm needs to be improved accordingly to better suit exoskeleton operation.
4.1 predictive model training
The left-hip flexion-extension angle vector obtained in step 3 is used as the training set for the prediction models of the left-hip flexion-extension angle;
first, a sliding window extracts angle values from α_left to build the left-hip feature matrix A_α,left used to train the prediction models, see formula (4.1):

A_α,left: row j = [α_left(1+(j−1)·pLen), α_left(2+(j−1)·pLen), …, α_left(w+(j−1)·pLen)], j = 1, 2, …, N    (4.1)

where w is the window width of the sliding window, N is the number of sliding windows, and the prediction length pLen (pLen > 1) is taken as the sliding step; to guarantee that the target matrix can be built, K − [(N−1)·pLen + w] ≥ pLen must hold, i.e. α_left must end with enough angle values to construct predicted values over pLen prediction lengths;
then, from the prediction length and the flexion-extension angle values of α_left at the future times of each sliding window, the matrix D_α,left is constructed as the prediction-target matrix for training, see formula (4.2):

D_α,left = [d_1, d_2, …, d_pLen], with d_i = [α_left(w+(j−1)·pLen+i)], j = 1, 2, …, N    (4.2)

where d_i is the target-value vector corresponding to the i-th future time of the left-hip flexion-extension angle;
substituting the left-hip feature matrix and target matrix built by formulas (4.1) and (4.2) into formula (4.3) yields pLen prediction models:

F_i = T(A_α,left, d_i), i = 1, 2, …, pLen    (4.3)

where T(·) is the LightGBM training function and F_i is the trained prediction model (function) corresponding to the i-th future time of the left-hip flexion-extension angle;
4.2 Parallel prediction with the prediction models
Let the feature vector in the current, most recent sliding window of the left hip be x = [α_left(K−w+1), …, α_left(K)]; connecting the pLen trained prediction models in a parallel structure realizes the prediction of the left-hip flexion-extension angle at the pLen future times:

p_i = F_i(x), i = 1, 2, …, pLen    (4.4)

where p_i is the predicted left-hip flexion-extension angle at the i-th future time; from formula (4.4) the left-hip flexion-extension angle prediction vector P_left is obtained:

P_left = [p_1, p_2, …, p_pLen]
4.3 In the same way as steps 4.1 and 4.2, the right-hip flexion-extension-angle parallel prediction model is constructed.
Model input: the feature matrix A_α,right and target matrix D_α,right of the right hip, built from α_right in the same way as formulas (4.1) and (4.2).
By analogy with formula (4.3), pLen prediction models are obtained:

F_i^αR = T(A_α,right, d_i^αR), i = 1, 2, …, pLen

With x^αR = [α_right(K−w+1), …, α_right(K)] the feature vector in the current, most recent sliding window of the right hip, connecting the pLen trained prediction models in a parallel structure realizes the prediction of the right-hip flexion-extension angle at the pLen future times:

p_i^R = F_i^αR(x^αR), i = 1, 2, …, pLen    (4.5)

where p_i^R is the predicted right-hip flexion-extension angle at the i-th future time; from formula (4.5) the right-hip flexion-extension angle prediction vector is obtained:

P_right = [p_1^R, p_2^R, …, p_pLen^R]
Similarly, the left-knee flexion-extension-angle parallel prediction model is constructed.
Model input: the feature matrix A_β,left and target matrix D_β,left of the left knee, built from β_left in the same way as formulas (4.1) and (4.2).
By analogy with formula (4.3), pLen prediction models are obtained:

F_i^βL = T(A_β,left, d_i^βL), i = 1, 2, …, pLen

With x^βL = [β_left(K−w+1), …, β_left(K)] the feature vector in the current, most recent sliding window of the left knee, connecting the pLen trained prediction models in a parallel structure realizes the prediction of the left-knee flexion-extension angle at the pLen future times:

q_i^L = F_i^βL(x^βL), i = 1, 2, …, pLen    (4.6)

where q_i^L is the predicted left-knee flexion-extension angle at the i-th future time; from formula (4.6) the left-knee flexion-extension angle prediction vector is obtained:

Q_left = [q_1^L, q_2^L, …, q_pLen^L]
Similarly, the right-knee flexion-extension-angle parallel prediction model is constructed.
Model input: the feature matrix A_β,right and target matrix D_β,right of the right knee, built from β_right in the same way as formulas (4.1) and (4.2).
By analogy with formula (4.3), pLen prediction models are obtained:

F_i^βR = T(A_β,right, d_i^βR), i = 1, 2, …, pLen

With x^βR = [β_right(K−w+1), …, β_right(K)] the feature vector in the current, most recent sliding window of the right knee, connecting the pLen trained prediction models in a parallel structure realizes the prediction of the right-knee flexion-extension angle at the pLen future times:

q_i^R = F_i^βR(x^βR), i = 1, 2, …, pLen    (4.7)

where q_i^R is the predicted right-knee flexion-extension angle at the i-th future time; from formula (4.7) the right-knee flexion-extension angle prediction vector is obtained:

Q_right = [q_1^R, q_2^R, …, q_pLen^R]
When gait data are applied to a machine-learning algorithm, the sliding window built over the data set usually advances by a step of 1; to realize continuous prediction of the target value, the prediction step set here must be greater than 1.
Fig. 3 shows histograms comparing the three algorithms. Compared with the Gradient-Boosting-based XGBoost and LightGBM, the gait-prediction RMSE of Kalman filtering is higher but its SC is lower: its prediction accuracy is low but its output is smoother, and the Kalman filtering prediction algorithm requires no training. Between the Gradient-Boosting-based XGBoost and LightGBM, the RMSE of the predictions is roughly the same overall, but the SC of LightGBM's predictions is smaller than XGBoost's, and LightGBM's training time is significantly shorter than XGBoost's for the same training-set size. In practical application, LightGBM can therefore train a well-performing prediction model more quickly.
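For reference, the RMSE of a predicted versus measured joint-angle sequence can be computed as below; the SC (smoothness) metric is not defined in this text, so only RMSE is shown:

```python
import numpy as np

def rmse(pred, actual):
    """Root-mean-square error between predicted and measured joint-angle
    sequences, the accuracy metric compared in fig. 3."""
    pred = np.asarray(pred, dtype=float)
    actual = np.asarray(actual, dtype=float)
    return float(np.sqrt(np.mean((pred - actual) ** 2)))

print(rmse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]))  # ≈ 1.1547
```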
Validation of the proposed prediction model is shown in fig. 4.
Step (5), the actuators execute the prediction results
The hip and knee flexion-extension angles of the left and right lower limbs obtained in step 4 form the predicted trajectory; the left and right lower-limb exoskeleton actuators convert the predicted flexion-extension-angle trajectories into corresponding control signals by known techniques and then drive the motors at the corresponding joints, realizing the gait walking of the exoskeleton.

Claims (1)

1. An exoskeleton gait prediction method based on LightGBM, characterized in that the method comprises the following steps:

Step (1), data acquisition stage
Five IMU sensors are placed on the user's waist, left thigh, left calf, right thigh, and right calf respectively; the x- and y-axis acceleration values of the five IMU sensors are acquired in real time;

Step (2), data analysis, denoising, and smoothing
From the x- and y-axis acceleration values of the five IMU sensors in step (1), a matrix composed of 10×K acceleration components is constructed, defined as:

R = [V1, V2, …, Vi, …, V10]^T, where T denotes the matrix transpose,

in which Vi = [vi(1), vi(2), …, vi(K)], vi(t) denotes the value of the i-th acceleration component at time t, and K is the total number of data sampling points;

a moving-average filter is applied to Vi for denoising; letting the filtered value of vi(t) be v̄i(t), then:

v̄i(t) = (1/L) · Σ_{m=−M}^{M} vi(t+m)    (2.1)

where L is the number of original acceleration values taken by the moving-average filter, L is odd, and M = (L−1)/2; applying formula (2.1) finally yields the denoised data of Vi at all times t (t = 1, 2, …, K): V̄i = [v̄i(1), v̄i(2), …, v̄i(K)];

Step (3), obtaining the real-time flexion/extension angles from the gait data
From the denoised data of step (2), and taking the left lower limb as an example, the hip flexion/extension angle α(t) and the knee flexion/extension angle β(t) of the left lower limb at a time t are given by formulas (3.1) and (3.2) respectively (these formulas appear only as images in the source), where ax1, ay1 are the acceleration components of the left-thigh IMU sensor at time t after step (2), and ax2, ay2 are the acceleration components of the left-calf IMU sensor at time t after step (2);

finally, the flexion/extension angle vectors of the left hip and knee over all K moments are obtained:

α_left = [α(1), α(2), …, α(K)],  β_left = [β(1), β(2), …, β(K)];

the hip and knee flexion/extension angle vectors of the right lower limb are obtained in the same way;

Step (4), real-time prediction of continuous target values with the improved LightGBM method
Taking the left hip flexion/extension angle α_left as an example:

4.1 Prediction-model training
The left hip flexion/extension angle vector obtained in step (3) is used as the training set to train the prediction model of the left hip flexion/extension angle;

first, a sliding window extracts angle values from α_left to build the left hip flexion/extension angle matrix A_α_left, used as the feature matrix for training the prediction model (formula (4.1), shown as an image in the source): its n-th row (n = 1, …, N) is the window [α((n−1)·pLen+1), …, α((n−1)·pLen+w)], where w is the window width, N is the number of sliding windows, and the prediction length pLen (pLen > 1) is used as the window step; to guarantee that the target matrix can be built, K − [(N−1)×pLen + w] ≥ pLen must hold, i.e. enough angle values must remain at the end of α_left to form predicted values over pLen prediction steps;

then, from the prediction length and the angle values of α_left at future moments relative to the current sliding window, the matrix D_α_left = [d1, d2, …, d_pLen] is built as the prediction-target matrix for training (formula (4.2), shown as an image in the source), where d_i is the target-value vector of the left hip flexion/extension angle for the i-th future moment;

substituting the feature matrix of the left hip built by formulas (4.1) and (4.2) into formula (4.3) yields pLen prediction models:

f_i = T(A_α_left, d_i), i = 1, 2, …, pLen    (4.3)

where T() is the LightGBM training function and f_i is the trained prediction model (function) of the left hip flexion/extension angle for the i-th future moment;

4.2 Parallel prediction with the trained models
Let x_new be the feature vector in the current latest sliding window of the left hip; connecting the pLen trained prediction models in a parallel structure realizes the prediction of the left hip flexion/extension angle at the next pLen moments:

p_i = f_i(x_new), i = 1, 2, …, pLen    (4.4)

where p_i is the predicted left hip flexion/extension angle at the i-th future moment; from formula (4.4), the left hip flexion/extension angle prediction vector P_left is obtained:

P_left = [p_1, p_2, …, p_pLen];

the parallel prediction models of the flexion/extension angles of the left knee, the right hip, and the right knee are obtained in the same way;

Step (5), the gait prediction trajectory is realized from the prediction vectors of step (4).
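The sliding-window construction of formulas (4.1) and (4.2), and the parallel per-horizon models of formula (4.3), can be sketched as follows. This is a minimal stand-alone sketch: `build_windows` and its variable names are mine, and the actual LightGBM fit is only indicated in a comment rather than executed.

```python
def build_windows(angles, w, p_len):
    """Build feature matrix A (formula 4.1) and target matrix D (formula 4.2):
    each row of A is a window of w past angle values, stepped by p_len; the
    matching row of D holds the p_len future angle values after that window."""
    A, D = [], []
    start = 0
    while start + w + p_len <= len(angles):
        A.append(angles[start:start + w])
        D.append(angles[start + w:start + w + p_len])
        start += p_len
    return A, D

# In the patent, one LightGBM regressor is trained per future step i
# (pLen models in parallel), e.g. lightgbm.LGBMRegressor().fit(A, [row[i] for row in D]).
angles = list(range(20))          # stand-in for a hip flexion/extension series
A, D = build_windows(angles, w=6, p_len=3)
```

With K = 20, w = 6, pLen = 3 this yields N = 4 windows, and the claim's constraint K − [(N−1)×pLen + w] ≥ pLen holds (20 − 15 = 5 ≥ 3).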
CN201911384974.3A 2019-12-28 2019-12-28 Exoskeleton gait prediction method based on LightGBM Active CN111096830B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911384974.3A CN111096830B (en) 2019-12-28 2019-12-28 Exoskeleton gait prediction method based on LightGBM


Publications (2)

Publication Number Publication Date
CN111096830A true CN111096830A (en) 2020-05-05
CN111096830B CN111096830B (en) 2021-11-30

Family

ID=70424063


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112114665A (en) * 2020-08-23 2020-12-22 西北工业大学 Hand tracking method based on multi-mode fusion
CN112535474A (en) * 2020-11-11 2021-03-23 西安交通大学 Lower limb movement joint angle real-time prediction method based on similar rule search
CN113829339A (en) * 2021-08-02 2021-12-24 上海大学 Exoskeleton movement coordination method based on long-time and short-time memory network
CN115294653A (en) * 2022-08-10 2022-11-04 电子科技大学 Lower limb exoskeleton gait prediction method based on Gaussian process regression
CN119723666A (en) * 2024-12-17 2025-03-28 南京邮电大学 A gait feature detection method for human hip arthritis based on gait analysis and posture estimation
CN120578884A (en) * 2025-05-15 2025-09-02 国网湖北省电力有限公司超高压公司 Exoskeleton trajectory prediction method, system and device based on motion intention recognition

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5007938A (en) * 1989-07-08 1991-04-16 Ipos Gmbh & Co. Kg Artificial foot for a leg prosthesis
US20040167638A1 (en) * 2002-11-01 2004-08-26 Caspers Carl A. Pressure/temperature monitoring device for prosthetics
US20060276393A1 (en) * 2005-01-13 2006-12-07 Sirtris Pharmaceuticals, Inc. Novel compositions for preventing and treating neurodegenerative and blood coagulation disorders
CN101036601A (en) * 2007-04-24 2007-09-19 杭州电子科技大学 Real time control device and control method by two-degrees-of freedom myoelectricity artificial hand
US20080091087A1 (en) * 2006-07-12 2008-04-17 Neuhauser Alan R Methods and systems for compliance confirmation and incentives
CN101579238A (en) * 2009-06-15 2009-11-18 吴健康 Human motion capture three dimensional playback system and method thereof
US20110125291A1 (en) * 2006-12-08 2011-05-26 Hanger Orthopedic Group Inc. Prosthetic device and connecting system using a vacuum
US20110213466A1 (en) * 2009-08-27 2011-09-01 The Foundry Llc Method and Apparatus for Force Redistribution in Articular Joints
CN103637840A (en) * 2005-08-23 2014-03-19 史密夫和内修有限公司 Telemetric orthopaedic implant
EP2825134A1 (en) * 2012-03-14 2015-01-21 Vanderbilt University System and method for providing biomechanically suitable running gait in powered lower limb devices
US20150148423A1 (en) * 2012-04-26 2015-05-28 Sentient Lifesciences, Inc. Use of n-acetylcysteine amide in the treatment of disease and injury
US20150313728A1 (en) * 2008-04-30 2015-11-05 Rizzoli Ortopedia S.P.A. Automatic prosthesis for above-knee amputees
CN109464193A (en) * 2018-12-27 2019-03-15 北京爱康宜诚医疗器材有限公司 Data prediction method, device and system
US20190117415A1 (en) * 2008-09-04 2019-04-25 Bionx Medical Technologies, Inc. Hybrid terrain-adaptive lower-extremity systems


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
N. SHIOZAWA: "Virtual walkway system and prediction of gait mode transition for the control of the gait simulator", The 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society *
YUHANG YE: "Optimal Feature Selection for EMG-Based Finger Force Estimation Using LightGBM Model", 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) *
SONG GUOMING: "Research and Application of Gait Perception Prediction and Control Methods for Lower-Limb Exoskeletons", China Master's Theses Full-text Database *




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant