US20170243354A1 - Automatic frontal-view gait segmentation for abnormal gait quantification - Google Patents
- Publication number: US20170243354A1 (Application US15/283,603)
- Authority: US (United States)
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T7/0042
- A61B5/7235—Details of waveform analysis
- A61B5/112—Gait analysis
- A61B5/1128—Measuring movement of the entire body or parts thereof using image analysis
- A61B5/117—Identification of persons
- A61B5/7275—Determining trends in physiological measurement data; predicting development of a medical condition based on physiological measurements
- G06K9/00342
- G06T7/11—Region-based segmentation
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/285—Analysis of motion using a sequence of stereo image pairs
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
- G06V40/23—Recognition of whole body movements, e.g. for sport training
- G06V40/25—Recognition of walking or running movements, e.g. gait recognition
- G06T2207/10004—Still image; photographic image
- G06T2207/20164—Salient point detection; corner detection
- G06T2207/30196—Human being; person
- G06V10/62—Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; pattern tracking
Definitions
- Human gait constitutes an essential metric related to a person's health and well-being. Degradation of a person's walking pattern decreases quality of life for the individual and may result in falls and injuries. In one estimate, one out of every three older adults (over the age of 65) falls each year, and the related injuries cost $20 billion per year in the United States.
- There are different types of physiological and anatomical factors that can adversely affect gait, such as neurological maladies (e.g., Parkinson's disease or multiple sclerosis), degradation of the bones, joints or muscles (e.g., lower limb injury or pain), and geriatric diseases (e.g., osteoporosis).
- The common symptoms in these cases include a slow pace, unstable standing, tilted walking, mini-step walking, and altered velocity, stride length and cadence. Therefore, passive monitoring of a person's gait and the detection of deviations from normal patterns can support current frailty assessments, leading to improved and earlier detection of many diseases, or provide valuable information for rehabilitation. Assessment is also important for recuperative efforts. For example, improvement in a person's gait can be monitored and expected when therapeutic actions are taken, such as adjustment of medication, physical therapy, or joint replacement. It is very desirable to enable frequent, objective assessments to continuously track a person's condition, and to predict falls when gait changes significantly over a short period of time.
- Wearable sensors are being developed to add objectivity and to move assessment into a passive (e.g., home) setting, replacing costly, infrequent clinical assessments.
- The various wearable sensor-based systems that have been proposed use sensors located on several parts of the body, such as the feet, knees, thighs or waist. Different types of sensors capture the various signals that characterize human gait.
- Their major disadvantage is the need to place devices on the subject's body, which may be uncomfortable or intrusive.
- Moreover, wearable sensors allow analysis of only a limited number of gait parameters. In addition, the analysis of the signals is computationally complex and prone to excessive noise.
- Marker-based: these methods require the subject to wear easily detectable markers on the body, usually at joint locations.
- The 2D or 3D locations of the markers are extracted in a monocular or multi-camera system.
- The marker locations, or the relationships between them, are then used to segment each stride/step.
- Marker-less: these methods can be divided into two sub-categories: holistic (usually model-free) and model-based.
- In holistic methods, human subjects are usually first detected, tracked and segmented; gait is then characterized by the statistics of the spatiotemporal patterns generated by the silhouette of the walking person. A set of features/gait signatures is then computed from the patterns for segmentation, recognition, etc.
- One approach analyzed the auto correlation signals of the image sequence.
- Another approach used XT & YT slices for gait analysis.
- Model-based methods apply human body/shape or motion models to recover features of gait mechanics and kinematics. The relationship between body parts is then used to segment each stride/step or for other purposes. Models include generative and discriminative models.
- Segmenting the gait cycle precisely is one of the most important steps and building blocks of gait analysis.
- Stride-to-stride measurement of gait signals is essential for disease diagnosing and monitoring, such as Parkinson's. As such diseases usually progress over a long period of time, it is very desirable to enable frequent and objective assessments to continuously understand such patients' ambulatory condition.
- Gait signals can come from wearable devices or camera data.
- Current methods for gait analysis include manual or automatic segmentation based on gait signals such as foot distance or knee angles. Visual inspection of gait from real-time actions or video recordings is subjective and requires a costly trained professional to be present, thereby limiting the frequency at which evaluations can be performed.
- Wearables capture only a portion of gait signal (depending on where the sensors are positioned) and require compliance of a patient to consistently wear the device if day-to-day measurement is to be taken.
- Current computer vision techniques can be categorized into marker-based and marker-less approaches. Similar to wearables, marker-based technologies require precise positioning of markers on subjects, which is not feasible for day-to-day monitoring.
- Monocular marker-less technologies often require identifying human body parts first, which is very challenging due to variations in viewing angle and appearance. Hence, the current monocular marker-less method is usually performed in clinical settings where the viewing angle and camera-to-subject distance are fixed, and the method may not be robust enough in an assisted living or traditional home setting.
- Monocular marker-less technologies are often performed in a clinical side-view, open space setting, where lateral views are possible. Lateral views may not be readily obtainable in an assisted living or traditional home setting.
- A computer-implemented method for gait analysis of a subject comprises: obtaining visual data from an image capture device positioned in front of or behind the subject, the visual data comprising at least two image frames of the subject over a period of time walking toward or away from the image capture device, the at least two image frames capturing at least a portion of the gait of the subject; detecting within the at least two image frames body parts, including at least one joint, as two-dimensional landmarks using a pose estimation algorithm on each of the at least two frames; generating a joint model depicting the location of the at least one joint in each of the at least two frames; using the joint model to segment a gait cycle for the at least one joint; and comparing the gait cycle to a threshold value to detect abnormal gait.
- the method can further comprise, prior to generating the joint model, estimating a three-dimensional shape of the subject using the two-dimensional landmarks, and estimating the at least one joint location based on the three-dimensional shape.
- the joint model can include a deformable parts model.
- the at least one joint can include an ankle, a knee, a hip, or other joint.
- the gait cycle can include a distance between two consecutive peaks in a trajectory of a joint.
- the gait cycle can include a distance between consecutive peaks in an angle of a joint or body part.
- the obtaining visual data from an image capture device can include using a camera mounted in an elongated hallway in which the subject can walk toward and away from the camera.
- A system for gait analysis of a subject comprises an image capture device operatively coupled to a data processing device and positioned in front of or behind the subject, the image capture device configured to capture visual data comprising at least two image frames of the subject over a period of time walking toward or away from the image capture device, the at least two image frames capturing at least a portion of the gait of the subject, and a processor-usable medium embodying computer code, said processor-usable medium being coupled to said data processing device, said computer code comprising instructions executable by said data processing device and configured for: detecting within the at least two image frames body parts, including at least one joint, as two-dimensional landmarks using a pose estimation algorithm on each of the at least two frames; generating a joint model depicting the location of the at least one joint in each of the at least two frames; using the joint model to segment a gait cycle for the at least one joint; and comparing the gait cycle to a threshold value to detect abnormal gait.
- the instructions can further comprise, prior to generating the joint model, estimating a three-dimensional shape of the subject using the two-dimensional landmarks, and estimating the at least one joint location based on the three-dimensional shape.
- the joint model can include a deformable parts model.
- the at least one joint can include an ankle, a knee, a hip, or other joint.
- the gait cycle can include a distance between two consecutive peaks in a trajectory of a joint.
- the gait cycle can include a distance between consecutive peaks in an angle of a joint or body part.
- the image capture device can be mounted in an elongated hallway in which the subject can walk toward and away from the camera.
- A non-transitory computer-usable medium for gait analysis of a subject embodying a computer program code, said computer program code comprising computer executable instructions configured for: obtaining visual data from an image capture device positioned in front of or behind the subject, the visual data comprising at least two image frames of the subject over a period of time walking toward or away from the image capture device, the at least two image frames capturing at least a portion of the gait of the subject; detecting within the at least two image frames body parts, including at least one joint, as two-dimensional landmarks using a pose estimation algorithm on each of the at least two frames; generating a joint model depicting the location of the at least one joint in each of the at least two frames; using the joint model to segment a gait cycle for the at least one joint; and comparing the gait cycle to a threshold value to detect abnormal gait.
- the instructions can further comprise, prior to generating the joint model, estimating a three-dimensional shape of the subject using the two-dimensional landmarks, and estimating the at least one joint location based on the three-dimensional shape.
- the joint model can include a deformable parts model.
- the at least one joint can include an ankle, a knee, a hip, or other joint.
- the gait cycle can include a distance between two consecutive peaks in a trajectory of a joint or a distance between consecutive peaks in an angle of a joint or body part.
- the obtaining visual data from an image capture device can include using a camera mounted in an elongated hallway in which the subject can walk toward and away from the camera.
- FIG. 1 is a flowchart of an exemplary method in accordance with the present disclosure;
- FIG. 2 is a schematic block diagram of an exemplary system in accordance with the present disclosure;
- FIG. 3A illustrates a series of images with detected body parts superimposed thereon for a first subject condition;
- FIG. 3B illustrates a series of images with detected body parts superimposed thereon for a second subject condition;
- FIG. 4A is a series of 2D images and 3D shapes estimated therefrom represented as a linear combination of rotatable basis shapes for a first field of depth within the same field of view;
- FIG. 4B is a series of 2D images and 3D shapes estimated therefrom represented as a linear combination of rotatable basis shapes for a second field of depth within the same field of view;
- FIG. 5 graphically illustrates calculated features from reconstructed 3D model of DPM landmarks (DPM3D) compared with the features of a 3D model built based on manually annotated joints (GT3D);
- FIG. 6A graphically illustrates a comparison between two main features displayed in FIG. 5 (foot distance and knee angle), for a first experiment (a), the features calculated from reconstructed 3D model of DPM landmarks (DPM3D) compared with the features of a 3D model built based on manually annotated joints (GT3D);
- FIG. 6B graphically illustrates a comparison between two main features displayed in FIG. 5 (foot distance and knee angle), for a second experiment (b), the features calculated from reconstructed 3D model of DPM landmarks (DPM3D) compared with the features of a 3D model built based on manually annotated joints (GT3D); and,
- FIG. 7 graphically illustrates the variation of stride duration of the subject for the different conditions of FIGS. 6A and 6B.
- the present disclosure sets forth systems and methods for performing an objective evaluation of different gait parameters by applying computer vision techniques that can use existing monitoring systems without substantial additional cost or equipment. Aspects of the present disclosure can perform assessment during a user's daily activity without the requirement to wear a device (e.g., a sensor or the like) or special clothing (e.g., uniform with distinct marks on certain joints of the person). Computer vision in accordance with the present disclosure can allow simultaneous, in-depth analysis of a higher number of parameters than current wearable systems. Unlike approaches utilizing wearable sensors, the present disclosure is not restricted or limited by power consumption requirements of sensors. The present disclosure can provide a consistent, objective measurement of gait parameters, which reduces error and variability incurred by subjective techniques. To achieve these goals, a body and gait representation is generated that can provide gait characteristics of an individual, while applying generally to classification and quantification of gait across individuals.
- The present disclosure sets forth an approach to frontal-view gait abnormality detection that can be performed with a single, non-calibrated camera and extracts unique signatures from descriptors of the body's deformation. As in a real-life scenario, the subjects walk in a hallway toward or away from a camera.
- Aspects of the method can include: 1) detection of body parts as 2D landmarks by employing a pose estimation algorithm in each frame; 2) refining of the joint locations; 3) estimation of the 3D shape of each subject, given the set of detected 2D landmarks; 4) calculation, using the estimated 3D joint positions, of the variation of different features, such as knee angle or the distance between the right and left feet; 5) extraction of multiple gait cycles from each sequence of features by detecting consecutive peaks in the aforementioned signals; and 6) use of stride length, stride duration and average amplitude of knee angle as features for quantification of gait status.
- aspects of the present disclosure are aimed at passive assessment of health conditions for settings, such as in-home and assisted living.
- the technologies can also apply to a clinical setting.
- the system and method are directed to quantification from common walking settings, such as a hallway, where frontal views are available. The quantification can be used to understand the current state or progression of a degenerative condition or the recuperation after a medical procedure, such as knee replacement.
- the methods and systems described below further address the problem of segmenting gait cycles from video in the natural setting where the subject moves toward or away from the camera.
- the present method addresses this important imaging setting because it allows monitoring in real life home or assisted living settings where cameras can be mounted in hallways and multiple step cycles can be observed.
- Existing lateral-view methods are not well suited for this imaging condition.
- a single frontal view point is used and a special pose or camera calibration is not used, which differentiates aspects of the present disclosure from existing technology.
- the method accommodates the change of scale as the individual walks toward (or away from) the camera.
- the gait cycles are extracted from descriptors of the body's deformation.
- FIG. 1 illustrates an exemplary process 2 in accordance with an aspect of the present disclosure.
- the exemplary method begins in step 10 wherein images of one or more subjects are acquired. This is typically performed by recording video or capturing multiple still images. It should be appreciated that step 10 includes acquiring a series of frames per gait cycle. While more frames per gait cycle can provide more information on details of movement within a cycle, at least two frames per cycle are needed to quantify the duration of a stride.
- In step 12, detection of body parts as 2D landmarks is performed by employing a pose estimation algorithm in each frame.
- In step 14, the 2D joint locations are refined.
- In step 16, estimation of the 3D shape of each subject, given the set of refined 2D joint locations, is performed.
- In step 18, using the estimated 3D joint positions, the variation of different features, such as knee angle or the distance between the right and left feet, is calculated.
- In step 20, extraction of multiple gait cycles from each sequence of features is performed by detecting consecutive peaks in the aforementioned signals.
- In step 22, gait status quantification is performed using stride length, duration and average amplitude of knee angle as features.
- an exemplary system 110 in accordance with the present disclosure is illustrated in block diagram form in connection with a patient space 122 such as a hallway, waiting room, or the like.
- It will be appreciated that the patient space 122 is exemplary, and that the system 110 can be implemented in virtually any location or setting (e.g., public or private spaces, etc.) provided suitable images of a subject approaching and/or departing can be obtained.
- In the illustrated embodiment, a plurality of cameras C1, C2 and C3 are positioned at different locations within the patient space 122. However, any number of cameras can be utilized.
- The cameras C1, C2 and C3 are connected to a computer 130 and supply visual data comprising one or more image frames thereto via a communication interface 132.
- the computer 130 can be a standalone unit configured specifically to perform the tasks associated with the aspects of this disclosure. In other embodiments, aspects of the disclosure can be integrated into existing systems, computers, etc.
- the communication interface 132 can be a wireless or wired communication interface depending on the application.
- the computer 130 further includes a central processing unit 136 coupled with a memory 138 .
- Stored in the memory 138 are various modules including an image acquisition module 140 , a gait analysis module 142 , and a gait segmentation module 144 .
- Visual data received from the cameras C1, C2 and C3 can be stored in memory 138 for processing by the CPU 136 in accordance with this disclosure. It will further be appreciated that the various modules can be configured to carry out the functions described in detail in the following paragraphs.
- a pose estimation algorithm is employed at step 12 to find the approximate position of joints in each frame of the video. This is accomplished, for example, using the Flexible Part Model for each frame independently.
- The method focuses on the torso and lower limbs, so the model consists of eighteen (18) parts total, with basic parts including the head, neck, shoulders, waist, hips, knees and ankles.
- The number of shape mixtures per part varies and was estimated using hierarchical clustering: for the lower limbs, five (5) mixtures were employed; for the shoulders, two (2); for the head and neck, three (3); and for the rest of the joints, one (1).
- the N-best pose solution is found per frame using the following method.
- Given a score configuration as in Eq. (1), where z_i is the location of part i, with a local part score φ(z_i) and a pairwise deformation model ψ(z_i, z_j), one can find the best configuration by backtracking from the root location with the highest score.
- N candidate poses are generated, and for a particular pose the score in Eq. (2) is maximized, where Local(k_t) is the score of the candidate pose computed by Eq. (1).
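The equations referenced above are not reproduced in this text. A plausible reconstruction from the symbol definitions given, consistent with standard tree-structured part models, is sketched below; the exact form, and the diversity penalty Δ in Eq. (2) in particular, are assumptions rather than the disclosure's own formulas.

```latex
% Eq. (1): score of a part configuration z = (z_1, \dots, z_K) over a
% tree with parts V and edges E: local appearance terms plus pairwise
% deformation terms; the best z is found by dynamic programming,
% backtracking from the root with the highest score.
S(z) = \sum_{i \in V} \phi(z_i) \;+\; \sum_{(i,j) \in E} \psi(z_i, z_j)

% Eq. (2): N-best decoding; the t-th candidate pose maximizes its local
% score from Eq. (1), penalized by overlap with poses already selected.
k_t = \arg\max_{k} \Big[ \mathrm{Local}(k) \;-\; \lambda\,
      \Delta(k;\, k_1, \dots, k_{t-1}) \Big]
```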
- The part detection module provides an estimation of the selected parts in the form of bounding boxes. Then, using this estimation, accurate locations of the joints are found.
- The corresponding landmarks are found in the 2D image using a set of regression models based on part locations estimated from a Deformable Part Model (DPM), for example.
- The regression model is trained for the x and y positions of each landmark separately, given the location of the detected bounding box of the corresponding landmark.
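As a concrete illustration, the per-coordinate refinement can be sketched as a simple linear least-squares regression mapping a DPM part-box centre coordinate to a refined joint coordinate (shown for x; y is analogous). The disclosure does not specify the form of the regressor, so linear regression and the synthetic training pairs below are assumptions.

```python
# Sketch: one linear regression per landmark coordinate, fitted from
# (box centre, annotated joint) training pairs. The relation used to
# generate the toy data (true joint = 0.9 * centre + 4) is hypothetical.

def fit_linear(xs, ys):
    """Closed-form least squares for y ≈ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

centres_x = [10.0, 20.0, 35.0, 50.0, 80.0]     # DPM box centres (toy)
true_x = [0.9 * c + 4.0 for c in centres_x]    # annotated joints (toy)
a, b = fit_linear(centres_x, true_x)
print(round(a, 6), round(b, 6))   # recovers 0.9 4.0

refined = a * 40.0 + b            # refine a new detection at x = 40
print(round(refined, 6))          # 40.0
```

At inference time the same affine map is simply applied to each new box centre; training one such model per landmark and per coordinate mirrors the separate x/y training described above.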
- FIGS. 3A and 3B show examples of DPM overlays for frontal views of two individuals walking along a hall.
- An optional method that has the potential to improve feature extraction uses 3D shape reconstruction at step 16 .
- a convex formulation is applied to reconstruct the 3D shape of each subject given a set of 2D landmarks in each frame.
- the method employs a shape-space model, where a 3D shape is represented as a linear combination of rotatable basis shapes.
- Equation (3) shows that the estimated shape S is a linear combination of k basis shapes B_i learned from training data, each rotated by a rotation matrix R_i and scaled by c_i.
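Equation (3) itself is not reproduced in this text; written out from the description above, the shape-space model is (a reconstruction, with the convex relaxation of the c_i R_i products assumed rather than stated):

```latex
% Eq. (3): the estimated 3D shape S is a linear combination of k basis
% shapes B_i learned from training data, each rotated by a rotation
% matrix R_i and scaled by a coefficient c_i.
S = \sum_{i=1}^{k} c_i R_i B_i
```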
- the model is trained based on, for example, seven subjects from a dataset, such as the Carnegie Mellon University Motion Capture Database (CMU MoCap dataset). The selected subjects perform different activities such as jumping, boxing, running as well as walking.
- Examples of 3D estimated shapes generated from 2D image frames are illustrated in FIGS. 4A and 4B for the same subject at different fields of depth.
- In step 18, features are extracted from the 3D joint positions estimated from the 3D shape.
- The estimated features include both 3D-space features, such as the distance between the right and left knees, the distance between the right and left feet, and the variation of the left knee angle (the affected leg), defined as the angle between the knee-hip and knee-ankle segments, as well as 2D-space features, such as the oscillation of the head.
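For instance, two of the features named above can be computed directly from estimated 3D joint positions. The helpers below and their coordinates are an illustrative sketch, not taken from the disclosure.

```python
import math

# Sketch: knee angle (angle between the knee-hip and knee-ankle
# segments) and foot distance, from 3D joint positions. All
# coordinates here are synthetic examples.

def knee_angle_deg(hip, knee, ankle):
    """Angle at the knee between the knee-hip and knee-ankle segments."""
    u = [h - k for h, k in zip(hip, knee)]
    v = [a - k for a, k in zip(ankle, knee)]
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    cos_a = max(-1.0, min(1.0, dot / (nu * nv)))   # clamp for safety
    return math.degrees(math.acos(cos_a))

def foot_distance(left_foot, right_foot):
    """Euclidean distance between the two feet."""
    return math.dist(left_foot, right_foot)

# A fully extended leg (hip, knee, ankle collinear) gives 180 degrees.
print(knee_angle_deg([0, 1, 0], [0, 0.5, 0], [0, 0, 0]))   # 180.0
print(foot_distance([0, 0, 0], [3, 0, 4]))                 # 5.0
```

Evaluating these per frame over a walking sequence yields the periodic signals whose peaks drive the gait-cycle segmentation described below.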
- Some of these features are displayed in FIG. 5 for a sample sequence. The selected sequence depicts a pattern for normal walking.
- The features calculated from the reconstructed 3D model of DPM landmarks have been compared with the features of a 3D model built based on manually annotated joints (GT3D).
- Gait cycle segmentation is performed in step 20 .
- Each extracted feature represents the gait cycle in a slightly different way.
- For the knee angle, one cycle is the distance between two consecutive peaks in the trajectory created by the whole sequence, while for foot distance the distance between two adjacent peaks defines a stride.
- From the segmented cycles, a set of metrics can be estimated, such as stride duration, stride length, and cadence, which have demonstrated clinically significant differences for various diseases or injuries. For example, some research indicates that stride length decreases with progression of Parkinson's disease while stride duration (time) tends not to decrease.
- the estimated set of metrics along with other features can then be employed for abnormal gait quantification depending on the application.
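The peak-based segmentation and one of the resulting metrics can be sketched as follows. The foot-distance trace here is synthetic, and the minimal local-maximum test stands in for a more robust peak detector (e.g., one with prominence and minimum-distance constraints); neither is specified by the disclosure.

```python
import math

# Sketch: segment strides by locating consecutive peaks in a gait
# signal and derive stride duration from the frame gap between peaks.

def peak_frames(signal):
    """Indices of simple local maxima (strictly rising into the peak)."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i - 1] < signal[i] >= signal[i + 1]]

def stride_durations_s(signal, fps):
    """Seconds between consecutive peaks, one value per stride."""
    peaks = peak_frames(signal)
    return [(b - a) / fps for a, b in zip(peaks, peaks[1:])]

# Synthetic foot-distance signal: one oscillation per stride, period
# 30 frames. At 30 fps every stride then lasts exactly one second.
foot_distance = [abs(math.sin(math.pi * i / 30)) for i in range(120)]
print(peak_frames(foot_distance))               # [15, 45, 75, 105]
print(stride_durations_s(foot_distance, 30.0))  # [1.0, 1.0, 1.0]
```

Comparing such per-stride values against a threshold, or tracking their drift over days, is one way the abnormal-gait quantification described above could consume these metrics.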
- FIGS. 6A and 6B display a comparison between the two main features shown in FIG. 5 (foot distance and knee angle) for two different conditions, (a) and (b), of the same subject.
- The change in stride duration (horizontal axis) caused by increasing the weights is clearly evident.
- The results are summarized in FIG. 7 as the variation of stride duration of the same subject.
- FIGS. 6 and 7 illustrate but one example of the manner in which aspects of the present disclosure can be used to analyze gait characteristics.
Abstract
A computer-implemented method for gait analysis of a subject includes obtaining visual data from an image capture device positioned in front of or behind the subject, the visual data comprising at least two image frames of the subject over a period of time walking toward or away from the image capture device, the at least two image frames capturing at least a portion of the gait of the subject, detecting within the at least two image frames body parts, including at least one joint, as two-dimensional landmarks using a pose estimation algorithm on each of the at least two frames, generating a joint model depicting the location of the at least one joint in each of the at least two frames, using the joint model to segment a gait cycle for the at least one joint, and comparing the gait cycle to a threshold value to detect abnormal gait.
Description
- This application claims priority to and the benefit of the filing date of U.S. Provisional Patent Application Ser. No. 62/297,341, filed Feb. 19, 2016, which application is hereby incorporated by reference.
- Human gait, a biometric aimed at recognizing individuals by the way they walk, has recently come to play an increasingly important role in applications such as access control and visual surveillance. Although no two body movements are ever the same, gait is a characteristic of an individual, analogous to other biometrics. Psychological, medical, and biomechanical studies support the notion that humans effortlessly recognize people by the way they walk, and that basic gait patterns are unique to each individual. In contrast to many established biometric modalities, such as face, fingerprint or retina, gait can be analyzed from a distance and can be observed without notification to, or compliance by, the subject. In fact, the considerable attention towards this biometric has been due to its ability to ascertain somebody's identity at a distance while being noninvasive and non-perceivable.
- However, human gait analysis and assessment involves challenging issues due to the highly flexible structure and self-occlusion of the human body. These issues mandate using complicated processes for the measurement and analysis of gait in marker-less video sequences. For instance, footwear, physical conditions such as pregnancy, leg or foot injuries, or even drunkenness can change the manner of walking. Like most biometrics, gait will inherently change with age. Therefore, gait can disclose more than identity. As there are numerous applications for the detection of abnormal gait, it seems worthwhile to explore techniques that can accomplish this goal.
- Human gait constitutes an essential metric related to a person's health and well-being. Degradation of a person's walking pattern decreases quality of life for the individual and may result in falls and injuries. In one estimate, one out of every three older adults (over the age of 65) falls each year, and the related injuries cost $20 billion per year in the United States. There are different types of physiological and anatomical factors that can adversely affect gait, such as neurological maladies (e.g., Parkinson's disease or multiple sclerosis), degradation of the bones, joints or muscles, lower limb injury or pain, and geriatric diseases, such as osteoporosis, which affect a large percentage of the population. The common symptoms for these cases include moving at a slow pace, unstable standing, tilted walking, mini-step walking, and altered velocity, stride length and cadence. Therefore, passive monitoring of a person's gait and the detection of deviations from normal patterns can support current frailty assessments, leading to improved and earlier detection of many diseases, or provide valuable information for rehabilitation. On the other hand, assessment is important for recuperative efforts. For example, improvement in a person's gait can be monitored and expected when therapeutic actions are taken, such as adjustment of medication, physical therapy, and joint replacement. It is very desirable to enable frequent, objective assessments to continuously understand a person's condition as well as perform fall prediction when gait changes significantly over a short period of time.
- The traditional scales used to analyze gait parameters in clinical conditions are semi-subjective, carried out by specialists who observe the quality of a patient's gait by making him/her walk. This is sometimes followed by a survey in which the patient is asked to give a subjective evaluation of the quality of his/her gait. The disadvantage of these methods is that they give subjective measurements, particularly concerning accuracy and precision, which have a negative effect on the diagnosis, follow-up and treatment of the pathologies.
- Wearable sensors are being developed to add objectivity and move the assessment into a passive (e.g., home) setting, rather than costly, infrequent clinical assessments. The various wearable sensor-based systems that have been proposed use sensors located on several parts of the body, such as the feet, knees, thighs or waist. Different types of sensors are used to capture the various signals that characterize human gait. However, their major disadvantage is the need to place devices on the subject's body, which may be uncomfortable or intrusive. Also, the use of wearable sensors allows analysis of only a limited number of gait parameters. In addition, the analysis of the signals is computationally complex and presents the problem of excessive noise.
- Other than wearable or ground sensors, cameras are also used to analyze gait. Prior camera-based approaches have included the following:
- Marker based: this method requires that the subject wear easily detectable markers on the body, usually at joint locations. The 2D or 3D locations of the markers are extracted in a monocular or multi-camera system. The marker locations, or the relationships between them, are then used to segment each stride/step.
- Marker-less: this category of methods can be divided into two sub-categories: holistic (usually model free) and model based. For holistic methods, human subjects are usually first detected, tracked and segmented; then gait is usually characterized by the statistics of the spatiotemporal patterns generated by the silhouette of the walking person. A set of features/gait signatures is then computed from the patterns for segmentation/recognition, etc. One approach analyzed the auto correlation signals of the image sequence. Another approach used XT & YT slices for gait analysis. Model-based methods apply human body/shape or motion models to recover features of gait mechanics and kinematics. The relationship between body parts will be used to segment each stride/step or for other purposes. Models include generative and discriminative models.
- For most gait analysis methods, segmenting the gait cycle precisely is one of the most important steps and building blocks. Stride-to-stride measurement of gait signals is essential for diagnosing and monitoring diseases such as Parkinson's. As such diseases usually progress over a long period of time, it is very desirable to enable frequent and objective assessments to continuously understand such patients' ambulatory condition. Gait signals can come from wearable devices or camera data. Current methods for gait analysis include manual or automatic segmentation based on gait signals such as feet distance or knee angles. Visual inspection of gait from real-time actions or video recordings is subjective and requires a costly trained professional to be present, thereby limiting the frequency at which evaluations can be performed. Wearables capture only a portion of the gait signal (depending on where the sensors are positioned) and require compliance of a patient to consistently wear the device if day-to-day measurement is to be taken. Current computer vision techniques can be categorized into marker-based and marker-less approaches. Similar to wearables, marker-based technologies require precise positioning of markers on subjects, which is not feasible for day-to-day monitoring. Monocular marker-less technologies often require identifying human body parts first, which is very challenging due to variations in viewing angle and appearance. Hence, the current monocular marker-less method is usually performed in clinical settings where the viewing angle and camera-to-subject distance are fixed, and the method may not be robust enough in an assisted living or traditional home setting.
- Marker-based technologies require precise positioning of markers on subjects, which is not feasible in day-to-day monitoring. Monocular marker-less technologies are often performed in a clinical side-view, open space setting, where lateral views are possible. Lateral views may not be readily obtainable in an assisted living or traditional home setting.
- The following references, the disclosures of which are incorporated by reference herein in their entireties, and filed concurrently, are mentioned:
- U.S. application Ser. No. 15/283,629, filed Oct. 3, 2016, by Xu et al., (Attorney Docket No. XERZ 203330US01), entitled “A COMPUTER VISION SYSTEM FOR AMBIENT LONG-TERM GAIT ASSESSMENT”; and, U.S. application Ser. No. 15/283,663, filed Oct. 3, 2016, by Wu et al., (Attorney Docket No. XERZ 203336US01), entitled “SYSTEM AND METHOD FOR AUTOMATIC GAIT CYCLE SEGMENTATION”.
- The following reference, the disclosure of which is incorporated by reference herein in its entirety, is mentioned:
- U.S. application Ser. No. 14/963,602, filed Dec. 9, 2015, by Bernal, et al., (Attorney Docket No. XERZ 203256US01), entitled “COMPUTER-VISION-BASED GROUP IDENTIFICATION”.
- In accordance with one aspect, a computer-implemented method for gait analysis of a subject comprises obtaining visual data from an image capture device positioned in front of or behind the subject, the visual data comprising at least two image frames of the subject over a period of time walking toward or away from the image capture device, the at least two image frames capturing at least a portion of the gait of the subject, detecting within the at least two images body parts as two-dimensional landmarks using a pose estimation algorithm on each of the at least two frames, generating a joint model depicting the location of the at least one joint in each of the at least two frames, using the joint model to segment a gait cycle for the at least one joint, and comparing the gait cycle to a threshold value to detect abnormal gait.
- The method can further comprise, prior to generating the joint model, estimating a three-dimensional shape of the subject using the two-dimensional landmarks, and estimating the at least one joint location based on the three-dimensional shape. The joint model can include a deformable parts model. The at least one joint can include an ankle, a knee, a hip, or other joint. The gait cycle can include a distance between two consecutive peaks in a trajectory of a joint. The gait cycle can include a distance between consecutive peaks in an angle of a joint or body part. The obtaining visual data from an image capture device can include using a camera mounted in an elongated hallway in which the subject can walk toward and away from the camera.
- In accordance with another aspect, a system for gait analysis of a subject comprises an image capture device operatively coupled to a data processing device and positioned in front of or behind the subject, the image capture device configured to capture visual data comprising at least two image frames of the subject over a period of time walking toward or away from the image capture device, the at least two image frames capturing at least a portion of the gait of the subject, a processor-usable medium embodying computer code, said processor-usable medium being coupled to said data processing device, said computer code comprising instructions executable by said data processing device and configured for: detecting within the at least two images body parts as two-dimensional landmarks using a pose estimation algorithm on each of the at least two frames; generating a joint model depicting the location of the at least one joint in each of the at least two frames; using the joint model to segment a gait cycle for the at least one joint; and comparing the gait cycle to a threshold value to detect abnormal gait.
- The instructions can further comprise, prior to generating the joint model, estimating a three-dimensional shape of the subject using the two-dimensional landmarks, and estimating the at least one joint location based on the three-dimensional shape. The joint model can include a deformable parts model. The at least one joint can include an ankle, a knee, a hip, or other joint. The gait cycle can include a distance between two consecutive peaks in a trajectory of a joint. The gait cycle can include a distance between consecutive peaks in an angle of a joint or body part. The image capture device can be mounted in an elongated hallway in which the subject can walk toward and away from the camera.
- In accordance with another aspect, a non-transitory computer-usable medium for gait analysis of a subject is set forth, said computer-usable medium embodying a computer program code, said computer program code comprising computer executable instructions configured for: obtaining visual data from an image capture device positioned in front of or behind the subject, the visual data comprising at least two image frames of the subject over a period of time walking toward or away from the image capture device, the at least two image frames capturing at least a portion of the gait of the subject; detecting within the at least two images body parts as two-dimensional landmarks using a pose estimation algorithm on each of the at least two frames; generating a joint model depicting the location of the at least one joint in each of the at least two frames; using the joint model to segment a gait cycle for the at least one joint; and comparing the gait cycle to a threshold value to detect abnormal gait.
- The instructions can further comprise, prior to generating the joint model, estimating a three-dimensional shape of the subject using the two-dimensional landmarks, and estimating the at least one joint location based on the three-dimensional shape. The joint model can include a deformable parts model. The at least one joint can include an ankle, a knee, a hip, or other joint. The gait cycle can include a distance between two consecutive peaks in a trajectory of a joint or a distance between consecutive peaks in an angle of a joint or body part. The obtaining visual data from an image capture device can include using a camera mounted in an elongated hallway in which the subject can walk toward and away from the camera.
-
FIG. 1 is a flowchart of an exemplary method in accordance with the present disclosure; -
FIG. 2 is a schematic block diagram of an exemplary system in accordance with the present disclosure; -
FIG. 3A illustrates a series of images with detected body parts superimposed thereon for a first subject condition; -
FIG. 3B illustrates a series of images with detected body parts superimposed thereon for a second subject condition; -
FIG. 4A is a series of 2D images and 3D shapes estimated therefrom represented as a linear combination of rotatable basis shapes for a first field of depth within the same field of view; -
FIG. 4B is a series of 2D images and 3D shapes estimated therefrom represented as a linear combination of rotatable basis shapes for a second field of depth within the same field of view; -
FIG. 5 graphically illustrates calculated features from reconstructed 3D model of DPM landmarks (DPM3D) compared with the features of a 3D model built based on manually annotated joints (GT3D); -
FIG. 6A graphically illustrates a comparison between the two main features displayed in FIG. 5 (foot distance and knee angle) for a first experiment (a), in which the features calculated from the reconstructed 3D model of DPM landmarks (DPM3D) are compared with the features of a 3D model built based on manually annotated joints (GT3D); -
FIG. 6B graphically illustrates a comparison between the two main features displayed in FIG. 5 (foot distance and knee angle) for a second experiment (b), in which the features calculated from the reconstructed 3D model of DPM landmarks (DPM3D) are compared with the features of a 3D model built based on manually annotated joints (GT3D); and, -
FIG. 7 graphically illustrates the variation of stride duration of the subject for the different conditions of FIG. 6. - The present disclosure sets forth systems and methods for performing an objective evaluation of different gait parameters by applying computer vision techniques that can use existing monitoring systems without substantial additional cost or equipment. Aspects of the present disclosure can perform assessment during a user's daily activity without the requirement to wear a device (e.g., a sensor or the like) or special clothing (e.g., uniform with distinct marks on certain joints of the person). Computer vision in accordance with the present disclosure can allow simultaneous, in-depth analysis of a higher number of parameters than current wearable systems. Unlike approaches utilizing wearable sensors, the present disclosure is not restricted or limited by power consumption requirements of sensors. The present disclosure can provide a consistent, objective measurement of gait parameters, which reduces error and variability incurred by subjective techniques. To achieve these goals, a body and gait representation is generated that can provide gait characteristics of an individual, while applying generally to classification and quantification of gait across individuals.
- The present disclosure sets forth the following approach to the problem of frontal-view gait abnormality detection, which can be performed with a single non-calibrated camera and extracts unique signatures from descriptors of the body's deformation. As in a real-life scenario, the subjects walk in a hallway toward or away from a camera. Aspects of the method can include: 1) detection of body parts as 2D landmarks by employing a pose estimation algorithm in each frame; 2) refinement of joint locations; 3) estimation of the 3D shape of each subject, given the set of 2D landmarks detected; 4) calculation, using the estimated 3D joint positions, of the variation of different features, such as the knee angle or the distance between the right and left feet; 5) extraction of multiple gait cycles from each sequence of features by detecting consecutive peaks in the aforementioned signals; and 6) use of stride length, stride duration and average amplitude of knee angle as features for quantification of gait status.
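The six aspects above can be sketched as a chained pipeline. The skeleton below is hypothetical, not the disclosed implementation: the stage names, the toy landmark format, and the trivial stand-in stages are all illustrative.

```python
# Hypothetical sketch of the six-stage flow; real implementations would plug
# in a pose estimator, joint refinement, and 3D reconstruction.

def run_pipeline(frames, detect, refine, reconstruct, extract, segment, quantify):
    """Chain the six stages over a video clip (a list of frames)."""
    landmarks2d = [detect(f) for f in frames]       # 1) 2D landmarks per frame
    refined = [refine(lm) for lm in landmarks2d]    # 2) refined joint locations
    joints3d = [reconstruct(lm) for lm in refined]  # 3) 3D shape estimation
    signals = extract(joints3d)                     # 4) e.g. knee angle, foot distance
    cycles = segment(signals)                       # 5) peak-to-peak gait cycles
    return quantify(cycles)                         # 6) stride metrics / gait status

# Toy run: identity stages, a triangular "foot distance" signal, and a
# segmenter that returns local-maximum indices.
ys = [0, 1, 2, 1, 0, 1, 2, 1, 0]
frames = [{"lfoot": (0.0, y)} for y in ys]
result = run_pipeline(
    frames,
    detect=lambda f: f,
    refine=lambda lm: lm,
    reconstruct=lambda lm: lm,
    extract=lambda js: [j["lfoot"][1] for j in js],
    segment=lambda sig: [i for i in range(1, len(sig) - 1)
                         if sig[i] > sig[i - 1] and sig[i] >= sig[i + 1]],
    quantify=lambda peaks: [b - a for a, b in zip(peaks, peaks[1:])],
)
# result holds the peak-to-peak stride durations, in frames
```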
- Aspects of the present disclosure are aimed at passive assessment of health conditions for settings, such as in-home and assisted living. The technologies can also apply to a clinical setting. In one embodiment, the system and method are directed to quantification from common walking settings, such as a hallway, where frontal views are available. The quantification can be used to understand the current state or progression of a degenerative condition or the recuperation after a medical procedure, such as knee replacement.
- The methods and systems described below further address the problem of segmenting gait cycles from video in the natural setting where the subject moves toward or away from the camera. The present method addresses this important imaging setting because it allows monitoring in real life home or assisted living settings where cameras can be mounted in hallways and multiple step cycles can be observed. Existing lateral-view methods are not well suited for this imaging condition. In one embodiment, a single frontal view point is used and a special pose or camera calibration is not used, which differentiates aspects of the present disclosure from existing technology. The method accommodates the change of scale as the individual walks toward (or away from) the camera. The gait cycles are extracted from descriptors of the body's deformation.
- With reference to
FIG. 1, a flow chart illustrates an exemplary process 2 in accordance with an aspect of the present disclosure. The exemplary method begins in step 10, wherein images of one or more subjects are acquired. This is typically performed by recording video or capturing multiple still images. It should be appreciated that step 10 includes acquiring a series of frames per gait cycle. While more frames per gait cycle can provide more information on details of movement within a cycle, at least two frames per cycle are needed to quantify the duration of a stride. - In
step 12, detection of body parts as 2D landmarks is performed by employing a pose estimation algorithm in each frame. In step 14, 2D joint locations are refined. In step 16, estimation of the 3D shape of each subject, given the set of refined 2D joint locations, is performed. In step 18, using the estimated 3D joint positions, the variation of different features such as the knee angle or the distance between the right and left feet is calculated. In step 20, extraction of multiple gait cycles from each sequence of features by detecting the consecutive peaks from the aforementioned signals is performed. In step 22, gait status quantification is performed using stride length, duration and average amplitude of knee angle as features. Each of steps 10-22 is further described below. - In
FIG. 2, an exemplary system 110 in accordance with the present disclosure is illustrated in block diagram form in connection with a patient space 122 such as a hallway, waiting room, or the like. It will be appreciated that the patient space 122 is exemplary, and that the system 110 can be implemented in virtually any location or setting (e.g., public or private spaces, etc.) provided suitable images of a subject approaching and/or departing can be obtained. In the exemplary embodiment, a plurality of cameras C1, C2 and C3 are positioned at different locations within the patient space 122. However, any number of cameras can be utilized. - The cameras C1, C2 and C3 are connected to a
computer 130 and supply visual data comprising one or more image frames thereto via a communication interface 132. It will be appreciated that the computer 130 can be a standalone unit configured specifically to perform the tasks associated with the aspects of this disclosure. In other embodiments, aspects of the disclosure can be integrated into existing systems, computers, etc. The communication interface 132 can be a wireless or wired communication interface depending on the application. The computer 130 further includes a central processing unit 136 coupled with a memory 138. Stored in the memory 138 are various modules including an image acquisition module 140, a gait analysis module 142, and a gait segmentation module 144. Visual data received from the cameras C1, C2 and C3 can be stored in the memory 138 for processing by the CPU 136 in accordance with this disclosure. It will further be appreciated that the various modules can be configured to carry out the functions described in detail in the following paragraphs. - With reference to
FIG. 3, and returning to the description of method 2 in FIG. 1, a pose estimation algorithm is employed at step 12 to find the approximate position of joints in each frame of the video. This is accomplished, for example, by applying the Flexible Part Model to each frame independently. In one approach, the focus is on the torso and lower limbs, so the model consists of eighteen (18) parts total, with basic parts including the head, neck, shoulders, waist, hips, knees and ankles. The number of shape mixtures per part varies and was estimated using hierarchical clustering. In one approach, five (5) mixtures were employed for the lower limbs, two (2) for the shoulders, three (3) for the head and neck, and one (1) for each of the remaining joints. - Then, the N-best pose solution is found per frame using the following method. Starting with a score configuration such as the one in Eq. (1), where z_i is the location of part i, with a local part score φ(z_i) and pairwise deformation model ψ(z_i, z_j), one can find the best configuration by backtracking from the root location with the highest score.
-
S(z) = Σ_{i∈V} φ(z_i) + Σ_{ij∈E} ψ(z_i, z_j)   Eq. (1) - Using the N-best algorithm, configurations are returned iteratively, ordered by score. Finally, by exploiting temporal context from neighboring frames, the poses are associated to find the best track in the whole video. The selected track is the best smooth track covering the whole temporal span of the video. To do that, for each frame t in the video, N candidate poses are generated, and for a particular pose one wants to maximize the score in Eq. (2), where Local(k_t) is the score of the candidate pose computed by Eq. (1).
-
Score(k) = Σ_t Local(k_t) + α Pairwise(k_t, k_{t−1})   Eq. (2) - At
step 14, the part detection module provides an estimate of the selected parts in the form of bounding boxes. Then, using this estimate, accurate locations of the joints are found. The corresponding landmarks are found in the 2D image using a set of regression models based on part locations estimated from a Deformable Part Model (DPM), for example. A regression model is trained for the x and y positions of each landmark separately, given the location of the detected bounding box of the corresponding landmark. -
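The per-landmark regression idea can be illustrated with a toy sketch. The disclosure does not specify the form of the regressor, so ordinary least squares in one variable is assumed here, and the training data are made up.

```python
# Illustrative only: one regression model per landmark coordinate, mapping a
# detected bounding-box center to a refined joint position. The regressor is
# assumed to be simple least squares; a real system could use any model.

def fit_1d(xs, ys):
    """Least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def refine_joint(box_centers, true_joints, query_center):
    """Train x- and y-regressors from annotated frames, then refine a new box."""
    ax, bx = fit_1d([c[0] for c in box_centers], [j[0] for j in true_joints])
    ay, by = fit_1d([c[1] for c in box_centers], [j[1] for j in true_joints])
    return (ax * query_center[0] + bx, ay * query_center[1] + by)

# Toy data: the true joint sits at a fixed offset (+2, -3) from the box center,
# so the learned model should reproduce that offset for a new detection.
centers = [(0.0, 10.0), (5.0, 12.0), (9.0, 20.0)]
joints = [(x + 2.0, y - 3.0) for x, y in centers]
refined = refine_joint(centers, joints, (4.0, 15.0))
```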
FIG. 3 shows examples of DPM overlays for frontal views of two individuals (a) and (b) walking along a hall. - An optional method that has the potential to improve feature extraction uses 3D shape reconstruction at
step 16. To utilize this method, a convex formulation is applied to reconstruct the 3D shape of each subject given a set of 2D landmarks in each frame. - The method employs a shape-space model, where a 3D shape is represented as a linear combination of rotatable basis shapes. Equation (3) below shows that the estimated shape S is a linear combination of k basis shapes B_i learned from training data, each rotated by a rotation matrix R_i and scaled by a coefficient c_i. The model is trained based on, for example, seven subjects from a dataset such as the Carnegie Mellon University Motion Capture Database (CMU MoCap dataset). The selected subjects perform different activities such as jumping, boxing and running, as well as walking. In one embodiment, an 11-joint model was trained by learning a dictionary of size 200 (k=200) from the training shapes aligned by the Procrustes method.
-
S = Σ_{i=1}^k c_i R_i B_i   Eq. (3) - Then, a convex relaxation approach is used to solve Eq. (4) and to reconstruct the 3D shape using Eq. (3) with the recovered coefficients and rotation matrices of the basis shapes.
-
min_{M_1, . . . , M_k} (1/2) ∥W − Σ_{i=1}^k M_i B_i∥_F^2 + λ Σ_{i=1}^k ∥M_i∥_2   Eq. (4), where W is the matrix of detected 2D landmarks and each M_i subsumes the product c_i R_i of Eq. (3).
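Given recovered coefficients and rotations, composing the shape of Eq. (3) is a direct sum of scaled, rotated basis shapes. The sketch below uses toy basis shapes and a z-axis rotation; a real system would learn the B_i from MoCap data and recover the c_i and R_i by solving Eq. (4).

```python
# Illustrative composition of Eq. (3): S = sum_i c_i * R_i * B_i, where each
# basis shape B_i is a 3 x P matrix whose columns are joint positions.
# The numbers here are toy values, not learned from any dataset.
import math

def rot_z(theta):
    """3x3 rotation about the z axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def matmul(A, B):
    return [[sum(A[r][k] * B[k][c] for k in range(len(B)))
             for c in range(len(B[0]))] for r in range(len(A))]

def compose_shape(coeffs, rotations, bases):
    """Accumulate S = sum_i c_i R_i B_i for 3 x P basis shapes."""
    P = len(bases[0][0])
    S = [[0.0] * P for _ in range(3)]
    for c, R, B in zip(coeffs, rotations, bases):
        RB = matmul(R, B)
        for r in range(3):
            for p in range(P):
                S[r][p] += c * RB[r][p]
    return S

# Two basis shapes with two joints each; identity and 90-degree z rotations.
B1 = [[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]]
B2 = [[0.0, 1.0], [0.0, 0.0], [1.0, 0.0]]
S = compose_shape([1.0, 2.0], [rot_z(0.0), rot_z(math.pi / 2)], [B1, B2])
```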
FIGS. 4A and 4B for the same subject at different fields of depth. - In
step 18, features are extracted from the 3D joint positions estimated from the 3D shape. The features that are estimated are both in the 3D space such as the distance between right and left knee, distance between right and left foot, the variation of left knee angle (affected leg) which is the angle between the knee-hip and knee-ankle segments, as well as the 2D space such as the oscillation of head. Some of these features are displayed inFIG. 5 for a sample sequence. The selected sequence depicts a pattern for normal walking. InFIG. 5 , the calculated features from reconstructed 3D model of DPM landmarks (DPM3D) has been compared with the features of 3D model built based on manually annotated joints (GT3D). - Gait cycle segmentation is performed in
step 20. Each feature extracted represents a gait cycle in a slightly different way from the others. One cycle for knee angle would be the distance between two consecutive peaks in the trajectory created by the whole sequence while for foot distance the distance between two adjacent peaks define a stride. Hence, from all or a selected subset of features, one can segment out gait cycles (peak-to-peak distance) from the sequence. From the segmented gait cycles, one can then estimate a set of metrics such as stride duration, stride length, and cadence that have demonstrated significant differences clinically among other movements for various diseases or injuries. For example, some research indicates that stride length decreases with progression of Parkinson's disease while stride duration (time) tends to not decrease. - At
step 22, the estimated set of metrics along with other features can then be employed for abnormal gait quantification depending on the application. - To simulate different types of abnormalities and test how well the selected features differentiate between them, experiments were performed where subjects walk with various ankle weights. The subjects walk back and forth in a hallway where two cameras are mounted at the front and end. Each subject wears an ankle weight of 2.5 and 7.5 lb. in each sequence. Finally, a sequence of normal gait for each subject where no ankle weight is worn is recorded.
-
FIG. 6 displays a comparison between the two main features displayed in FIG. 5 (foot distance and knee angle) for two different conditions, (a) and (b), of the same subject. The changes in stride duration (horizontal axis) caused by increasing the weights are clearly evident. These changes are summarized in FIG. 7 as the variation of stride duration for the same subject. -
FIGS. 6 and 7 illustrate but one example of the manner in which aspects of the present disclosure can be used to analyze gait characteristics. -
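In the spirit of the threshold comparison recited in the claims, a minimal quantification compares a stride metric against a baseline with a tolerance. The baseline and tolerance values below are illustrative assumptions, not values from the disclosure.

```python
# Illustrative abnormality flag: a gait is flagged when the mean stride
# duration deviates from a personal baseline by more than a chosen tolerance.
# Baseline and tolerance are made-up numbers for demonstration.

def is_abnormal(stride_durations, baseline=1.0, tolerance=0.25):
    """Return True if the mean stride duration deviates from baseline."""
    mean = sum(stride_durations) / len(stride_durations)
    return abs(mean - baseline) > tolerance

normal_flag = is_abnormal([0.95, 1.05, 1.0])   # mean near baseline
weighted_flag = is_abnormal([1.4, 1.5, 1.3])   # slowed strides, e.g. ankle weights
```

A deployed system would estimate the baseline per subject from previously recorded normal sequences rather than hard-coding it.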
-
- An approach for detection of human gait abnormality in a frontal-view scenario.
- Using DPM to locate joints for abnormal gait detection.
- Using the reconstructed 3D model of the human body in each frame as depth information of detected joints.
- Employing the variations of joint trajectories in 3D as features that abstract from individual gait characteristics but allow for the classification of gait across individuals.
- Achieving objective evaluation of different gait parameters.
- The system further provides repeatability, reproducibility and less interference from external factors by facilitating passive monitoring of subjects over long time periods and/or on multiple occasions.
- Being non-intrusive, with no need to place any device or markers on the subject during the experiments.
- Using a low-cost camera without expensive setups and expertise in operating the software.
- It will be appreciated that variants of the above-disclosed and other features and functions, or alternatives thereof, may be combined into many other different systems or applications. Various presently unforeseen or unanticipated alternatives, modifications, variations or improvements therein may be subsequently made by those skilled in the art which are also intended to be encompassed by the following claims.
Claims (20)
1. A computer-implemented method for gait analysis of a subject comprising:
obtaining visual data from an image capture device positioned in front of or behind the subject, the visual data comprising at least two image frames of the subject over a period of time walking toward or away from the image capture device, the at least two image frames capturing at least a portion of the gait of the subject;
detecting within the at least two images body parts as two-dimensional landmarks using a pose estimation algorithm on each of the at least two frames;
generating a joint model depicting the location of the at least one joint in each of the at least two frames;
using the joint model to segment a gait cycle for the at least one joint; and
comparing the gait cycle to a threshold value to detect abnormal gait.
2. The computer-implemented method for gait analysis of a subject as set forth in claim 1 , further comprising, prior to generating the joint model, estimating a three-dimensional shape of the subject using the two-dimensional landmarks, and estimating the at least one joint location based on the three-dimensional shape.
3. The computer-implemented method for gait analysis of a subject as set forth in claim 1 , wherein the joint model includes a deformable parts model.
4. The computer-implemented method for gait analysis of a subject as set forth in claim 1 , wherein the at least one joint includes an ankle, a knee, a hip, or other joint.
5. The computer-implemented method for gait analysis of a subject as set forth in claim 4 , wherein the gait cycle includes a distance between two consecutive peaks in a trajectory of a joint.
6. The computer-implemented method for gait analysis of a subject as set forth in claim 4 , wherein the gait cycle includes a distance between consecutive peaks in an angle of a joint or body part.
7. The computer-implemented method for gait analysis of a subject as set forth in claim 1 , wherein the obtaining visual data from an image capture device includes using a camera mounted in an elongated hallway in which the subject can walk toward and away from the camera.
8. A system for gait analysis of a subject comprising:
an image capture device operatively coupled to a data processing device and positioned in front of or behind the subject, the image capture device configured to capture visual data comprising at least two image frames of the subject over a period of time walking toward or away from the image capture device, the at least two image frames capturing at least a portion of the gait of the subject;
a processor-usable medium embodying computer code, said processor-usable medium being coupled to said data processing device, said computer code comprising instructions executable by said data processing device and configured for:
detecting, within the at least two images, body parts as two-dimensional landmarks using a pose estimation algorithm on each of the at least two frames;
generating a joint model depicting the location of the at least one joint in each of the at least two frames;
using the joint model to segment a gait cycle for the at least one joint; and
comparing the gait cycle to a threshold value to detect abnormal gait.
9. The system set forth in claim 8, wherein the instructions further comprise, prior to generating the joint model, estimating a three-dimensional shape of the subject using the two-dimensional landmarks, and estimating the at least one joint location based on the three-dimensional shape.
10. The system set forth in claim 8, wherein the joint model includes a deformable parts model.
11. The system set forth in claim 8, wherein the at least one joint includes an ankle, a knee, a hip, or other joint.
12. The system set forth in claim 11, wherein the gait cycle includes a distance between two consecutive peaks in a trajectory of a joint.
13. The system set forth in claim 11, wherein the gait cycle includes a distance between consecutive peaks in an angle of a joint or body part.
14. The system set forth in claim 8, wherein the image capture device is mounted in an elongated hallway in which the subject can walk toward and away from the camera.
15. A non-transitory computer-usable medium for gait analysis of a subject, said computer-usable medium embodying a computer program code, said computer program code comprising computer executable instructions configured for:
obtaining visual data from an image capture device positioned in front of or behind the subject, the visual data comprising at least two image frames of the subject over a period of time walking toward or away from the image capture device, the at least two image frames capturing at least a portion of the gait of the subject;
detecting, within the at least two images, body parts as two-dimensional landmarks using a pose estimation algorithm on each of the at least two frames;
generating a joint model depicting the location of the at least one joint in each of the at least two frames;
using the joint model to segment a gait cycle for the at least one joint; and
comparing the gait cycle to a threshold value to detect abnormal gait.
16. The non-transitory computer-usable medium as set forth in claim 15, wherein the instructions further comprise, prior to generating the joint model, estimating a three-dimensional shape of the subject using the two-dimensional landmarks, and estimating the at least one joint location based on the three-dimensional shape.
17. The non-transitory computer-usable medium as set forth in claim 15, wherein the joint model includes a deformable parts model.
18. The non-transitory computer-usable medium as set forth in claim 15, wherein the at least one joint includes an ankle, a knee, a hip, or other joint.
19. The non-transitory computer-usable medium as set forth in claim 18, wherein the gait cycle includes a distance between two consecutive peaks in a trajectory of a joint or a distance between consecutive peaks in an angle of a joint or body part.
20. The non-transitory computer-usable medium as set forth in claim 15, wherein the obtaining visual data from an image capture device includes using a camera mounted in an elongated hallway in which the subject can walk toward and away from the camera.
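A practical wrinkle of the frontal view is that the subject's image scale changes as they walk toward or away from the camera, so raw pixel trajectories drift even for a perfectly regular gait. The claims do not specify how this is handled; the snippet below shows one plausible per-frame normalization, dividing joint coordinates by the apparent head-to-foot height, purely as an illustrative assumption.

```python
import numpy as np

def normalize_by_apparent_height(joint_xy, head_xy, foot_xy):
    """Divide a joint's per-frame image coordinates by the subject's
    apparent height (head-to-foot pixel distance) in the same frame,
    so that trajectories from a subject walking toward or away from
    the camera remain comparable across frames.

    All arguments are (n_frames, 2) arrays of 2D landmark positions.
    """
    joint_xy = np.asarray(joint_xy, dtype=float)
    head_xy = np.asarray(head_xy, dtype=float)
    foot_xy = np.asarray(foot_xy, dtype=float)
    scale = np.linalg.norm(head_xy - foot_xy, axis=1)  # pixels, per frame
    return joint_xy / scale[:, np.newaxis]
```

After this normalization, the same peak-based segmentation can be applied to the scale-free trajectory.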
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/283,603 US20170243354A1 (en) | 2016-02-19 | 2016-10-03 | Automatic frontal-view gait segmentation for abnormal gait quantification |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201662297341P | 2016-02-19 | 2016-02-19 | |
| US15/283,603 US20170243354A1 (en) | 2016-02-19 | 2016-10-03 | Automatic frontal-view gait segmentation for abnormal gait quantification |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20170243354A1 true US20170243354A1 (en) | 2017-08-24 |
Family
ID=59629460
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/283,603 Abandoned US20170243354A1 (en) | 2016-02-19 | 2016-10-03 | Automatic frontal-view gait segmentation for abnormal gait quantification |
| US15/283,663 Active US9996739B2 (en) | 2016-02-19 | 2016-10-03 | System and method for automatic gait cycle segmentation |
| US15/283,629 Active US9993182B2 (en) | 2016-02-19 | 2016-10-03 | Computer vision system for ambient long-term gait assessment |
Family Applications After (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/283,663 Active US9996739B2 (en) | 2016-02-19 | 2016-10-03 | System and method for automatic gait cycle segmentation |
| US15/283,629 Active US9993182B2 (en) | 2016-02-19 | 2016-10-03 | Computer vision system for ambient long-term gait assessment |
Country Status (1)
| Country | Link |
|---|---|
| US (3) | US20170243354A1 (en) |
Cited By (22)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20170238846A1 (en) * | 2016-02-19 | 2017-08-24 | Xerox Corporation | Computer vision system for ambient long-term gait assessment |
| US20180357760A1 (en) * | 2017-06-09 | 2018-12-13 | Midea Group Co., Ltd. | System and method for care support at home |
| CN109063661A (en) * | 2018-08-09 | 2018-12-21 | 上海弈知信息科技有限公司 | Gait analysis method and device |
| CN109273090A (en) * | 2018-12-03 | 2019-01-25 | 东北大学 | A High Availability Gait Analysis Method Based on Fourier Transform |
| US20190150792A1 (en) * | 2017-11-17 | 2019-05-23 | Toyota Jidosha Kabushiki Kaisha | Gait evaluation apparatus, gait training system, and gait evaluation method |
| CN109871800A (en) * | 2019-02-13 | 2019-06-11 | 北京健康有益科技有限公司 | A kind of estimation method of human posture, device and storage medium |
| CN110728226A (en) * | 2019-10-09 | 2020-01-24 | 清华大学 | Gait quantification system and method based on motion recognition |
| WO2020247246A1 (en) * | 2019-06-07 | 2020-12-10 | Tellus You Care, Inc. | Non-contact identification of gait dynamics, patterns and abnormalities for elderly care |
| WO2020260635A1 (en) | 2019-06-26 | 2020-12-30 | Ekinnox | Method for analysing the gait of an individual |
| JP2021030050A (en) * | 2019-08-29 | 2021-03-01 | Panasonic Intellectual Property Corporation of America | Cognitive function evaluation method, cognitive function evaluation device, and cognitive function evaluation program |
| JP2021030051A (en) * | 2019-08-29 | 2021-03-01 | Panasonic Intellectual Property Corporation of America | Fall risk evaluation method, fall risk evaluation device and fall risk evaluation program |
| JP2021030049A (en) * | 2019-08-29 | 2021-03-01 | Panasonic Intellectual Property Corporation of America | Sarcopenia evaluation method, sarcopenia evaluation device, and sarcopenia evaluation program |
| CN112906599A (en) * | 2021-03-04 | 2021-06-04 | 杭州海康威视数字技术股份有限公司 | Gait-based personnel identity identification method and device and electronic equipment |
| WO2021175208A1 (en) * | 2020-03-02 | 2021-09-10 | 京东方科技集团股份有限公司 | Human body model modeling method and apparatus, electronic device and storage medium |
| JP2021133192A (en) * | 2020-02-28 | 2021-09-13 | Mitsubishi Chemical Holdings Corporation | Measurement system, program |
| JP2021133193A (en) * | 2020-02-28 | 2021-09-13 | Mitsubishi Chemical Holdings Corporation | Measurement system, method, program |
| US11166436B2 (en) * | 2016-04-28 | 2021-11-09 | Osaka University | Health condition estimation device |
| US20210346761A1 (en) * | 2020-05-06 | 2021-11-11 | Agile Human Performance, Inc. | Automated gait evaluation for retraining of running form using machine learning and digital video data |
| US20210365670A1 (en) * | 2011-01-12 | 2021-11-25 | Gary S. Shuster | Video and still image data alteration to enhance privacy |
| US11253173B1 (en) * | 2017-05-30 | 2022-02-22 | Verily Life Sciences Llc | Digital characterization of movement to detect and monitor disorders |
| CN115965994A (en) * | 2022-12-20 | 2023-04-14 | 中国科学院自动化研究所 | Gait quantification method and device based on low-resolution monocular camera |
| CN119523469A (en) * | 2024-11-15 | 2025-02-28 | 北京牧学计算机技术有限公司 | A gait analysis method and system for children's rehabilitation |
Families Citing this family (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10973440B1 (en) * | 2014-10-26 | 2021-04-13 | David Martin | Mobile control using gait velocity |
| TWM537280U (en) * | 2016-07-06 | 2017-02-21 | Idesyn Semiconductor Corp | Fall detection system analyzing fall severity, and wearable device thereof |
| US11468286B2 (en) * | 2017-05-30 | 2022-10-11 | Leica Microsystems Cms Gmbh | Prediction guided sequential data learning method |
| KR102510874B1 (en) * | 2017-06-15 | 2023-03-16 | 삼성전자주식회사 | Method for walking assist, and devices operating the same |
| US10796477B2 (en) * | 2017-06-20 | 2020-10-06 | Edx Technologies, Inc. | Methods, devices, and systems for determining field of view and producing augmented reality |
| CN107766819B (en) * | 2017-10-18 | 2021-06-18 | 陕西国际商贸学院 | A video surveillance system and its real-time gait recognition method |
| CN110837751B (en) * | 2018-08-15 | 2023-12-29 | 上海脉沃医疗科技有限公司 | Human motion capturing and gait analysis method based on RGBD depth camera |
| PT3656302T (en) * | 2018-11-26 | 2020-11-03 | Lindera Gmbh | System and method for human gait analysis |
| CN109528212B (en) * | 2018-12-29 | 2023-09-19 | 大连乾函科技有限公司 | Abnormal gait recognition equipment and method |
| US11179064B2 (en) * | 2018-12-30 | 2021-11-23 | Altum View Systems Inc. | Method and system for privacy-preserving fall detection |
| JP7097030B2 (en) * | 2019-04-05 | 2022-07-07 | 本田技研工業株式会社 | Subject's motion state observation system |
| CN110765946B (en) * | 2019-10-23 | 2022-07-29 | 北京卡路里信息技术有限公司 | Running posture assessment method, device, equipment and storage medium |
| US11284824B2 (en) * | 2019-12-02 | 2022-03-29 | Everseen Limited | Method and system for determining a human social behavior classification |
| CN113033264A (en) * | 2019-12-25 | 2021-06-25 | 中兴通讯股份有限公司 | Pedestrian retrieval method, server and storage medium |
| CN110974242B (en) * | 2019-12-26 | 2023-02-10 | 浙江福祉科创有限公司 | Gait abnormal degree evaluation method for wearable device and wearable device |
| CN111178338A (en) * | 2020-03-18 | 2020-05-19 | 福建中医药大学 | A method for establishing a database and a standardized model in a gait analysis system |
| US20230172491A1 (en) * | 2020-05-05 | 2023-06-08 | Stephen GROSSERODE | System and method for motion analysis including impairment, phase and frame detection |
| KR102297110B1 (en) * | 2020-05-07 | 2021-09-03 | 광주과학기술원 | Systems and methods for analyzing walking behavior |
| CN112704491B (en) * | 2020-12-28 | 2022-01-28 | 华南理工大学 | Lower limb gait prediction method based on attitude sensor and dynamic capture template data |
| ES3006733B2 (en) * | 2023-09-15 | 2025-07-18 | Consejo Superior Investigacion | GAIT SEGMENTATION METHOD |
| CN121214549B (en) * | 2025-09-26 | 2026-02-10 | 首都医科大学 | Rehabilitation patient gait image analysis and training system based on deep learning |
Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7330566B2 (en) * | 2003-05-15 | 2008-02-12 | Microsoft Corporation | Video-based gait recognition |
| US7421369B2 (en) * | 2005-06-09 | 2008-09-02 | Sony Corporation | Activity recognition apparatus, method and program |
| US7747409B2 (en) * | 2004-03-12 | 2010-06-29 | Vectronix Ag | Pedestrian navigation apparatus and method |
| US7804998B2 (en) * | 2006-03-09 | 2010-09-28 | The Board Of Trustees Of The Leland Stanford Junior University | Markerless motion capture system |
| US7857771B2 (en) * | 2003-04-03 | 2010-12-28 | University Of Virginia Patent Foundation | Method and system for the derivation of human gait characteristics and detecting falls passively from floor vibrations |
| US8073521B2 (en) * | 2003-09-19 | 2011-12-06 | Imatx, Inc. | Method for bone structure prognosis and simulated bone remodeling |
| US8206325B1 (en) * | 2007-10-12 | 2012-06-26 | Biosensics, L.L.C. | Ambulatory system for measuring and monitoring physical activity and risk of falling and for automatic fall detection |
| US8246354B2 (en) * | 2005-04-28 | 2012-08-21 | Simbex Llc | Training system and method using a dynamic perturbation platform |
| US8514236B2 (en) * | 2000-11-24 | 2013-08-20 | Cleversys, Inc. | System and method for animal gait characterization from bottom view using video analysis |
| US8854182B2 (en) * | 2011-06-14 | 2014-10-07 | International Business Machines Corporation | Opening management through gait detection |
Family Cites Families (22)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6923810B1 (en) * | 1988-06-13 | 2005-08-02 | Gary Karlin Michelson | Frusto-conical interbody spinal fusion implants |
| US6231527B1 (en) * | 1995-09-29 | 2001-05-15 | Nicholas Sol | Method and apparatus for biomechanical correction of gait and posture |
| US7084998B2 (en) | 2001-02-13 | 2006-08-01 | Ariba, Inc. | Method and system for processing files using a printer driver |
| US7227893B1 (en) | 2002-08-22 | 2007-06-05 | Xlabs Holdings, Llc | Application-specific object-based segmentation and recognition system |
| US7660439B1 (en) * | 2003-12-16 | 2010-02-09 | Verificon Corporation | Method and system for flow detection and motion analysis |
| WO2006034135A2 (en) | 2004-09-17 | 2006-03-30 | Proximex | Adaptive multi-modal integrated biometric identification detection and surveillance system |
| US7878990B2 (en) * | 2006-02-24 | 2011-02-01 | Al-Obaidi Saud M | Gait training device and method |
| KR100800874B1 (en) * | 2006-10-31 | 2008-02-04 | 삼성전자주식회사 | Stride length estimation method and portable terminal for same |
| WO2008066856A2 (en) * | 2006-11-27 | 2008-06-05 | Northeastern University | Patient specific ankle-foot orthotic device |
| US8300890B1 (en) | 2007-01-29 | 2012-10-30 | Intellivision Technologies Corporation | Person/object image and screening |
| JP2010017447A (en) | 2008-07-14 | 2010-01-28 | Nippon Telegr & Teleph Corp <Ntt> | Walking movement analyzer, walking movement analyzing method, walking movement analyzing program and its recording medium |
| US8154644B2 (en) | 2008-10-08 | 2012-04-10 | Sony Ericsson Mobile Communications Ab | System and method for manipulation of a digital image |
| US8447272B2 (en) * | 2009-11-25 | 2013-05-21 | Visa International Service Association | Authentication and human recognition transaction using a mobile device with an accelerometer |
| WO2012006549A2 (en) * | 2010-07-09 | 2012-01-12 | The Regents Of The University Of California | System comprised of sensors, communications, processing and inference on servers and other devices |
| EP2589979A1 (en) | 2011-11-03 | 2013-05-08 | Thales Nederland B.V. | System for characterizing motion of an individual, notably a human individual |
| DE102012212115B3 (en) * | 2012-07-11 | 2013-08-14 | Zebris Medical Gmbh | Treadmill assembly and method of operating such |
| WO2014112632A1 (en) * | 2013-01-18 | 2014-07-24 | 株式会社東芝 | Movement-information processing device and method |
| KR20140142463A (en) * | 2013-06-04 | 2014-12-12 | 한국전자통신연구원 | Apparatus and method of monitoring gait |
| CN115089444A (en) * | 2013-12-09 | 2022-09-23 | 哈佛大学校长及研究员协会 | Ways to Promote Gait Improvement |
| US9801568B2 (en) * | 2014-01-07 | 2017-10-31 | Purdue Research Foundation | Gait pattern analysis for predicting falls |
| US20170243354A1 (en) | 2016-02-19 | 2017-08-24 | Xerox Corporation | Automatic frontal-view gait segmentation for abnormal gait quantification |
| KR102487044B1 (en) * | 2016-04-14 | 2023-01-11 | 메드리듬스, 아이엔씨. | Systems and methods for neurologic rehabilitation |
- 2016
- 2016-10-03 US US15/283,603 patent/US20170243354A1/en not_active Abandoned
- 2016-10-03 US US15/283,663 patent/US9996739B2/en active Active
- 2016-10-03 US US15/283,629 patent/US9993182B2/en active Active
Patent Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8514236B2 (en) * | 2000-11-24 | 2013-08-20 | Cleversys, Inc. | System and method for animal gait characterization from bottom view using video analysis |
| US7857771B2 (en) * | 2003-04-03 | 2010-12-28 | University Of Virginia Patent Foundation | Method and system for the derivation of human gait characteristics and detecting falls passively from floor vibrations |
| US7330566B2 (en) * | 2003-05-15 | 2008-02-12 | Microsoft Corporation | Video-based gait recognition |
| US8073521B2 (en) * | 2003-09-19 | 2011-12-06 | Imatx, Inc. | Method for bone structure prognosis and simulated bone remodeling |
| US7747409B2 (en) * | 2004-03-12 | 2010-06-29 | Vectronix Ag | Pedestrian navigation apparatus and method |
| US8246354B2 (en) * | 2005-04-28 | 2012-08-21 | Simbex Llc | Training system and method using a dynamic perturbation platform |
| US7421369B2 (en) * | 2005-06-09 | 2008-09-02 | Sony Corporation | Activity recognition apparatus, method and program |
| US7804998B2 (en) * | 2006-03-09 | 2010-09-28 | The Board Of Trustees Of The Leland Stanford Junior University | Markerless motion capture system |
| US8206325B1 (en) * | 2007-10-12 | 2012-06-26 | Biosensics, L.L.C. | Ambulatory system for measuring and monitoring physical activity and risk of falling and for automatic fall detection |
| US8854182B2 (en) * | 2011-06-14 | 2014-10-07 | International Business Machines Corporation | Opening management through gait detection |
Cited By (42)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11600108B2 (en) * | 2011-01-12 | 2023-03-07 | Gary S. Shuster | Video and still image data alteration to enhance privacy |
| US20210365670A1 (en) * | 2011-01-12 | 2021-11-25 | Gary S. Shuster | Video and still image data alteration to enhance privacy |
| US9996739B2 (en) | 2016-02-19 | 2018-06-12 | Conduent Business Services, Llc | System and method for automatic gait cycle segmentation |
| US9993182B2 (en) * | 2016-02-19 | 2018-06-12 | Conduent Business Services, Llc | Computer vision system for ambient long-term gait assessment |
| US20170238846A1 (en) * | 2016-02-19 | 2017-08-24 | Xerox Corporation | Computer vision system for ambient long-term gait assessment |
| US11166436B2 (en) * | 2016-04-28 | 2021-11-09 | Osaka University | Health condition estimation device |
| US12419544B1 (en) * | 2017-05-30 | 2025-09-23 | Verily Life Sciences Llc | Digital characterization of movement to detect and monitor disorders |
| US11253173B1 (en) * | 2017-05-30 | 2022-02-22 | Verily Life Sciences Llc | Digital characterization of movement to detect and monitor disorders |
| US11998317B1 (en) * | 2017-05-30 | 2024-06-04 | Verily Life Sciences Llc | Digital characterization of movement to detect and monitor disorders |
| US10438136B2 (en) * | 2017-06-09 | 2019-10-08 | Midea Group Co., Ltd. | System and method for care support at home |
| US20180357760A1 (en) * | 2017-06-09 | 2018-12-13 | Midea Group Co., Ltd. | System and method for care support at home |
| US11690534B2 (en) * | 2017-11-17 | 2023-07-04 | Toyota Jidosha Kabushiki Kaisha | Gait evaluation apparatus, gait training system, and gait evaluation method |
| US20190150792A1 (en) * | 2017-11-17 | 2019-05-23 | Toyota Jidosha Kabushiki Kaisha | Gait evaluation apparatus, gait training system, and gait evaluation method |
| CN109063661A (en) * | 2018-08-09 | 2018-12-21 | 上海弈知信息科技有限公司 | Gait analysis method and device |
| CN109273090A (en) * | 2018-12-03 | 2019-01-25 | 东北大学 | A High Availability Gait Analysis Method Based on Fourier Transform |
| CN109871800A (en) * | 2019-02-13 | 2019-06-11 | 北京健康有益科技有限公司 | A kind of estimation method of human posture, device and storage medium |
| US11412957B2 (en) * | 2019-06-07 | 2022-08-16 | Tellus You Care, Inc. | Non-contact identification of gait dynamics, patterns and abnormalities for elderly care |
| WO2020247246A1 (en) * | 2019-06-07 | 2020-12-10 | Tellus You Care, Inc. | Non-contact identification of gait dynamics, patterns and abnormalities for elderly care |
| WO2020260635A1 (en) | 2019-06-26 | 2020-12-30 | Ekinnox | Method for analysing the gait of an individual |
| FR3097997A1 (en) | 2019-06-26 | 2021-01-01 | Ekinnox | Process for analyzing an individual's gait |
| US20210059596A1 (en) * | 2019-08-29 | 2021-03-04 | Panasonic Intellectual Property Corporation Of America | Cognitive function evaluation method, cognitive function evaluation device, and non-transitory computer-readable recording medium in which cognitive function evaluation program is recorded |
| US20210059614A1 (en) * | 2019-08-29 | 2021-03-04 | Panasonic Intellectual Property Corporation Of America | Sarcopenia evaluation method, sarcopenia evaluation device, and non-transitory computer-readable recording medium in which sarcopenia evaluation program is recorded |
| JP7439353B2 (en) | 2019-08-29 | 2024-02-28 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | Cognitive function evaluation method, cognitive function evaluation device, and cognitive function evaluation program |
| US11779260B2 (en) * | 2019-08-29 | 2023-10-10 | Panasonic Intellectual Property Corporation Of America | Cognitive function evaluation method, cognitive function evaluation device, and non-transitory computer-readable recording medium in which cognitive function evaluation program is recorded |
| JP7473355B2 (en) | 2019-08-29 | 2024-04-23 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | Fall risk assessment method, fall risk assessment device, and fall risk assessment program |
| JP2021030050A (en) * | 2019-08-29 | 2021-03-01 | Panasonic Intellectual Property Corporation of America | Cognitive function evaluation method, cognitive function evaluation device, and cognitive function evaluation program |
| CN112438722A (en) * | 2019-08-29 | 2021-03-05 | 松下电器(美国)知识产权公司 | Method and apparatus for evaluating muscle degeneration and storage medium |
| JP7473354B2 (en) | 2019-08-29 | 2024-04-23 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | Sarcopenia assessment method, sarcopenia assessment device, and sarcopenia assessment program |
| JP2021030049A (en) * | 2019-08-29 | 2021-03-01 | Panasonic Intellectual Property Corporation of America | Sarcopenia evaluation method, sarcopenia evaluation device, and sarcopenia evaluation program |
| JP2021030051A (en) * | 2019-08-29 | 2021-03-01 | Panasonic Intellectual Property Corporation of America | Fall risk evaluation method, fall risk evaluation device and fall risk evaluation program |
| CN110728226A (en) * | 2019-10-09 | 2020-01-24 | 清华大学 | Gait quantification system and method based on motion recognition |
| JP2021133192A (en) * | 2020-02-28 | 2021-09-13 | Mitsubishi Chemical Holdings Corporation | Measurement system, program |
| JP2021137539A (en) * | 2020-02-28 | 2021-09-16 | Mitsubishi Chemical Holdings Corporation | Measurement system, method, program |
| JP7419616B2 (en) | 2020-02-28 | 2024-01-23 | 株式会社Shosabi | Measurement systems, methods and programs |
| JP2021133193A (en) * | 2020-02-28 | 2021-09-13 | Mitsubishi Chemical Holdings Corporation | Measurement system, method, program |
| WO2021175208A1 (en) * | 2020-03-02 | 2021-09-10 | 京东方科技集团股份有限公司 | Human body model modeling method and apparatus, electronic device and storage medium |
| US12183109B2 (en) | 2020-03-02 | 2024-12-31 | Boe Technology Group Co., Ltd. | Modeling method and modeling device for human body model, electronic device, and storage medium |
| US11980790B2 (en) * | 2020-05-06 | 2024-05-14 | Agile Human Performance, Inc. | Automated gait evaluation for retraining of running form using machine learning and digital video data |
| US20210346761A1 (en) * | 2020-05-06 | 2021-11-11 | Agile Human Performance, Inc. | Automated gait evaluation for retraining of running form using machine learning and digital video data |
| CN112906599A (en) * | 2021-03-04 | 2021-06-04 | 杭州海康威视数字技术股份有限公司 | Gait-based personnel identity identification method and device and electronic equipment |
| CN115965994A (en) * | 2022-12-20 | 2023-04-14 | 中国科学院自动化研究所 | Gait quantification method and device based on low-resolution monocular camera |
| CN119523469A (en) * | 2024-11-15 | 2025-02-28 | 北京牧学计算机技术有限公司 | A gait analysis method and system for children's rehabilitation |
Also Published As
| Publication number | Publication date |
|---|---|
| US9993182B2 (en) | 2018-06-12 |
| US9996739B2 (en) | 2018-06-12 |
| US20170238846A1 (en) | 2017-08-24 |
| US20170243057A1 (en) | 2017-08-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20170243354A1 (en) | Automatic frontal-view gait segmentation for abnormal gait quantification | |
| Khokhlova et al. | Normal and pathological gait classification LSTM model | |
| Gu et al. | Markerless gait analysis based on a single RGB camera | |
| Yoo et al. | Automated markerless analysis of human gait motion for recognition and classification | |
| US10506952B2 (en) | Motion monitor | |
| CN109815858B (en) | Target user gait recognition system and method in daily environment | |
| Houmanfar et al. | Movement analysis of rehabilitation exercises: Distance metrics for measuring patient progress | |
| González et al. | Comparison between passive vision-based system and a wearable inertial-based system for estimating temporal gait parameters related to the GAITRite electronic walkway | |
| CN112438723B (en) | Cognitive function evaluation method, cognitive function evaluation device and storage medium | |
| Prakash et al. | Identification of spatio-temporal and kinematics parameters for 2-D optical gait analysis system using passive markers | |
| Kargar et al. | Automatic measurement of physical mobility in get-up-and-go test using kinect sensor | |
| Jung et al. | Deep neural network-based gait classification using wearable inertial sensor data | |
| Gaud et al. | Human gait analysis and activity recognition: A review | |
| Bora et al. | Understanding human gait: A survey of traits for biometrics and biomedical applications | |
| Jamsrandorj et al. | Vision-based gait events detection using deep convolutional neural networks | |
| Khan et al. | Computer vision methods for parkinsonian gait analysis: A review on patents | |
| Bejinariu et al. | Image processing for the rehabilitation assessment of locomotion injuries and post stroke disabilities | |
| CN117635695A (en) | A walking trajectory center of gravity analysis method and system suitable for home care scenarios | |
| Habibi et al. | An AI-driven camera-based platform for patient ambulation assessment | |
| Benenaula et al. | Classification of gait anomalies by using space-time parameters obtained with pose estimation | |
| Tsao et al. | Human gait analysis by body segmentation and center of gravity | |
| Oluwadare | Gait analysis on a smart floor for health monitoring | |
| Irfan et al. | Reliable gait measurements using smartphone vision sensor | |
| Pappas et al. | Video-based clinical gait analysis in Parkinson’s disease: A Novel approach using frontal plane videos and machine learning | |
| Vilas et al. | Gait Analysis in Hereditary Amyloidosis Associated to Variant Transthyretin |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: XEROX CORPORATION, CONNECTICUT Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAFAZZOLI, FAEZEH;XU, BEILEI;WU, WENCHENG;AND OTHERS;SIGNING DATES FROM 20160927 TO 20160928;REEL/FRAME:039921/0137 |
|
| STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |
|