
NO20240165A1 - System and method for calculating an enhanced anthropometric weighted measurement of exercise effectiveness - Google Patents

System and method for calculating an enhanced anthropometric weighted measurement of exercise effectiveness

Info

Publication number
NO20240165A1
NO20240165A1
Authority
NO
Norway
Prior art keywords
exercise
anthropometric
user
deviation
image sequence
Prior art date
Application number
NO20240165A
Inventor
Marek S Tatara
Jacek Niklewski
Tron Krosshaug
Kjell Heen
Original Assignee
Kinetech As
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kinetech As filed Critical Kinetech As
Priority to NO20240165A priority Critical patent/NO20240165A1/en
Priority to PCT/NO2025/050028 priority patent/WO2025178499A1/en
Publication of NO20240165A1 publication Critical patent/NO20240165A1/en

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/30 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/1118 Determining activity level
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/1126 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique
    • A61B5/1128 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique using image analysis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6895 Sport equipment
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/23 Recognition of whole body movements, e.g. for sport training
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006 Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A63B2024/0012 Comparing movements or motion sequences with a registered reference
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006 Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A63B2024/0012 Comparing movements or motion sequences with a registered reference
    • A63B2024/0015 Comparing movements or motion sequences with computerised simulations of movements or motion sequences, e.g. for generating an ideal template as reference to be achieved by the user

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Physiology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Social Psychology (AREA)
  • Theoretical Computer Science (AREA)
  • Psychiatry (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Geometry (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Description

SYSTEM AND METHOD FOR CALCULATING AN ENHANCED
ANTHROPOMETRIC WEIGHTED MEASUREMENT OF EXERCISE
EFFECTIVENESS
Field of the invention
The present invention relates to a system and method for biometric analysis of a user executing an exercise, and more particularly to AI analysis of said exercise to provide a numerical efficiency value associated with the user’s execution of said exercise.
Background
Exercise and physical fitness are recognized as essential components of a healthy lifestyle. A regular exercise routine can contribute significantly to improved physical and mental well-being, reduced risk of chronic diseases, and enhanced quality of life. Consequently, there is a growing global emphasis on promoting physical activity and fitness.
Furthermore, human healthcare resources are expensive and scarce. When these resources must regularly serve a chronic-disease group that makes up a large proportion of the population, treatment becomes a socioeconomic problem.
Musculoskeletal diseases such as osteoarthritis are the disease group that affects the greatest number of people at the greatest cost. An effective treatment for this patient group, in order to maintain functional capacity and quality of life, is to perform specially adapted physical exercises.
However, the effectiveness and safety of exercise programs are critically dependent on the accurate performance of specific exercises and training techniques. The improper execution of exercises can lead to a range of problems, including suboptimal fitness results, increased risk of injury, and discouragement among individuals attempting to adopt a fitness regimen. In the field of musculoskeletal diseases, current treatment takes place under professional supervision which is very resource-intensive and costly. Many therefore do not receive the treatment they need.
One fundamental challenge is that individuals, especially beginners, often lack access to qualified fitness trainers or instructors who can provide guidance and correction during exercise sessions. Furthermore, even when fitness trainers are available, they may not be able to closely monitor each client's form and technique continuously.
Incorrect exercise training can manifest in various ways, in particular, performing exercises with improper posture, body mechanics, or range of motion, leading to reduced effectiveness and an increased risk of injury. Other issues associated with incorrect exercising may be over-exertion, under-exertion, and inconsistent progress since the absence of a consistent and accurate assessment of performance can hinder individuals from tracking their progress effectively.
Most exercise training aids rely on static, non-interactive resources, such as printed materials and video tutorials. These methods lack the feedback and guidance necessary to ensure proper exercise form and technique throughout an entire workout. Personal training sessions are, for most people, too expensive to attend with the required regularity.
Furthermore, current technical solutions which incorporate computer-assessed exercise training lack the precision required to ensure injury-free training.
The present invention provides a method and system to address the above-described problems with incorrect, injury-prone methods of training and/or performing exercise-based treatments. Furthermore, the invention seeks to provide enhanced computer-assisted training to ensure correct and optimal training, thereby achieving the maximum potential from each training session.
Summary of the Invention
According to a first aspect of the invention there is provided a method for calculating an enhanced anthropometric weighted measurement of exercise effectiveness comprising: selecting an exercise from a selection of predetermined reference exercises, wherein the reference exercises are based on a subject having a reference anthropometry; accessing a database to retrieve a predetermined set of rules for said selected exercise; providing an image sequence having a plurality of frames of a user performing the selected exercise; analysing the image sequence to extract anthropometric information about the user; calculating a percentage ratio between the reference anthropometry and the user’s anthropometry for body parts associated with the selected exercise; calculating a difference in a load level based on the anthropometric difference between the reference exercise and analysis of anthropometry of the image sequence; adjusting the predetermined set of rules for said selected exercise to account for the difference in load level; measuring one or more angles of body parts and/or joints of the user performing the selected exercise from the image sequence; analysing measurements of the one or more angles of body parts and/or joints against the adjusted set of rules to assess an efficiency of the exercise.
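The sequence of calculations in the first aspect can be sketched as a minimal computation. The function names, segment labels and the simplified load model (load scaling linearly with moment arm length) below are illustrative assumptions, not part of the claimed method:

```python
G = 9.81  # gravitational acceleration (m/s^2)

def anthropometric_ratio(reference_lengths, user_lengths):
    """Percentage ratio between the reference anthropometry and the
    user's anthropometry for each body segment."""
    return {seg: 100.0 * user_lengths[seg] / reference_lengths[seg]
            for seg in reference_lengths}

def load_difference(mass_kg, ref_arm_m, user_arm_m):
    """Difference in load level (torque, N*m) caused by the user's moment
    arm differing in length from the reference moment arm."""
    return mass_kg * G * (user_arm_m - ref_arm_m)

def score_exercise(measured_angles, angle_rules):
    """Fraction of measured joint angles falling within the adjusted
    rule ranges, as a simple efficiency measure."""
    ok = sum(1 for joint, angle in measured_angles.items()
             if angle_rules[joint][0] <= angle <= angle_rules[joint][1])
    return ok / len(measured_angles)
```

For example, a user whose femur is 0.50 m against a 0.45 m reference yields a ratio of roughly 111%, and the rule ranges for the affected joints would then be adjusted before scoring.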
Image analysis of each frame in the image sequence estimates an anthropometry of the user in their body position for said frame, wherein the estimated anthropometry may comprise a plurality of estimated key points representing corresponding apexes of an angle of an associated joint.
Calculating a difference in a load level based on the anthropometric difference between the reference exercise and analysis of anthropometry of the image sequence may comprise: measuring a moment arm length from a first line of gravity through a first associated key point to a second line of gravity through a second associated key point; multiplying the moment arm length by a total weight through the associated moment and by gravitational acceleration.
Anthropometric information about the user may comprise a length of one or more body parts of a user and a user’s body proportions.
The predetermined set of rules may comprise at least one of: posture angles; and a relationship of a body part to a horizontal plane.
The method may further comprise configuring a benchmark exercise by: refining the reference exercise to further identify predetermined segments to focus analysis on during analysis of the image sequence; setting threshold values for angles for the predetermined segments; setting a threshold frequency for one or more predetermined moments; and setting anthropometric values for the various segments for the corresponding reference exercise; wherein the benchmark exercise is used for a comparison analysis with the image sequence.
The method may further comprise analysing the image sequence to determine a duration and/or frequency of each repetition and incorporating this analysis into the assessment of the efficiency of the exercise.
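The duration and frequency of repetitions can be derived from the frame indices at which each repetition begins and ends; the following sketch assumes a fixed frame rate and hypothetical repetition-boundary detection performed upstream:

```python
def repetition_stats(boundary_frames, fps=30):
    """Durations (in seconds) of each repetition and the overall frequency
    (repetitions per minute), given the frame indices at which successive
    repetitions start/end in the image sequence."""
    durations = [(b - a) / fps
                 for a, b in zip(boundary_frames, boundary_frames[1:])]
    total_s = (boundary_frames[-1] - boundary_frames[0]) / fps
    frequency = 60.0 * len(durations) / total_s if total_s else 0.0
    return durations, frequency
```

Splitting each boundary pair further at the turning point of the movement would allow the eccentric and concentric phases to be timed separately, as described below.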
A time of eccentric extension and a time of concentric extension may be measured separately.
Analysing the image sequence to extract anthropometric information about the user may be performed using computer vision.
The method may further comprise: assessing whether an anthropometric deviation between a real anthropometry of a user and an estimated anthropometry on a frame-by-frame basis is above a predetermined threshold; and correcting for said anthropometric deviation if it is determined that the anthropometric deviation is above the predetermined threshold.
Correcting for said anthropometric deviation may comprise: discarding measurements from a corresponding frame in which the anthropometric deviation is above the predetermined threshold; and extrapolating replacement measurements from a previous frame to the discarded frame and a subsequent frame from the discarded frame, provided the previous and subsequent frames have an anthropometric deviation below the predetermined threshold.
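This discard-and-extrapolate correction may be illustrated with simple linear interpolation between the neighbouring acceptable frames; in practice a prediction filter may be used instead, and all names below are illustrative:

```python
def correct_deviant_frames(measurements, deviations, threshold):
    """Replace measurements from frames whose anthropometric deviation
    exceeds the threshold, interpolating between the nearest valid
    neighbouring frames; with no valid neighbours, no value is produced."""
    corrected = list(measurements)
    for i, dev in enumerate(deviations):
        if dev <= threshold:
            continue  # frame is acceptable, keep its measurement
        prev_i = next((j for j in range(i - 1, -1, -1)
                       if deviations[j] <= threshold), None)
        next_i = next((j for j in range(i + 1, len(deviations))
                       if deviations[j] <= threshold), None)
        if prev_i is not None and next_i is not None:
            # Linear interpolation between the surrounding valid frames
            t = (i - prev_i) / (next_i - prev_i)
            corrected[i] = (measurements[prev_i]
                            + t * (measurements[next_i] - measurements[prev_i]))
        else:
            corrected[i] = None  # no valid neighbours: no joint angle calculated
    return corrected
```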
The image sequence may have a threshold acceptable number of frames in which the anthropometric deviation is above the predetermined threshold, and/or a predetermined number of such frames from a discarded frame, wherein if the threshold number of frames is exceeded, no joint angle is calculated.
Replacement measurements may be extrapolated from the previous and subsequent frames using a prediction filter.
Correcting for said anthropometric deviation may comprise: identifying a key point generated from computer vision anthropometric analysis associated with an anthropometric deviation above the acceptable threshold and replacing said computer vision generated key point with a corrected key point wherein the corrected key point is calculated using the anthropometric deviation.
The method may further comprise prioritising angles which have a higher percentage change for the associated exercise over angles which have a lower percentage change when assessing anthropometric deviation.
The method may further comprise identifying estimated key points which have an anthropometric deviation below a predetermined threshold as acceptable key points; and using the acceptable key points and known anthropometric measurements to reconstruct a plurality of key points in a three-dimensional perspective using inverse kinematics.
According to a second aspect of the invention, there is provided a method for augmenting feedback media indicative of exercise effectiveness using the enhanced anthropometric weighted measurement, comprising the steps of the first aspect of the invention and augmenting a feedback media indicative of the efficiency of the exercise.
According to a third aspect of the invention there is provided a system for calculating an enhanced anthropometric weighted measurement of exercise effectiveness comprising: a first memory module comprising a database with a plurality of reference exercise information stored thereupon; an image acquisition module comprising a camera for obtaining an image sequence of a user performing a selected exercise; an image processing module for analysing the image sequence to extract anthropometric information about the user; a second storage module comprising computer readable media comprising instructions to perform the tasks of: calculating a percentage ratio between the reference anthropometry and the user’s anthropometry for body parts associated with the selected exercise; calculating a difference in a load level based on the anthropometric difference between the reference exercise and analysis of anthropometry of the image sequence; adjusting the predetermined set of rules for said selected exercise to account for the difference in load level; measuring one or more angles of body parts and/or joints of the user performing the selected exercise from the image sequence; analysing measurements of the one or more angles of body parts and/or joints against the adjusted set of rules to assess an efficiency of the exercise; and a processor to execute the instructions stored in the second storage module.
The computer readable media of the second storage module may further comprise instructions to perform the tasks of the first or the second aspect of the invention.
The system may be in the form of an application downloaded onto, or accessed by, a user’s personal computer device.
The method of the first or second aspect, or the system of the third aspect, wherein the one or more angles of body parts and/or joints may be measured, and/or the resulting measurements may be analysed, in real-time.
Brief Description of the Drawings
Fig. 1 is a flowchart of the method of the invention herein;
Fig. 2a shows a first example exercise in the form of a weighted squat;
Fig. 2b shows a second example exercise in the form of a free weighted bicep curl;
Fig. 2c shows a measuring of load level based on the anthropometric difference between the reference exercise and analysis of anthropometry of the image sequence;
Fig. 2d shows a squat exercise used in a measuring of load level;
Fig. 3 shows a graphical representation of deviations between an actual length of a bone-segments and an estimated length;
Fig. 4 shows a visual representation of a method for correcting an anthropometric deviation;
Fig. 5 shows a system according to the present invention.
Fig. 6 is a series of screen shots of a user interface of the system corresponding to a plurality of frames from an image sequence of a first exercise; and
Fig. 7 is a series of screen shots of a user interface of the system corresponding to a plurality of frames from an image sequence of a second exercise.
Definitions
Unless otherwise defined, all terms of art, notations and other scientific terms or terminology used herein are intended to have the meanings commonly understood by those of skill in the art to which this invention pertains. In some cases, terms with commonly understood meanings are defined herein for clarity and/or for ready reference, and the inclusion of such definitions herein should not necessarily be construed to represent a substantial difference over what is generally understood in the art.
In this text, the term ‘anthropometric’ refers to the scientific study of the measurements and proportions of the human body and involves the systematic measurement of various body dimensions, such as height, weight, length, girth, and breadth, as well as the analysis of their relationships and proportions.
In this text, the term ‘biomechanics’ refers to the study of the mechanical principles that operate on our biological systems and includes the mechanical forces that are particularly linked to the musculoskeletal system.
In this text, the term ‘exercise’ refers to a particular movement and/or position of a user’s body and/or posture, wherein said particular movement and/or posture is for the purpose of developing a strength and/or flexibility of the user’s body if the movement and/or posture is performed over time.
Detailed Description
The method and system of the present invention combines biomechanical research and human pose estimation with artificial intelligence-based analysis to analyse a correctness of a user’s exercise training, provide a numerical efficiency score on said correctness level, and provide instruction to improve said efficiency score. The technology identifies and compensates for any anthropometric deviations during the biomechanical analysis. The anthropometric measurements are used to set individual biomechanical threshold values for the user to achieve optimal effect from each individual exercise.
Figure 1 is a flow chart of a method of the invention herein. The method starts at 102 wherein a reference exercise item is selected (retrieved) from a plurality of reference exercise items. Exercise is used in the context of this application to refer to a particular movement and/or position of a user’s body and/or posture, wherein said particular movement and/or posture is for the purpose of developing a strength and/or flexibility of the user’s body if the movement and/or posture is performed over time. The exercise may be a strength-based exercise, a flexibility-based exercise, a mobility-based exercise, a balance-based exercise or a stability-based exercise. Most preferably, the exercise is a strength-based exercise.
However, exercises that are for the purpose of physical therapy, occupational therapy, and even aerobic exercise are within the scope of the invention. Wherein the exercise is a strength-based exercise, the exercise may be a weighted exercise or a bodyweight exercise. Preferably, each reference exercise item comprises a particular exercise of repeated movement of a subject body and may or may not include the use of weights. To name a few non-limiting examples, the particular exercise of the reference exercise item may be: squats, push-ups, lunges, crunches, jumping jacks, burpees, tricep dips, leg raises, high knees, calf raises, side leg raises, and box jumps. Each reference exercise item comprises research-based information about an effect of said exercise on a subject’s body. In a first example, this information includes which movements constitute the most optimal muscle stimulation during an exercise, for example, which posture angles and movements constitute the most optimal muscle stimulation. Posture angles include particular joint angles and/or angles of a body part with respect to a horizontal plane.
To provide an illustrative example, a first reference exercise item relates to a squat with a bar weight as shown in figure 2a. The reference exercise item comprises information about the optimum angle of the lower leg with respect to the thigh, said optimum angle having been identified through research of a series of potential angles and measuring their effectiveness on muscle and skeletal strain. The information may also comprise an optimum angle, or range of angles, for the subject’s back with respect to a horizontal plane. The horizontal plane is defined as 90 degrees from the directional force of gravity or parallel to a surface on which the subject is standing. Information may further include an optimal head position, an optimal position of the bar weight with respect to the subject, an optimal angle of the subject’s forearm with respect to their upper arm.
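A reference exercise item of this kind could, for example, be represented as a structured record; the field names and angle ranges below are illustrative placeholders, not values derived from the research described herein:

```python
# Hypothetical representation of a reference exercise item for a barbell squat.
# All numeric ranges and segment lengths are illustrative placeholders.
squat_reference = {
    "name": "barbell squat",
    "reference_anthropometry": {"femur_m": 0.45, "tibia_m": 0.42, "torso_m": 0.52},
    "rules": {
        # optimum angle ranges in degrees (min, max)
        "knee_angle": (80, 100),         # lower leg relative to the thigh at depth
        "back_to_horizontal": (45, 60),  # back relative to the horizontal plane
        "elbow_angle": (30, 60),         # forearm relative to the upper arm on the bar
    },
}

def rule_for(item, joint):
    """Look up the threshold angle range for a given joint or segment."""
    return item["rules"][joint]
```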
To provide a second illustrative example, a second reference exercise item relates to a bicep curl as shown in figure 2b. The second reference exercise item comprises information about the optimum angle of the forearm with respect to the upper arm, said optimum angle having been identified through research of a series of potential angles and measuring a resultant effectiveness on muscle and skeletal strain. The information may also comprise an optimum angle, or range of angles, for the subject’s arm with respect to their torso. Information may further include an optimal head position, and an optimal position of the free weight in the subject’s hand.
After selection and retrieval of the reference exercise item, the method then progresses to step 104, wherein the reference exercise is used to configure a benchmark exercise package.
Alternatively, a plurality of benchmark exercise packages are pre-generated from the plurality of reference exercises and stored in accessible storage media such that a user can directly select a desired benchmark exercise package.
Each benchmark exercise package comprises a series of specified (predetermined) key performance indicators for the associated exercise. These key performance indicators may include threshold posture angles and optimum frequency for associated exercise moments. The benchmark exercise package may provide identification of the primary segments of a subject’s body involved in the reference exercise and further provide anthropometric values for a reference subject performing the associated reference exercise.
Each benchmark exercise package is derived from evidence-based research of the corresponding exercise. For example, this research includes measuring muscle strain across a range of posture angles to determine optimum posture angles of key segments involved in the exercise. Alternatively or additionally, benchmark values can also be based on physical laws.
Having selected and retrieved the reference exercise item and/or benchmark exercise package, at step 106 an image sequence is provided. The image sequence is of a user performing an exercise corresponding to the reference exercise. The image sequence can be provided as a pre-recording or may be a live video stream.
At step 108, the image sequence is analysed. Biomechanical analysis is performed to extract anthropometric measurements of the user 110. The basis of biomechanical analysis is known as human pose estimation (HPE). This is the technology that, based on computer vision (image analysis), calculates the lengths and angles of limbs and joints. These estimations are first done in two dimensions.
More specifically, human pose estimation is a computer vision task that involves detecting and locating key points on the human body, such as joints and body parts, in images or videos. A spatial arrangement of a person's body is determined and the pose or posture the person is in is realised. Deep learning models, such as convolutional neural networks and/or other deep learning architectures, are commonly used for this task. These models are trained on large datasets with annotated human poses to learn the spatial relationships between body parts. The key points which are identified on the human body may be joints, such as at least one of ankles, knees, hips, wrists, elbows and shoulders, and/or body parts, such as the head and/or the torso. Some non-limiting examples of pose estimation models include OpenPose®, PoseNet®, and HRNet (High-Resolution Network)®. Some non-limiting example datasets for training and evaluating pose estimation models include Common Objects in Context (COCO)®, MPII Human Pose®, and PoseTrack®.
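Given two-dimensional key points from an HPE model, a posture angle can be computed as the angle at the middle key point; this sketch uses plain vector arithmetic, and the coordinates in the test values are hypothetical:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at key point b, formed by the segments b->a and
    b->c (e.g. the knee angle from the hip, knee and ankle key points)."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))
```

The same computation applies to an angle against the horizontal plane by taking one of the two segments to be a horizontal unit vector.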
At 112, a more accurate calculation of forces that the particular reference exercise has on the user’s body as a result of the impact of the user’s particular anthropometry is measured. A comparison between the extracted anthropometric measurements and the reference anthropometry of the reference subject is calculated. Preferably, the comparison is calculated as a percentage ratio. The calculated comparison, for example in the form of the percentage ratio, can be used to adjust the posture angles given in the reference exercise item and/or the benchmark exercise package to provide a personalised set of posture angles accounting for the user’s particular anthropometry and to measure the real forces on various parts of the user’s body as a result of the exercise.
Figures 2c and 2d demonstrate a more accurate calculation of forces of a particular reference exercise on the user’s body as a result of the impact of the user’s particular anthropometry for an example squat exercise.
In particular, figures 2c and 2d demonstrate the principles and key points for calculating which forces a squat exercise generates, and the impact a different femur length of the user, relative to the reference anthropometry, has on the effect of the exercise.
In figure 2c, dotted lines 9, 10, 11 are perpendicular lines used as reference lines to calculate lines on which moment arms lie. Line 9 represents a perpendicular line through the user’s hip 8, line 10 represents a line of gravity from a weight 15 used in the exercise, and line 11 represents a perpendicular line through the user’s knee 4.
Moment arm 6 extends from perpendicular reference line 10 (i.e. the line of gravity from the free weight) to the hip 8. Moment arm 7 extends from perpendicular reference line 10 to the knee 4. More particularly, at the deepest position, the perpendicular line 9 is calculated for the lowest point (i.e., the user’s hip 8).
Perpendicular line 10, showing the force of gravity, forms the basis for other calculations. At the deepest position, the perpendicular line 11 for the knee 4 is calculated. The bold solid lines 12 show a simplified skeleton for a squat as demonstrated in the image of figure 2d. The dotted line 13 shows the equivalent simplified skeleton for a squat for a person with longer femurs. The dotted line 14 shows the posture of the person with long femurs when achieving the same torque as the person with short femurs, that is, when the hip joint aligns with the perpendicular line 9 of the reference person with shorter femurs.
The below equations 1 to 3 show an example calculation of load on a hip joint for a reference person (with shorter femurs) and a comparison with a user of the system (with longer femurs when compared with the reference person).
Equation 1 calculates the load on a hip joint during a squat exercise such as the squat exercise shown in figure 2d:

Load = d × m × g (Equation 1)

where d is the length of moment arm 6, m is the total mass acting through the associated moment, and g is gravitational acceleration.

Equation 2 provides the load on the hip joint in the specific example of figure 2d, wherein the moment arm 6 is 0.27 m and the weight is 102 kg. The example values of equation 2 represent values of the reference exercise, in this example:

Load = 0.27 m × 102 kg × 9.81 m/s² ≈ 270 Nm (Equation 2)

Equation 3 below provides the load on the hip joint of the real user of the system having the same posture as the reference exercise, with a correspondingly longer moment arm d′:

Load = d′ × 102 kg × 9.81 m/s² (Equation 3)
The real user has a 5 cm longer femur which results, according to equation 3, in a 22% greater load on the hip joint. The above calculation of load assumes that the calf angle is the same so that only the length of the moment arm on the hip is increased.
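This load comparison can be reproduced numerically. The user's moment arm length of 0.33 m used below is an assumption inferred from the stated 22% increase, not a value given explicitly above:

```python
G = 9.81  # gravitational acceleration (m/s^2)

def hip_load(total_mass_kg, moment_arm_m):
    """Load (torque, N*m) on the hip joint: moment arm x total weight x g."""
    return moment_arm_m * total_mass_kg * G

reference_load = hip_load(102, 0.27)  # reference person, per equation 2
user_load = hip_load(102, 0.33)       # assumed longer moment arm for the user
increase = user_load / reference_load - 1  # fractional increase in load
```

With these assumed values, the reference load is roughly 270 Nm and the user's load is roughly 22% greater, consistent with the figure stated above.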
In some examples, the method comprises a further step of identifying and correcting for an anthropometric deviation. In this step, at least one real anthropometric value of the user is known. This at least one real anthropometric value is compared with the corresponding extracted value to check for an anthropometric deviation between said measured and real values. When an anthropometric deviation is identified which is above a predetermined threshold, the method progresses to carry out steps to correct for this deviation.
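A minimal sketch of this deviation check might look as follows. The function names, the per-segment dictionary layout and the 5 cm threshold are illustrative assumptions, not values prescribed by the method.

```python
def anthropometric_deviation(real_lengths, estimated_lengths):
    """Per-segment deviation between known and HPE-estimated lengths [m]."""
    return {seg: estimated_lengths[seg] - real_lengths[seg] for seg in real_lengths}

def needs_correction(real_lengths, estimated_lengths, threshold_m=0.05):
    """True if any segment deviates by more than the (assumed) 5 cm threshold."""
    dev = anthropometric_deviation(real_lengths, estimated_lengths)
    return any(abs(d) > threshold_m for d in dev.values())

real = {"femur": 0.46, "lower_leg": 0.41}
estimated = {"femur": 0.29, "lower_leg": 0.40}  # femur badly underestimated
print(needs_correction(real, estimated))         # True
```
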
Figure 3 shows an illustrative example of deviations between the actual length of bone segments and the estimated length extracted during the image analysis step 108. The figure shows anthropometric deviations occurring during image analysis according to step 108 for a squat exercise, with each individual line representing a leg segment. In the example of figure 3, the focus is on the user’s left leg, because it is the most prominent, undergoes the greatest motion and shows the greatest identified deviation. The x-axis is the timeline measured in video frames at 30 frames per second (FPS); the y-axis is the metric deviation between actual and estimated anthropometry. Figure 3 shows that the anthropometric deviation appears to increase proportionally with the change of angle. In the example of figure 3, the posture angle is a knee angle. The anthropometric deviation for the femur, which has the greatest movement, varies from about -17 cm to about -27 cm. The difference between the highest and lowest deviation is approximately 10 cm, which, in this example, is identified as a significant deviation. For comparison, the deviation for the lower leg, which moves less, fluctuates by only 2.5 cm. Figure 3 thus demonstrates that there can be a significant discrepancy between actual and estimated anthropometric measurements. For this particular exercise example of a squat, there is evidence of a pattern in which the deviation increases proportionally with motion, although this may not apply to other exercises and/or other bone segments. The method can compensate for such deviation in a way that is, to the greatest extent possible, independent of motion, camera position and the particular HPE model.
In a first method ALT.1 for correcting for anthropometric deviation, a frame in which the deviation is identified is discarded for the purpose of obtaining a measured posture angle of the user. This method is preferred when the anthropometric deviation is a result of a bone segment of the user being estimated as too short or too long compared to the known length of that bone segment, such that a key point position is anomalous. A replacement posture angle is calculated by extrapolating the key point positions of both a previous and a subsequent frame having an acceptable or no anthropometric deviation to provide a predicted replacement key point. This predicted replacement key point is then used in the calculation of the corresponding posture angle associated with that frame.
The first method ALT.1 for correcting for anthropometric deviation has limitation criteria. A first limitation criterion in the first method ALT.1 is that a previous frame (i.e. fn-x) to be used in the prediction extrapolation is within a predetermined threshold number of previous frames away from the discarded frame (e.g. up to 3 frames away from the rejected frame, x=3). A second limitation criterion is that the frame used for the prediction extrapolation must have an anthropometric deviation beneath a predetermined threshold. If the immediately adjacent previous frame (fn-1) also has an unacceptable anthropometric deviation, it cannot be used in the prediction estimation. In some examples, the next previous frame (fn-2) is used in the prediction estimation provided that frame has an acceptable anthropometric deviation. Thus, the method of ALT.1 includes reviewing the previous frames in sequence until a frame having an acceptable anthropometric deviation is identified, and this frame is used in the prediction calculation, provided said identified frame is within the threshold number of previous frames away from the particular discarded frame (i.e. fn).
In a similar manner, a subsequent frame (fn+x) to be used in the prediction extrapolation is a predetermined threshold number of frames (x) away from the particular discarded frame (fn).
If no previous/subsequent frames having an acceptable anthropometric deviation are identified within the predetermined threshold number of frames (x), a posture angle is not calculated for that frame using the ALT.1 method.
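The neighbour-search logic of ALT.1 described above can be sketched as follows, assuming per-frame absolute deviation values and an illustrative 5 cm acceptability threshold; all names are hypothetical.

```python
def find_valid_neighbor(deviations, n, direction, max_offset=3, threshold=0.05):
    """Search frames n±1..n±max_offset for one with acceptable deviation.

    deviations: per-frame absolute anthropometric deviation values [m].
    direction: -1 to search previous frames, +1 for subsequent frames.
    Returns the frame index, or None if none qualifies within the margin.
    """
    for offset in range(1, max_offset + 1):
        i = n + direction * offset
        if 0 <= i < len(deviations) and abs(deviations[i]) <= threshold:
            return i
    return None

# Frame 4 is rejected; frame 3 also deviates, so frame 2 is used instead.
devs = [0.01, 0.02, 0.03, 0.09, 0.12, 0.02, 0.01]
prev = find_valid_neighbor(devs, 4, -1)  # -> 2
nxt = find_valid_neighbor(devs, 4, +1)   # -> 5
```
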
The method of ALT.1 is suitable where the deviation only applies to a few frames in the plurality of frames constituting the image sequence. In addition, the method of ALT.1 is particularly suitable when the quality of closely related frames is high. The quality of a frame is considered high, at least in part, if the anthropometric deviation is below a predetermined threshold. Preferably, the number of frames in the image sequence having an anthropometric deviation above the threshold deviation is between 5 and 20, more preferably fewer than ten consecutive frames.
Since the method ALT.1 uses information from previous and subsequent frames (fn-x, fn+x), there is a time delay to the posture angle analysis. In an illustrative example wherein there is a four-frame margin and a frame rate of 30 frames per second, the time delay will be approximately thirteen hundredths of a second.
In a second method ALT.2 for correcting for anthropometric deviation, the frame is not discarded for the purposes of calculating posture angles. Instead, a position of one or more key points is recalculated. Preferably, each of the one or more key points is an apex of a corresponding posture angle associated with the selected exercise.
Figure 4 shows a visual representation of the method of ALT.1 or ALT.2 for correcting for anthropometric deviation, wherein a key point is recalculated using alternative information. The lines 302, 304, 306 and 308 represent posture angles for a sequence of frames as the user moves to perform a squat exercise. The user moves downwards into the squat through four frames in the illustrative figure. The line 306, representing a posture angle between a user’s back and thigh in frame 37 of 41, has been identified as having an anthropometric deviation above an acceptable threshold. The solid line represents part of the measured posture angle before correction. Point A represents a key point of the measured anthropometry of the user in the second of the four frames, and is an apex of the posture angle between the user’s back and the user’s thigh. Point B represents a key point of the measured anthropometry of the user in the third of the four frames, and is the apex of the posture angle between the user’s back and the user’s thigh at a later time. Point D represents a key point of the measured anthropometry of the user in the fourth of the four frames, and is the apex of the posture angle between the user’s back and the user’s thigh at a still later time. Since the length of the user’s thigh has been incorrectly measured by the image analysis (HPE), key point B is in an incorrect position. Point C is the corrected key point recalculated using the methods of ALT.1 or ALT.2. Corrected point C can then be used to calculate the measured posture angle between the user’s back and the user’s thigh. This corrected measured posture angle is then the posture angle analysed against the adjusted reference and/or benchmark exercise for correctness, and to ultimately establish a numerical efficiency value.
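One plausible way to implement such a key point correction, in the spirit of ALT.2, is to move the estimated key point along its segment direction until the segment matches the user’s known bone length. This is a hedged sketch under assumed 2D coordinate conventions; the function name is hypothetical.

```python
import math

def correct_key_point(hip, knee_est, femur_len):
    """Slide the estimated knee key point along the hip->knee direction so
    the thigh segment matches the user's known femur length [m]."""
    dx, dy = knee_est[0] - hip[0], knee_est[1] - hip[1]
    est_len = math.hypot(dx, dy)       # HPE-estimated thigh length
    scale = femur_len / est_len        # correction factor
    return (hip[0] + dx * scale, hip[1] + dy * scale)

# Thigh estimated ~0.36 m but the user's femur is known to be 0.46 m.
corrected = correct_key_point((0.0, 0.0), (0.3, -0.2), femur_len=0.46)
```
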
In a third method ALT.3 for correcting for anthropometric deviation, each and every approved key point is identified, and the identified key points and known anthropometric measurements of the user are fed into an inverse kinematics model to render a three-dimensional reconstruction of key points of a user’s body. The inverse kinematics model can determine the posture angles required to achieve the specific approved key point positions and orientations and thus derive a three-dimensional anthropometric model. As mentioned above, approved key points are those wherein an anthropometric deviation between an image analysis (HPE) measurement and a known body proportion is below a predetermined threshold.
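As a simplified illustration of the inverse kinematics step, the classic two-segment planar solution (law of cosines) shows how joint angles can be recovered from a target key point position and known segment lengths. The actual ALT.3 model would operate on a full three-dimensional body; this sketch and its names are assumptions.

```python
import math

def two_link_ik(target_x, target_y, l1, l2):
    """Planar two-segment inverse kinematics (e.g. thigh l1 + lower leg l2).

    Given a desired end-point position relative to the proximal joint and
    known segment lengths, return (proximal_angle, distal_angle) in radians.
    """
    d2 = target_x ** 2 + target_y ** 2
    # Law of cosines for the inner (distal) joint angle.
    cos_distal = (d2 - l1 ** 2 - l2 ** 2) / (2 * l1 * l2)
    cos_distal = max(-1.0, min(1.0, cos_distal))  # clamp numerical noise
    distal = math.acos(cos_distal)
    proximal = math.atan2(target_y, target_x) - math.atan2(
        l2 * math.sin(distal), l1 + l2 * math.cos(distal))
    return proximal, distal

# Recover joint angles placing the distal key point at (0.5, -0.3) with
# assumed segment lengths 0.46 m and 0.41 m.
hip_a, knee_a = two_link_ik(0.5, -0.3, 0.46, 0.41)
```
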
The deviation correction methods can be applied to both the initial anthropometric measurement analysis at the start of a user’s exercise and on a frame-by-frame basis for each re-adjusted position as the user moves through the exercise.
Where the deviation correction is applied to the anthropometric model of each frame, the result is referred to as a deviation adjusted-extracted anthropometric model (DAEM).
At step 116, the deviation adjusted-extracted anthropometric model (DAEM) for each of the frames is assessed against one or more preset parameters with respect to predetermined ranges of values. At least one of these preset parameters against which the DAEM is assessed is posture angle, wherein the measured posture angle of the user in the image sequence is compared to the anthropometrically adjusted posture angle value or range. Preferably, the one or more parameters against which the DAEM is assessed are the key performance indicators of the benchmark exercise package and, thus, include threshold posture angles and optimum frequency for associated exercise moments. Assessment of the DAEM over a plurality of frames constitutes an assessment of a user’s execution of the exercise. In a preferred example, the three-dimensional anthropometric model is used. This enables the user’s execution of the exercise to be assessed from different angles, for example, from a profile point of view and from a frontal plane point of view. Beneficially, this multi-view, multi-angle analysis feature of the invention can be achieved using only a single camera, such as a camera on the user’s phone. Assessing the execution of an exercise from different angles/views can be useful since one of the angles may reveal a posture angle error which appears correct in the primary view.
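A minimal sketch of assessing a frame’s measured posture angles against predetermined ranges might look as follows. The benchmark values and names are invented for illustration and are not values from the benchmark exercise packages.

```python
# Hypothetical benchmark parameters for a squat: acceptable posture-angle
# ranges in degrees, after anthropometric adjustment (assumed values).
BENCHMARK = {
    "knee_angle": (70.0, 100.0),
    "hip_angle": (60.0, 95.0),
}

def assess_frame(measured_angles, benchmark=BENCHMARK):
    """Return the parameters whose measured value falls outside its range."""
    errors = {}
    for name, (lo, hi) in benchmark.items():
        value = measured_angles.get(name)
        if value is not None and not (lo <= value <= hi):
            errors[name] = value
    return errors

print(assess_frame({"knee_angle": 110.0, "hip_angle": 80.0}))  # {'knee_angle': 110.0}
```
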
At step 118, the frame-by-frame analysis of the measured parameters against predetermined values is used to calculate an effectiveness score. This effectiveness score can then be transmitted to the user in the form of augmented feedback media.
In addition to tracking the correlation between the reference/benchmark exercise and the user’s image sequence, the method and system of the invention can further automatically track valid repetitions. The tracking of repetitions may comprise tracking pace and consistency. Vertical movement velocity may provide a useful indication of relative effort; thus, the user’s relative effort may be estimated by monitoring the pace. Based on measurements of velocity, the method may further include dynamically adapting the relevant benchmark exercise package. The dynamic adaptation of the benchmark exercise package can occur in real-time for optimal effect achievement. Alternatively, dynamic adaptation of the benchmark exercise package can occur periodically. In a particular illustrative example, the pace is analysed to assess whether the last repetitions take longer than the first, which may indicate an appropriate dosing of the exercise performed. In the squat exercise example, the average time for the first three lifts can be compared to the last three lifts in a set of a plurality of repetitions. If the average time for the last three repetitions is shorter than for the first three repetitions, this is recorded and optionally a visual or audible alert is given. By tracking the pace of the repetitions in a set, unfortunate jerks can be uncovered during a user’s execution of an exercise. These jerks can be both ineffective and harmful; ideally, the majority of exercises are carried out with smooth movements.
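The squat-example pace comparison above (average of the first three versus the last three repetitions) can be sketched as follows; the function name is hypothetical.

```python
def pace_alert(rep_durations_s, n=3):
    """Compare the average duration of the first and last n repetitions.

    Following the squat example in the text: if the last repetitions are on
    average faster than the first, an alert condition is flagged.
    """
    first = sum(rep_durations_s[:n]) / n
    last = sum(rep_durations_s[-n:]) / n
    return last < first

print(pace_alert([2.0, 2.1, 2.2, 2.0, 1.8, 1.7]))  # True: last reps were faster
```
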
The efficiency score (i.e. the enhanced anthropometric weighted measurement associated with exercise effectiveness) can then be based on a biomechanical analysis of a combination of posture angle and motion.
The invention will now be described with reference to a specific example wherein a selected exercise is a body-weighted squat.
Figure 5 shows a second aspect of the invention of a system 400 for analysing a correlation between an image sequence of a user performing an action against an anthropometrically adjusted reference action. The system has a first memory storage module 402, an image acquisition module 404, an image processing module 406, a second memory storage module 408, a processor 410, and a user interface 412.
The first memory storage module 402 comprises a database with a plurality of reference exercise items. In a special case, the reference exercise items are benchmark exercise packages and include further key performance indicators as described above. Preferably, the first storage module 402 is non-volatile memory and may be ROM, HDD, SSD or cloud storage.
The image acquisition module 404 comprises a camera and is configured to take an image sequence (video) of a user performing an action, for example a selected exercise. The acquired image sequence can be stored in a cache memory 405 for access by the image processing module 406 and the processor 410. The image processing module 406 is configured to retrieve the image sequence and perform an image analysis (HPE) to extract anthropometric information about the user, to track a correlation between the user’s action and a reference action, and optionally to identify anthropometric deviations between measured anthropometry of the user and known anthropometry of the user. The processor 410 is configured to execute instructions stored in the second storage module 408 to perform the tasks of steps 112, 114, 116 (in combination with the image processing module), 118 and 120 of the method of the first aspect of the invention described herein. Preferably, the second storage module 408 is non-volatile memory and may be ROM, HDD, SSD or cloud storage.
The user interface 412 enables a user to control the system 400 including selecting a reference exercise item, starting an image acquisition, and commencing analysis. The user interface also provides the user with information. This information is preferably information about the correlation between the posture angles of the user’s performed exercise against optimal posture angles provided in the reference exercise. The information can be displayed as an overlaid rendering of the reference posture angles and/or real posture angles. It can include visual indicators of the amount of correlation such as colour coding, line flashing or differing line style. The information can also be visually displayed to the user via other symbology. For example, a numerical efficiency score can be provided on the user interface. The efficiency score can be provided in real-time or can be given at the end of an exercise, or at the end of every repetition of an exercise. In some examples, there is an efficiency score for a plurality of body parts.
Figure 6 shows a series of screen shots of the user interface corresponding to a plurality of frames from an image sequence of a first exercise with overlaid visual enhanced biomechanical analysis.
Figure 7 is a series of screen shots of the user interface of the system corresponding to a plurality of frames from an image sequence of the exercise from the second view point.
Alternatively or in addition to the display of visual information, feedback on the correctness of the execution of the exercise may be delivered audibly. This has the benefit that the user does not have to adjust their position to receive the information, since adjusting their position may result in incorrect exercise execution.
Thus, the augmented feedback media to convey exercise efficiency of the method and system herein can comprise overlaid rendering of the reference posture angles and/or real posture angles, visual indicators of the amount of correlation such as colour coding, line flashing or differing line style, other visual or audible symbology, and even physical indicia such as vibrations from a connected vibratable device, to name some non-limiting examples.
Data processing, computation and/or analysis associated with the system and method herein described may take place in the form of software as a service (SaaS) or edge-based processing. In an example, the system is implemented as an application on a user’s personal computing device such as their mobile phone or tablet. The application can request permission to use the computing device hardware such as its camera and storage. In this way, no additional accessories are needed to realise the system or implement the method of the invention.
In addition to showing the effect and results of single exercises, the method and system of the invention can enable the user to track their development over time. Data derived from the analysis of each exercise session can be saved and statistical analysis performed thereon. In this way, an improvement on a particular exercise and/or posture correctness with respect to a particular body part or joint can be tracked over time.
The present invention combines evidence-based research, which gives rise to the specialised benchmark exercise packages, with artificial intelligence and biomechanics-based analysis to provide a method and system for carrying out correct treatment safely without expert supervision. A particular benefit of the method and system of the invention is that treatment can be performed in a patient’s/user's own home, without the presence of a healthcare professional.
A user only needs the technical competency level of standard use of a personal computing device, installing applications and taking a video recording.
The system and method uncover exercises that are ineffective or directly harmful if performed incorrectly. In the event of deviations or incorrectly performed exercises, information to this effect is communicated to the user through the user interface or otherwise. Instructions for correction can be communicated to the user.
A communication connection with a user profile of the system can be established with a healthcare professional such as a physiotherapist for professional overview and assessment. In this way computational analysis, artificial intelligence and human intervention can be combined to provide improved treatment. In a particular example, the communication link with a healthcare professional is only established when the efficiency score dips below a predetermined threshold. Alternatively or additionally, an image sequence associated with an executed exercise wherein a predetermined threshold efficiency score was not met can be stored in memory and marked to indicate that said image sequence should be evaluated by the healthcare professional. The healthcare professional can then access the saved files for retrospective review and review several exercises/exercise sessions in one go. The healthcare provider or personal trainer can review the feedback provided by the computational analysis of the method, review the image sequence and provide their own feedback. They can add notes to the associated exercise.
The system and method of the invention provide a more efficient, economic, safe and diverse treatment to a larger proportion of a user group. They are particularly suited to the fields of physiotherapy treatment and workout training.
Having described preferred examples of the invention it will be apparent to those skilled in the art that other embodiments incorporating the invention may be used. These and other examples of the invention illustrated above are intended by way of example only and the actual scope of the invention is to be determined from the appended claims.

Claims (21)

PATENT CLAIMS
1. A method for measuring an enhanced anthropometric weighted measurement associated with exercise effectiveness comprising:
selecting an exercise from a selection of predetermined reference exercises, wherein the reference exercises are based on a subject having a reference anthropometry;
accessing a database to retrieve a predetermined set of rules for said selected exercise;
providing an image sequence having a plurality of frames of a user performing the selected exercise;
analysing the image sequence to extract anthropometric information about the user;
calculating a percentage ratio between the reference anthropometry and the user’s anthropometry for body parts associated with the selected exercise;
calculating a difference in a load level based on the anthropometric difference between the reference exercise and analysis of anthropometry of the image sequence;
adjusting the predetermined set of rules for said selected exercise to account for the difference in load level;
measuring one or more angles of body parts and/or joints of the user performing the selected exercise from the image sequence;
analysing measurements of the one or more angles of body parts and/or joints against the adjusted set of rules to assess an efficiency of the exercise.
2. The method of claim 1, wherein image analysis of each frame in the image sequence estimates an anthropometry of the user in their body position for said frame, wherein the estimated anthropometry comprises a plurality of estimated key points representing corresponding apexes of an angle of an associated joint.
3. The method of claim 2, wherein calculating a difference in a load level based on the anthropometric difference between the reference exercise and analysis of anthropometry of the image sequence comprises:
measuring a moment arm length from a first line of gravity through a first associated key point to a second line of gravity through a second associated key point;
multiplying the moment arm length by a total weight through the associated moment and by gravitational acceleration.
4. The method of claim 2 or 3, wherein anthropometric information about the user comprises a length of one or more body parts of a user and a user’s body proportions.
5. The method of any preceding claim, wherein said predetermined set of rules comprises at least one of:
posture angles; and
a relationship a body part has to a horizontal plane.
6. The method of any preceding claim, further comprising configuring a benchmark exercise by:
refining the reference exercise to further identify predetermined segments to focus analysis on during analysis of the image sequence;
setting threshold values for angles for the predetermined segments;
setting a threshold frequency for one or more predetermined moments; and
setting anthropometric values for the various segments for the corresponding reference exercise;
wherein the benchmark exercise is used for a comparison analysis with the image sequence.
7. The method of any preceding claim, further comprising analysing the image sequence to determine a duration and/or frequency of each repetition and incorporating this analysis into the assessment of the efficiency of the exercise.
8. The method of claim 6, wherein a time of eccentric extension and a time of concentric extension are measured separately.
9. The method of any preceding claim, wherein analysing the image sequence to extract anthropometric information about the user is performed using computer vision.
10. The method of any preceding claim, further comprising:
assessing whether an anthropometric deviation between a real anthropometry of a user and an estimated anthropometry on a frame-by-frame basis is above a predetermined threshold; and
correcting for said anthropometric deviation if it is determined that the anthropometric deviation is above the predetermined threshold.
11. The method of claim 10, wherein correcting for said anthropometric deviation comprises:
discarding measurements from a corresponding frame in which the anthropometric deviation is above the predetermined threshold;
extrapolating replacement measurements from a previous frame to the discarded frame and a subsequent frame from the discarded frame, provided the previous and subsequent frames have an anthropometric deviation below the predetermined threshold.
12. The method of claim 11, wherein the image sequence has a threshold acceptable number of frames in which the anthropometric deviation is above the predetermined threshold and/or a predetermined threshold number of frames from a discarded frame, wherein, if the threshold number of frames is exceeded, no joint angle is calculated.
13. The method of claim 11 or 12, wherein the replacement measurements are extrapolated from the previous and subsequent frames using a prediction filter.
14. The method of claim 10 when dependent on claim 9, wherein correcting for said anthropometric deviation comprises:
identifying a key point generated from computer vision anthropometric analysis associated with an anthropometric deviation above the acceptable threshold; and
replacing said computer vision generated key point with a corrected key point wherein the corrected key point is calculated using the anthropometric deviation.
15. The method of any of claims 10 to 14, further comprising:
prioritising angles which have a higher percentage change for the associated exercise over angles which have a lower percentage change when assessing anthropometric deviation.
16. The method of claim 10, further comprising:
identifying estimated key points which have an anthropometric deviation below a predetermined threshold as acceptable key points; and
using the acceptable key points and known anthropometric measurements to reconstruct a plurality of key points in a three-dimensional perspective using inverse kinematics.
17. The method of any of claims 1 to 16, further for augmenting feedback media indicative of exercise effectiveness using the enhanced anthropometric weighted measurement, comprising
the steps of any of claims 1 to 16; and
augmenting a feedback media indicative of the efficiency of the exercise.
18. A system for calculating an enhanced anthropometric weighted measurement of exercise effectiveness comprising:
a first memory module comprising a database with a plurality of reference exercise information stored thereupon;
an image acquisition module comprising a camera for obtaining an image sequence of a user performing a selected exercise;
an image processing module for analysing the image sequence to extract anthropometric information about the user;
a second storage module comprising computer readable media comprising instructions to perform the tasks of:
calculating a percentage ratio between the reference anthropometry and the user’s anthropometry for body parts associated with the selected exercise;
calculating a difference in a load level based on the anthropometric difference between the reference exercise and analysis of anthropometry of the image sequence;
adjusting the predetermined set of rules for said selected exercise to account for the difference in load level;
measuring one or more angles of body parts and/or joints of the user performing the selected exercise from the image sequence;
analysing measurements of the one or more angles of body parts and/or joints against the adjusted set of rules to assess an efficiency of the exercise; and
a processor to execute the instructions stored in the second storage module.
19. The system of claim 18, wherein the computer readable media of the second storage module further comprises instructions to perform the tasks of any of claims 2 to 17.
20. The system of claim 18 or 19, wherein the system is in the form of an application downloaded onto, or accessed by, a user’s personal computing device.
21. The method of any of claims 1 to 17, or the system of any of claims 18 to 20, wherein the one or more angles of body parts and/or joints are measured, and/or the resulting measurements are analysed, in real-time.
NO20240165A 2024-02-23 2024-02-23 System and method for calculating an enhanced anthropometric weighted measurement of exercise effectiveness NO20240165A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
NO20240165A NO20240165A1 (en) 2024-02-23 2024-02-23 System and method for calculating an enhanced anthropometric weighted measurement of exercise effectiveness
PCT/NO2025/050028 WO2025178499A1 (en) 2024-02-23 2025-02-21 System and method for enhanced anthropometric analysis of exercise effectiveness

Publications (1)

Publication Number Publication Date
NO20240165A1 true NO20240165A1 (en) 2025-08-25


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2609858A1 (en) * 2011-12-28 2013-07-03 Samsung Electronics Co., Ltd Method for measuring quantity of exercise and display apparatus thereof
US20130171601A1 (en) * 2010-09-22 2013-07-04 Panasonic Corporation Exercise assisting system
KR102294261B1 (en) * 2020-12-22 2021-08-26 정재훈 System for evaluating exercise motion for coaching strength training

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20240198177A1 (en) * 2020-03-04 2024-06-20 Peloton Interactive, Inc. Exercise instruction and feedback systems and methods
CN111652078A (en) * 2020-05-11 2020-09-11 浙江大学 A computer vision-based yoga action guidance system and method
CN112597933B (en) * 2020-12-29 2023-10-20 咪咕互动娱乐有限公司 Action scoring method, device and readable storage medium
CN115445170B (en) * 2022-07-30 2024-06-25 华为技术有限公司 A kind of exercise reminder method and related equipment


Also Published As

Publication number Publication date
WO2025178499A1 (en) 2025-08-28
