
GB2641713A - Monitoring limb movement - Google Patents

Monitoring limb movement

Info

Publication number
GB2641713A
GB2641713A
Authority
GB
United Kingdom
Prior art keywords
data
voluntary
limb
subject
features
Prior art date
Legal status
Pending
Application number
GB2407239.9A
Other versions
GB202407239D0 (en)
Inventor
Fusari Gianpaolo
Quan Xueyuan
Gibbs Ella
Current Assignee
Ip2ipo Innovations Ltd
Original Assignee
Imperial College Innovations Ltd
Application filed by Imperial College Innovations Ltd filed Critical Imperial College Innovations Ltd
Priority to GB2407239.9A priority Critical patent/GB2641713A/en
Publication of GB202407239D0 publication Critical patent/GB202407239D0/en
Priority to PCT/GB2025/051106 priority patent/WO2025243030A1/en
Publication of GB2641713A publication Critical patent/GB2641713A/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103: Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B 5/1101: Detecting tremor
    • A61B 5/1107: Measuring contraction of parts of the body, e.g. organ or muscle
    • A61B 5/1123: Discriminating type of movement, e.g. walking or running
    • A61B 5/1124: Determining motor skills
    • A61B 5/113: Measuring movement of the entire body or parts thereof occurring during breathing
    • A61B 5/40: Detecting, measuring or recording for evaluating the nervous system
    • A61B 5/4076: Diagnosing or monitoring particular conditions of the nervous system
    • A61B 5/4082: Diagnosing or monitoring movement diseases, e.g. Parkinson, Huntington or Tourette
    • A61B 5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801: Specially adapted to be attached to or worn on the body surface
    • A61B 5/6813: Specially adapted to be attached to a specific body part
    • A61B 5/6824: Arm or wrist
    • A61B 5/6828: Leg
    • A61B 5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235: Details of waveform analysis
    • A61B 5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267: Classification involving training the classification device
    • A61B 2505/00: Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B 2505/09: Rehabilitation or training


Abstract

Measuring voluntary limb movements of a subject comprises receiving N-axis accelerometer data S1 corresponding to at least one limb of the subject and a measurement period, where N is greater than or equal to two. A data model is applied to classify S4 the accelerometer data as corresponding to a voluntary limb movement or a non-voluntary limb movement of the subject. The classification of the measurement period is then stored S5. In another aspect, there is a system with an apparatus for securing to at least one limb of a subject, which comprises an accelerometer; a data processing device is communicatively coupled to the apparatus. A first, pre-trained machine learning model may be trained to classify the accelerometer data. A plurality of labelled measurement periods may be received, and a second, fine-tuning machine learning model may be trained to map the classifications of the first model to final classifications in a supervised learning process, using the labelled measurement periods as ground truth.

Description

[0001] Monitoring limb movement
[0002] Field
[0003] The present invention relates to a method and devices for measuring limb movements and distinguishing between voluntary and involuntary movements.
[0004] Background
[0005] After stroke, or other brain injuries, repetitive movement physical therapy is a common standard for rehabilitation. Existing models of care rely on one-to-one sessions for physical therapy. However, delivery of this therapy is resource intensive, and may fall short, especially after patients are discharged from acute clinical care.
[0006] Technological innovations may provide options to remediate and enhance rehabilitation services, for example by: tracking rehabilitation activity and progress over time; allowing self-management of rehabilitation; and facilitating remote access and monitoring for clinicians as well as goal-setting.
[0007] Fusari et al, "Protocol for a feasibility study of OnTrack: a digital system for upper limb rehabilitation after stroke", https://doi.org/10.1136/bmjopen-2019-034936, describes a protocol including software for smart devices and coaching support and relating to treatment of arm weakness after stroke.
[0008] Other proposals for remote delivery and/or monitoring of physical therapy are described in US 2014/0081661 A1, US 2019/0200915 A1, US 2022/0062392 A1, WO 2019/207346 A1, US 2020/0258631 A1, EP 4 094 684 A1, US 11 302 448 B1, WO 2016/081830 A1, US 10 130 311 B1, and US 2022/0020469 A1.
[0009] Summary
[0010] According to a first aspect of the invention, there is provided a method of measuring voluntary limb movements of a subject. The method includes receiving N-axis accelerometer data corresponding to at least one limb of the subject and a measurement period. N is greater than or equal to two. The method also includes applying a data model to classify the accelerometer data as corresponding to a voluntary limb movement or a non-voluntary limb movement of the subject. The method also includes storing the classification of the measurement period.
[0011] Applying a data model to classify the accelerometer data may include determining a plurality of features of the accelerometer data, and applying the data model to the plurality of features.
[0012] The plurality of features may include:
    • a first feature indicative of an average value;
    • a second feature indicative of a lower bound value;
    • a third feature indicative of an upper bound value;
    • a fourth feature which is one of a standard deviation and a variance value;
    • a fifth feature based on a sum of squares; and
    • a sixth feature based on a frequency of sign changes.
[0013] The first feature may be a mean average. The first feature may be a median average. The first feature may be a mode average.
[0014] The second feature may be a minimum value. The second feature may be a 5th centile value. The second feature may be a 10th centile value. The second feature may be a 15th centile value. The second feature may be a 20th centile value. The second feature may be a 25th centile (or "lower quartile") value.
[0015] The third feature may be a maximum value. The third feature may be a 95th centile value. The third feature may be a 90th centile value. The third feature may be an 85th centile value. The third feature may be an 80th centile value. The third feature may be a 75th centile (or "upper quartile") value.
[0016] The fifth feature may be a square root of the sum of squares.
[0017] The sixth feature may be a count of a total number of sign changes.
[0018] The plurality of features may be calculated separately for each of the N axes of the accelerometer data.
[0019] The plurality of features may be calculated once for all N axes of the accelerometer data. When the plurality of features are calculated once for all N axes of the accelerometer data, the N-axis accelerometer data may be converted to a single magnitude for each time within the measurement period before calculating the plurality of features. The conversion to a single magnitude may be conducted by obtaining the vector magnitude at each time, followed by subtracting a value corresponding to gravity from the vector magnitude.
[0020] When one or more of the plurality of features requires sign information, for example the sign change count, such features may still be calculated using one or more axes of the N-axis accelerometer data. Alternatively, in a case where the N-axis accelerometer data includes all three translational degrees of freedom and all three rotational degrees of freedom, the plurality of features may be calculated once for all three translational degrees of freedom and once for all three rotational degrees of freedom.
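The magnitude conversion and the six per-window features described above can be sketched as follows. This is a minimal NumPy illustration, not the patented implementation; the function names, the choice of the 5th/95th centiles for the bound features, and the gravity constant are our own assumptions:

```python
import numpy as np

G = 9.81  # assumed value corresponding to gravity, in m/s^2


def to_magnitude(samples: np.ndarray) -> np.ndarray:
    """Convert (T, N) accelerometer samples to a single magnitude per time step,
    then subtract a value corresponding to gravity from each vector magnitude."""
    return np.linalg.norm(samples, axis=1) - G


def window_features(x: np.ndarray) -> np.ndarray:
    """Compute the six features described above for one measurement period."""
    sign_changes = np.count_nonzero(np.diff(np.sign(x)) != 0)
    return np.array([
        np.mean(x),               # first feature: average value
        np.percentile(x, 5),      # second feature: lower bound (5th centile)
        np.percentile(x, 95),     # third feature: upper bound (95th centile)
        np.std(x),                # fourth feature: standard deviation
        np.sqrt(np.sum(x ** 2)),  # fifth feature: square root of sum of squares
        sign_changes,             # sixth feature: count of sign changes
    ])
```

When the features are calculated separately for each of the N axes, `window_features` would simply be applied per column instead of to the converted magnitude.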
[0021] In other examples, the N-axis accelerometer data may take the form of attitude and/or gravity data.
[0022] The data model may be configured to classify voluntary limb movements into one of two or more pre-defined categories of voluntary limb movement.
[0023] The two or more pre-defined categories of voluntary limb movement may include non-repetitive (or "one-off") voluntary limb movements. Examples of non-repetitive arm movements may include reaching for and/or retrieving an item, putting on or taking off clothing, and so forth. The two or more pre-defined categories of voluntary limb movement may include continuous or repetitive voluntary limb movements. Examples of repetitive arm movements may include brushing teeth, cleaning (scrubbing), exercising and so forth.
[0024] The data model may be configured to classify non-voluntary limb movements into one of two or more pre-defined categories of non-voluntary limb movement. The pre-defined categories of non-voluntary limb movement may include, without being limited to, "transport", "walking", "non-limb muscle movements", tremors, spasms, tics, and "other".
[0025] Non-limb muscle movement may refer to movement of the limb that is not initiated or performed purposefully, or voluntarily, by the individual. This may include, for example, movement of a limb reflexively or autonomically reacting to the motion of a vehicle whilst in transport (for example, maintaining balance on a moving bus), muscle tremors, the non-swing of a restrained arm whilst walking, the motion caused by breathing whilst at rest, and so forth.
[0026] The data model may include, or take the form of, a trained machine learning model. The trained machine learning model may include, or take the form of, a multi-layer perceptron (MLP). The trained machine learning model may include, or take the form of, a convolutional neural network (CNN). The trained machine learning model may include, or take the form of, a support vector machine (SVM). The trained machine learning model may include, or take the form of, a transformer model. The trained machine learning model may include, or take the form of, a decision tree. The trained machine learning model may include, or take the form of, a hidden Markov model (HMM). The trained machine learning model may include, or take the form of, a long short-term memory (LSTM) model.
[0027] The data model need not include a machine learning model. The data model may take the form of a clustering analysis, a rules-based approach and so forth.
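As an illustration of a non-machine-learning data model of the kind just mentioned, a simple rules-based classifier over the six features might look like the following. The thresholds and the heuristic itself are invented for illustration only and are not taken from the patent:

```python
def classify_rules(features, rss_threshold=0.5, sign_change_threshold=10):
    """Toy rules-based classifier: label a window 'voluntary' when the
    root-sum-of-squares energy is high and the signal changes sign often
    (suggesting deliberate, oscillatory movement); otherwise 'non-voluntary'.
    Thresholds are illustrative placeholders, not values from the patent."""
    mean, lower, upper, std, rss, sign_changes = features
    if rss > rss_threshold and sign_changes > sign_change_threshold:
        return "voluntary"
    return "non-voluntary"
```

In practice such thresholds would need calibrating against labelled data, which is exactly the gap the trained models of the second and third aspects address.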
[0028] Receiving N-axis accelerometer data may include, or take the form of, obtaining the N-axis accelerometer data from a wearable device arranged on at least one limb of the subject.
[0029] The method may be carried out for each of a sequence of two or more measurement periods. Each measurement period may have an identical duration. The two or more measurement periods may have variable durations.
[0030] Each measurement period may overlap with one or more previous and/or one or more subsequent measurement periods. In other words, each measurement period may correspond to a moving window, which may be shifted by a sampling interval Δts between measurement periods. The sampling interval Δts may be less than the duration Δtm of the measurement period.
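The moving-window segmentation described above can be sketched as follows, with the window length playing the role of the duration Δtm and the step playing the role of the sampling interval Δts (both expressed in samples; the function name is our own):

```python
def sliding_windows(samples, window_len, step):
    """Yield overlapping measurement periods of window_len samples,
    each advanced by step samples; step < window_len gives overlap."""
    for start in range(0, len(samples) - window_len + 1, step):
        yield samples[start:start + window_len]
```

For example, ten samples split into windows of four with a step of two yields four overlapping measurement periods.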
[0031] The method may also include determining and storing a total period corresponding to voluntary limb movements. The method may also include determining and storing a total period corresponding to limb movements. The method may also include determining and storing a total period corresponding to non-voluntary limb movements.
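One way the total periods just described could be accumulated from the stored per-window classifications is sketched below. Crediting each classified window with the interval by which windows advance is a simple accounting choice for overlapping windows; the patent does not prescribe how overlap is counted:

```python
def total_period(classifications, step_seconds, target="voluntary"):
    """Total time attributed to one class of movement: the number of
    measurement periods given that classification, multiplied by the
    interval (in seconds) each new period advances."""
    return sum(1 for c in classifications if c == target) * step_seconds
```

The same function gives the non-voluntary total with `target="non-voluntary"`, and the overall movement total by summing both.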
[0032] The limb may be an arm of the subject. The limb may be a leg of the subject.
[0033] The value of N may be greater than or equal to three.
[0034] According to a second aspect of the invention there is provided a method of training a machine learning model to function as the data model for a method according to the first aspect. The method includes receiving a first set of labelled data comprising N-axis accelerometer data corresponding to a number of first measurement periods corresponding to voluntary limb movement. The method also includes receiving a second set of labelled data comprising N-axis accelerometer data corresponding to a plurality of second measurement periods corresponding to non-voluntary limb movement. The method also includes training the machine learning model in a supervised learning process using the first and second sets of labelled data as ground truth.
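The supervised training of the second aspect can be sketched as follows. Plain logistic regression (trained by gradient descent in NumPy) stands in here for whichever model is actually chosen (MLP, SVM, decision tree, ...); the function names and hyperparameters are our own:

```python
import numpy as np


def train_movement_classifier(voluntary_X, non_voluntary_X, lr=0.1, epochs=500):
    """Supervised training on two labelled sets of feature vectors, using the
    labels (1 = voluntary, 0 = non-voluntary) as ground truth."""
    X = np.vstack([voluntary_X, non_voluntary_X])
    y = np.concatenate([np.ones(len(voluntary_X)), np.zeros(len(non_voluntary_X))])
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)        # gradient step on weights
        b -= lr * np.mean(p - y)                # gradient step on bias
    return w, b


def predict_voluntary(w, b, X):
    """Return 1 where a window is classified as voluntary, else 0."""
    return (1.0 / (1.0 + np.exp(-(X @ w + b))) >= 0.5).astype(int)
```

The two labelled sets would in practice be feature vectors computed from the first and second sets of measurement periods respectively.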
[0035] The method of the second aspect may include features corresponding to any features of the method of the first aspect. Definitions applicable to the method of the first aspect (or features thereof) may be equally applicable to the method of the second aspect (or features thereof).
[0036] According to a third aspect of the invention, there is provided a method of training a machine learning model to function as the data model for a method according to the first aspect. The method includes receiving a first, pre-trained machine learning model trained to classify accelerometer data corresponding to at least one limb of the subject and a measurement period as a voluntary limb movement or a non-voluntary limb movement of the subject. The method also includes receiving a plurality of labelled measurement periods. Each labelled measurement period corresponds to N-axis accelerometer data corresponding to at least one limb of the subject and a measurement period. Each labelled measurement period is associated with a label input by the subject and indicating a voluntary limb movement or a non-voluntary limb movement. The method also includes training a second, fine-tuning machine learning model to map classifications of the first machine learning model to final classifications in a supervised learning process using the labelled measurement periods as ground truth.
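The fine-tuning step of the third aspect can be sketched as follows. A majority-vote lookup table is used here as the simplest possible "second model" mapping base classifications to final classifications; the patent leaves the fine-tuning model's actual form open, and the function names are our own:

```python
from collections import Counter, defaultdict


def fit_fine_tuner(base_classifications, subject_labels):
    """Learn a mapping from the pre-trained model's classifications to final
    classifications, using the subject-entered labels as ground truth: for
    each base class, pick the label the subject most often assigned."""
    votes = defaultdict(Counter)
    for predicted, labelled in zip(base_classifications, subject_labels):
        votes[predicted][labelled] += 1
    return {pred: counts.most_common(1)[0][0] for pred, counts in votes.items()}


def apply_fine_tuner(mapping, base_classification):
    """Map one base classification to its final classification (identity
    where no labelled examples were seen for that base class)."""
    return mapping.get(base_classification, base_classification)
```

This captures the idea of personalising a generic pre-trained classifier to one subject without retraining it; a richer fine-tuning model could condition on the feature vectors as well as the base classification.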
[0037] The first, pre-trained machine learning model may have been trained according to the method of the second aspect.
[0038] The method of the third aspect may include features corresponding to any features of the methods of the first and/or second aspects. Definitions applicable to the method of the first and/or second aspects (or features thereof) may be equally applicable to the method of the third aspect (or features thereof).
[0039] According to a fourth aspect of the invention, there is provided apparatus configured for securing to at least one limb of a subject and comprising an accelerometer. The apparatus is also configured to obtain N-axis accelerometer data corresponding to a measurement period from the accelerometer. The value of N is greater than or equal to two. The apparatus is also configured to apply a data model to classify the accelerometer data as corresponding to a voluntary limb movement or a non-voluntary limb movement of the subject. The apparatus is also configured to store the classification of the measurement period.
[0040] The apparatus of the fourth aspect may include features corresponding to any features of the method(s) of the first, second and/or third aspect(s). Definitions applicable to the method(s) of the first, second and/or third aspect(s) (or features thereof) may be equally applicable to the apparatus of the fourth aspect (or features thereof).
[0041] According to a fifth aspect of the invention, there is provided a system including an apparatus configured for securing to at least one limb of a subject and comprising an accelerometer. The system also includes a data processing device communicatively coupled to the apparatus. The system is configured to obtain N-axis accelerometer data corresponding to a measurement period from the accelerometer. The value of N is greater than or equal to two. The system is also configured to apply a data model to classify the accelerometer data as corresponding to a voluntary limb movement or a non-voluntary limb movement of the subject. The system is also configured to store the classification of the measurement period.
[0042] The system of the fifth aspect may include features corresponding to any features of the method(s) of the first, second and/or third aspect(s) and/or of the apparatus of the fourth aspect. Definitions applicable to the method(s) of the first, second and/or third aspect(s) (or features thereof) and/or of the apparatus of the fourth aspect (or features thereof) may be equally applicable to the system of the fifth aspect (or features thereof).
[0043] The data processing device may include, or take the form of, a mobile phone, smart-phone, tablet computer or similar device. The data processing device may include, or take the form of, a dedicated controller unit including one or more digital electronic processors, volatile storage (memory) coupled to the one or more digital electronic processors, and non-volatile storage storing program code which, when executed, causes the one or more digital electronic processors to carry out the functions of the data processing device. The wearable device may take any form described in relation to the data processing device. The intermediate device may take any form described in relation to the data processing device.
[0044] The system may be configured to apply the data model to classify the accelerometer data by determining a plurality of features of the accelerometer data, and applying the data model to the plurality of features.
[0045] The apparatus may be configured to determine the plurality of features of the accelerometer data. The data processing device may be configured to determine the plurality of features of the accelerometer data. The apparatus may be configured to apply the data model to the plurality of features. The data processing device may be configured to apply the data model to the plurality of features.
[0046] The data processing device may be communicatively coupled to the apparatus using a short-range wireless communications protocol. The short-range wireless communications protocol may take the form of Bluetooth (RTM). The short-range wireless communications protocol may take the form of near-field communication (NFC). The short-range wireless communications protocol may take the form of low-power wide-area-network (LPWAN). The short-range wireless communications protocol may take the form of wi-fi according to the IEEE 802.11 standard.
[0047] References to standards refer specifically to the versions in force in November 2023, and communicative coupling of the apparatus and data processing device may preferably be back-compatible with one or more previous versions of any standard used.
[0048] Brief description of the drawings
[0049] Certain embodiments of the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:
Figure 1 is a schematic block diagram of an exemplary system;
Figure 2 is a process flow diagram for a method of measuring voluntary limb movements of a subject;
Figure 3 schematically illustrates measurement of overlapping measurement periods;
Figure 4 is a schematic block diagram illustrating training of a machine learning model, and the connections to the method shown in Figure 2;
Figure 5 is a schematic block diagram of a multi-layer perceptron model for use in the method shown in Figure 2;
Figure 6A plots an example of experimentally measured accelerometer data;
Figure 6B plots the data of Figure 6A after conversion to single magnitude data;
Figure 6C plots an example of single magnitude data corresponding to transport in an automobile, measured from the arm of a subject;
Figure 6D plots an example of single magnitude data corresponding to non-voluntary but natural arm movements when walking;
Figure 6E plots an example of single magnitude data corresponding to an arm which is restrained whilst walking (for example held in a sling); and
Figure 6F plots an example of single magnitude data corresponding to measurements taken from the arm of a subject during a non-limb movement.
[0050] Detailed description of certain embodiments
[0051] In the following, like parts shall be denoted by like reference numerals.
[0052] The methods, devices and systems described in this specification have been developed with the intention to improve, support and enhance rehabilitation through physical movement associated with daily activities. Many of the improvements described herein have been developed based on the inventors' experiences working closely with patients (including for example stroke survivors, breast cancer survivors, etc.) and clinicians. The methods, devices and systems described in this specification utilise smart devices, whether generic devices executing suitable software or bespoke, customised devices, which allow at-home feedback on activity and progress, and remote coaching to help manage rehabilitation.
[0053] As an overview example, an exemplary system termed the "Activity Monitoring Rehab System" has three main components:
    • Activity tracking using wearable technology, such as a smart watch, carrying out a purpose-built monitoring and classification method which is described herein. The monitoring and classification method may be packaged as a mobile application (or more colloquially an "app") which provides real-time data on gross affected limb activity (for example arm activity);
    • A self-management coaching aspect including automated personalised feedback on physical activity, personalised advice, goal management, educational material, and clinician remote coaching;
    • A clinician-facing software platform for providing remote access to patients' progress and for personalising care experiences depending on patients' needs and ability.
[0054] The subject of the present specification and the appended claims is the method of monitoring and classifying limb movements described hereinafter, as well as methods of training an appropriate data model and apparatuses and systems for implementing such methods. The methods, devices and systems described herein are not limited to application/deployment in the context of the "Activity Monitoring Rehab" system outlined hereinbefore, and this is presented as an example to aid understanding of the present invention.
[0055] Referring to Figure 1, a block diagram of an exemplary system 1 is shown.
[0056] The system 1 is suitable for implementing a method of measuring voluntary limb movements of a subject. The system 1 includes a wearable device 2 and one or more client(s) 3 communicatively coupled to a data processing device 4 via one or more networks 5.
[0057] The wearable device 2 is to be worn or otherwise attached (for example using a strap) to a limb (not shown) of a user which is to be monitored, for example an arm or leg for which rehabilitation/recovery exercises need to be performed. The wearable device 2 includes an N-axis accelerometer 6 which is used to obtain raw data of movement of the monitored limb, where N is greater than or equal to 2, i.e. N ≥ 2. For example, the accelerometer 6 may be 3-axis and may measure translational accelerations ax, ay, az along three mutually orthogonal local axes, generally referred to as x-, y-, and z-axes. In other examples, the accelerometer 6 may be a complete 6-axis sensor, recording both translational ax, ay, az and rotational rx, ry, rz accelerations, where rx denotes rotational acceleration about the local x-axis of the accelerometer 6 and so forth. In other words, the accelerometer may include an orientation sensor/gyro. The accelerometer 6 may output raw data in the form of accelerations and/or angular accelerations, or related quantities from which such information is available (for example by differentiation and/or scaling). The accelerometer 6 may be of any suitable type, although compact, chip-based accelerometers 6 are preferable.
[0058] The wearable device 2 also includes one or more first digital electronic processors 7 (or simply "first processors" hereinafter), first random-access memory 8, first nonvolatile storage 9 storing first program code 10 and a first network interface 11, all connected to one another and the accelerometer 6 by a first bus 12. When executed by the one or more first processors 7, the first program code 10 causes the wearable device 2 to carry out the functions and steps of the method as described hereinafter in relation to Figure 2. Other necessary components, such as a battery for powering the wearable device 2, are not shown in Figure 1 for simplicity of illustration.
[0059] The wearable device 2 may therefore be relatively simple, and could take the form of a lightweight, simple and cheap to produce bespoke device. For example, the wearable device 2 need not include extensive computing power, and the first processor(s) 7, memory 8 and storage 9 may be provided by a microcontroller or similar device. In such an example, the network interface 11 may be provided by a short range wireless communication module such as a Bluetooth (RTM) interface (which could be integral to a microcontroller used).
[0060] Alternatively, in some implementations the wearable device 2 may instead take the form of a "smart" watch incorporating further features such as a display 13 and one or more user interface devices 14, for example buttons and/or a combined display 13 and user interface device 14 such as a touchscreen.
[0061] A wearable device 2 having greater computational processing power may locally implement a larger fraction, or even all, of the steps of the monitoring method described hereinafter in relation to Figure 2, and in such cases the first storage 9 may also store the data model 15 used for classifying movement. In such an example, the outputs of the monitoring method may be stored in the first storage 9 as a local cache 16, which may be accessed and/or uploaded to the data processing device 4 periodically or when access is available via the network(s) 5.
[0062] The network(s) 5 may include any combination of wired and/or wireless networks, which may include local area networks, wide area networks, mobile communications networks (3G, 4G, 5G and so forth), the Internet and so forth. Connection of the wearable device 2 to the data processing device 4 may be via an intermediate device 17. For example, a wearable device 2 may communicate raw or processed data from the accelerometer to an intermediate device 17 in the form of a mobile or smart phone using a short-range wireless protocol such as Bluetooth (RTM), low-power wide-area-network (LPWAN), wi-fi according to the IEEE 802.11 standard, and so forth. The intermediate device 17 in the form of a mobile/smart phone may then relay this to the data processing device 4 using networks 5 including a mobile communications network (for example 4G or 5G) and the internet.
[0063] In some implementations, if an intermediate device 17 such as a mobile/smart phone is used, or any other device of comparable computing power, then some or all of the data processing steps of the method described hereinafter in relation to Figure 2 may be executed by one or more processor(s) (not shown) of the intermediate device 17.
[0064] The data processing device 4 includes a network interface 18 for communicating with the network(s) 5, one or more second digital electronic processors 19 (or simply "second processors" hereinafter), second random-access memory 20 and second nonvolatile storage 21, interconnected by a second bus 22. The second storage 21 stores second program code 23, the data model 15 used for classifying movements and one or more user data caches 24. The second program code 23, when executed by the one or more second processors 19, causes the data processing device 4 to execute the functions and steps of the method described herein. Each user data cache 24 corresponds to a specific user, and keeps a record of recorded and classified movements for that user. User data caches 24 are preferably encrypted, and may additionally be anonymised for some purposes. The data processing device 4 may take the form of a server. The data processing device 4 may optionally include a display (not shown) and one or more user interface devices (not shown) to allow direct access, though more commonly access will be conducted via the client(s) 3.
[0065] The data processing device 4 may include, or take the form of, a mobile phone, smart-phone, tablet computer, laptop computer, desktop computer, server, or any suitable device including the necessary elements.
[0066] The one or more client(s) 3 provide access to doctors, physical therapists and similar clinicians/practitioners who are managing the physical therapy of a user. The client(s) 3 may access user data caches 24 for which they are authorised, for example the patients they are responsible for, to review and analyse the classified movement data stored therein. The client(s) 3 may also be used to set and/or adjust exercises and targets via the data processing device 4, which may then be communicated to the wearable device 2 (and/or the intermediate device 17) via the network(s) 5. The one or more client(s) 3 may also include the user(s) of the system 1, who may wish to access the data processing device 4, for example to review and analyse their own statistics, or to set/adjust their own personal goals and targets. Each client 3 may take any suitable form such as, for example, a web-application or a mobile/smart phone "app" executed on any suitable type of computing device such as, without limitation, a desktop computer, a laptop computer, a tablet computer, a smartphone, and so forth.
[0067] The clients 3 may, but need not, take the form of separate devices. The functions of the client(s) 3 described hereinbefore may additionally or alternatively be provided by one or more of the wearable device 2, the data processing device 4 and/or the intermediate device 17. In other words, any one of the wearable device 2, the data processing device 4 and/or the intermediate device 17 may function as a client 3 in addition to their other functions.
[0068] Method of measuring voluntary limb movements of a subject
Referring also to Figure 2, a process flow diagram is shown for a method of measuring voluntary limb movements of a subject.
[0069] The method may be implemented using the exemplary system 1 shown in Figure 1, but this is not essential, and the method may be implemented using any system capable of providing the required accelerometer measurements and sufficient computing power to carry out the steps described herein.
[0070] Once the monitoring is started, N-axis accelerometer data is received (step S1). The N-axis accelerometer data corresponds to at least one limb of the subject (or "user"), and spans a measurement period Δtm (Figure 3). The number N is greater than or equal to two, i.e. N ≥ 2. The N-axis accelerometer data may be obtained using any suitable device coupled to the limb of the subject, for example using the accelerometer 6 of the wearable device 2 of the exemplary system 1, with the wearable device 2 arranged on a limb of the subject. The limb may be an arm or a leg of the subject. In some cases, two or more (even all) limbs of the subject may be monitored using separate wearable devices 2.
[0071] As a specific example, the case of a 3-axis accelerometer 6 which outputs an acceleration vector a(t) = (ax(t), ay(t), az(t)) having components ax, ay, az along x-, y- and z-axes at time t shall be considered. If measurements are obtained at a sampling interval δt, then the accelerometer data corresponding to a measurement period Δtm starting at a time t0 may be denoted {a(t0), a(t0+δt), ..., a(t0+Δtm−δt), a(t0+Δtm)}, or in a more compact notation, as {a0, a1, ..., ak, ..., aK}, with a0 = a(t0), a1 = a(t0+δt), ak = a(t0+k·δt) for 1 ≤ k ≤ K and K·δt = Δtm (K is a positive integer).
[0072] Although herein we mainly refer to an example in which N = 3, in other examples the value of N may take any other value greater than or equal to two. Although herein we mainly refer to an example in which all the monitored accelerations are translational along x-, y- and z-axes, in other examples mixtures of translational and rotational accelerations may be used.
[0073] The received accelerometer data is processed to determine/extract a number M of features (step S3). This step, also referred to as "feature extraction", may be carried out by the first processor(s) 7 of the wearable device 2. Alternatively, the raw accelerometer data may be transmitted to the data processing device 4 (or the intermediate device 17) which may carry out the feature extraction process (step S3).
[0074] The inventors have found that one useful set of features includes:
1. a first feature, F1, indicative of an average value of the accelerometer data;
2. a second feature, F2, indicative of a lower bound value of the accelerometer data;
3. a third feature, F3, indicative of an upper bound value of the accelerometer data;
4. a fourth feature, F4, which is one of a standard deviation of the accelerometer data and a variance value of the accelerometer data;
5. a fifth feature, F5, based on a sum of squares of the accelerometer data; and
6. a sixth feature, F6, based on a frequency of sign changes of the accelerometer data.
[0075] In general, the number and type of features F1, ..., FM may be varied, and may be more or fewer than M = 6 and/or may be of different types to those listed above.
[0076] For example, at a minimum the first F1 and fourth F4 features listed above may be used to obtain reasonable classifications. However, the inventors have observed that using fewer than the listed features F1, ..., F6 may lead to reduced classification accuracy (though still acceptable for some applications), whilst using more than the listed features F1, ..., F6 may provide diminishing returns in terms of the improvement in accuracy compared to the increase in required computing resources. In general, the selection of features F1, ..., FM should aim to maximise the independence of each feature Fm from every other feature. In other words, features Fm should be selected which represent differentiable characteristics of the accelerometer data.
[0077] For example, the first feature F1 may take the form of a mean average, a median average, or a mode average, or any other suitable metric which may be computed to indicate the average value of the accelerometer data.
[0078] The second feature, F2, may take the form of a minimum value, a centile value such as the 5th, 10th, 15th, 20th or 25th centile value (the 25th centile value is sometimes referred to as the "lower quartile" value), or any other suitable metric which may be computed to indicate the lower bound value of the accelerometer data. Similarly, the third feature, F3, may take the form of a maximum value, a centile value such as the 95th, 90th, 85th, 80th or 75th (or "upper quartile") centile value, or any other suitable metric which may be computed to indicate the upper bound value of the accelerometer data.
[0079] The fifth feature, F5, may take the form of the sum of squares of the accelerometer data, the square root of the sum of squares of the accelerometer data, or any other suitable metric based on a sum of squares of the accelerometer data.
[0080] The sixth feature, F6, may take the form of a count of a total number of sign changes, an average time between sign changes, or any other metric which may be computed to be indicative of the frequency of sign changes of the accelerometer data.
[0081] In one example, the features determined in the feature extraction process (step S3) are the set: F1 = mean average; F2 = minimum value; F3 = maximum value; F4 = standard deviation; F5 = root of the sum of squares; and F6 = sign change count.
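By way of illustration only (this sketch is not part of the patent disclosure), the example feature set above might be computed from a one-dimensional signal as follows in Python; the function name and the treatment of exact zeros in the sign-change count are assumptions:

```python
import statistics

def extract_features(signal):
    """Compute the six illustrative features F1..F6 from a 1-D signal.

    F1: mean average            F2: minimum value
    F3: maximum value           F4: standard deviation (population)
    F5: root of the sum of squares
    F6: count of sign changes (exact zeros ignored, an assumption)
    """
    f1 = statistics.mean(signal)
    f2 = min(signal)
    f3 = max(signal)
    f4 = statistics.pstdev(signal)
    f5 = sum(v * v for v in signal) ** 0.5
    # Count transitions between positive and negative values.
    signs = [v > 0 for v in signal if v != 0]
    f6 = sum(1 for a, b in zip(signs, signs[1:]) if a != b)
    return [f1, f2, f3, f4, f5, f6]
```

Whether the standard deviation is taken over the population or a sample, and how zeros are handled in the sign-change count, are implementation choices the specification leaves open.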
[0082] The features Fm may be calculated separately for each of the N axes of the accelerometer data. For the case of three-axis data a(t) = (ax(t), ay(t), az(t)) described hereinbefore, this would result in three values of the first feature F1, one for each axis, i.e. F1x = avg(ax), F1y = avg(ay) and F1z = avg(az), and similarly for each other feature F2, ..., F6. However, in practice, treating the local x-, y- and z-axes of the accelerometer 6 individually is not meaningful unless the device has been at least roughly aligned and calibrated relative to the user's limb. Fortunately, the inventors have discovered, somewhat surprisingly, that accurate classification may be realised when each feature Fm is calculated once for all N axes of the accelerometer data, i.e. the N axes of data are aggregated to a single magnitude for each time in the measurement period Δtm, then the features Fm are computed using the single magnitudes.
[0083] When the features Fm are calculated once for all N axes of the accelerometer data, the N-axis accelerometer data is first converted to a single magnitude |a(t)| for each time t within the measurement period Δtm before calculating the features F1, ..., FM. The conversion to a single magnitude is conducted by treating the N-axis accelerometer data as an N-dimensional vector, then obtaining the vector magnitude at each time t, followed by subtracting a value corresponding to the acceleration due to gravity from the resulting vector magnitude. When implemented, the determination of single magnitude data is a part of (a pre-processing step of) the feature extraction (step S3) process.
[0084] Returning to the example of three-axis accelerometer data {a0, a1, ..., ak, ..., aK} defined hereinbefore, the single magnitude data would be obtained as {|a0|, |a1|, ..., |ak|, ..., |aK|}, where for example:
|ak| = √( ax(t0 + k·δt)² + ay(t0 + k·δt)² + az(t0 + k·δt)² ) − g ... (1)
in which g is the acceleration due to gravity. The features Fm are then calculated using the single magnitude data {|a0|, ..., |ak|, ..., |aK|} as input. In other examples, the N-axis accelerometer data may take the form of attitude and/or gravity data, and so forth.
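A minimal sketch of the magnitude conversion of equation (1), assuming three-axis samples expressed in m/s² and g = 9.81 m/s²; the function name and the constant are illustrative, not taken from the patent:

```python
import math

G = 9.81  # assumed gravitational acceleration in m/s^2

def to_single_magnitude(samples):
    """Convert N-axis accelerometer samples (tuples of axis components)
    to one signed magnitude per sample: the Euclidean vector magnitude
    minus gravity, per equation (1). A stationary device therefore
    yields values near zero, and sign information is preserved."""
    return [math.sqrt(sum(c * c for c in s)) - G for s in samples]
```

Because g is subtracted, a device at rest (reading roughly (0, 0, 9.81)) maps to approximately zero, which is what makes the sign-change feature F6 meaningful on the magnitude data.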
[0085] When one or more of the plurality of features requires sign information, for example feature F6 defined hereinbefore as the sign change count, such features may be calculated using the single magnitude data, for example {|a0|, ..., |ak|, ..., |aK|}, due to the subtraction of the acceleration due to gravity. Alternatively, a feature such as F6 which requires sign information may instead be calculated using data for one or more individual axes of the N-axis accelerometer data.
[0086] Referring back to Figure 2, following feature extraction (step S3), the data model 15 is applied to the extracted features F1, ..., FM (step S4) to classify the accelerometer data as corresponding to a voluntary limb movement or a non-voluntary limb movement of the subject.
[0087] Whilst for many applications a classification into voluntary or non-voluntary movement will be sufficient, in some applications more detail may be desired. For example, the data model 15 may be further configured to sub-classify voluntary limb movements into one of two or more pre-defined categories of voluntary limb movement such as, for example, non-repetitive (or "one-off") voluntary limb movements or continuous or repetitive voluntary limb movements. Examples of non-repetitive limb movements may include, for example, reaching for and/or retrieving an item, putting on or taking off clothing and so forth, and the data model 15 may be further configured to categorise non-repetitive voluntary limb movements into such specific actions. Examples of repetitive arm movements may include brushing teeth, cleaning (scrubbing), exercising and so forth, and the data model 15 may be further configured to categorise repetitive voluntary limb movements into such specific actions.
[0088] In some applications, further stratification of non-voluntary movements may be desired. For example, the data model 15 may be further configured to sub-classify non-voluntary limb movements into one of two or more pre-defined categories of non-voluntary limb movement. Pre-defined categories of non-voluntary limb movement may include, without being limited to, "transport", "walking" (for example arm movement when walking), "non-limb muscle movements", tremors, spasms, tics, and "other". Herein, the term "non-limb muscle movement" refers to movement of the monitored limb of a user that is not initiated or performed purposefully, or voluntarily, by the individual. This may include, for example, movement of a limb reflexively or autonomically reacting to the motion of a vehicle whilst in transport (for example, maintaining balance on a moving bus), muscle tremors, the non-swing of a restrained arm whilst walking, the motion caused by breathing whilst at rest, and so forth.
[0089] The data model 15 applied (step S4) to the features FM may be of any suitable type for classifying data based on an input vector. The data model 15 is preferably a trained machine learning model 25 (Figures 4 and 5). The trained machine learning model 25 may include, or take the form of, a convolutional neural network (CNN), a multi-layer perceptron, a support vector machine (SVM), a transformer model, a decision tree, a hidden Markov model (HMM), a long short-term memory (LSTM) model, or any other machine learning model suitable for application to a classification task. An example of training a data model 15 in the form of a trained machine learning model 25 is described in relation to Figure 4.
[0090] The application of the data model (step S4) may be carried out by the data processing device 4, based on features F1, ..., FM received from the wearable device 2 (as illustrated in Figure 1). Alternatively, the application of the data model (step S4) may be carried out by the intermediate device 17 when used. If the wearable device 2 includes sufficient computing capacity, the application of the data model (step S4) may even be carried out locally by the wearable device 2.
[0091] Referring again to Figure 2, following the application of the data model (step S4) to classify the accelerometer data for the most recent measurement period Δtm, the classification of the accelerometer data for that measurement period Δtm as corresponding to a voluntary limb movement or a non-voluntary limb movement of the subject (and/or any sub-classifications thereof) is stored (step S5).
[0092] Based on the classification of the accelerometer data for the most recent measurement period Δtm, total durations of voluntary limb movements and/or non-voluntary limb movements (and/or sub-classifications of either) may be updated. Total durations may be reset in response to a command, or in response to a schedule. For example, total durations may be tracked daily, weekly and/or monthly.
[0093] When implemented using the system 1, the classification is stored (step S5) at least in the user data cache 24 corresponding to the user. However, when the data model 15 is applied (step S4) on the wearable device 2, the classification may also be stored (step S5) in the local cache 16 of the wearable device 2. Additionally, if the intermediate device 17 is used, a copy of the classification may optionally be stored in a further local cache (not shown) stored on the intermediate device 17.
[0094] If monitoring is to be continued (step S7|Yes), then accelerometer data is received for the next measurement period Δtm (step S1) and the analysis and classification are repeated (steps S3 through S5).
[0095] In some implementations, the measurement periods Δtm may be consecutive, in other words one after the previous without overlap. For example, a first measurement period may span t1 ≤ t < t2 where t2 − t1 = Δtm, and the next measurement period would then be t2 ≤ t < t3 where t3 − t2 = Δtm, and so forth.
[0096] However, referring also to Figure 3, the measurement periods Δtm may be overlapped as illustrated, for example using conventional buffering techniques. For example, a first measurement period t0→t3 runs from t0 to t3 and is stored in a first buffer Buff1. Before the end of the first measurement period t0→t3, at time t1, a second measurement period t1→t4 starts to be recorded to a second buffer Buff2. At time t2, a third measurement period t2→t5 starts to be recorded to a third buffer Buff3.
[0097] When the first measurement period t0→t3 finishes at time t3, the first buffer is processed using the method of classification described herein (and shown in Figure 2). Meanwhile, a fourth measurement period t3→t6 starts to be recorded to a fourth buffer Buff4, allowing time for processing of the first buffer Buff1. In the illustrated example, the processing of the first buffer Buff1 is shown as being rapid enough that the first buffer Buff1 is available for re-use by the start of the fifth measurement period t4→t7, but depending on the time required for processing by specific hardware, additional buffers may be readily added to maintain uninterrupted processing.
[0098] In this way, the spacing between classification outputs may be reduced to a classification sampling interval Δts which is less than the measurement period Δtm. Of course, when overlapped measurement periods Δtm are used, the total duration of voluntary and/or non-voluntary limb movement (and/or sub-classifications thereof) should be adjusted to avoid double-counting of overlapping periods. For example, using the timings illustrated in Figure 3, if the first and second measurement periods are both classified as voluntary movement, then the full measurement period Δtm should be recorded corresponding to the first measurement period, but only t4 − t3 = Δtm/3 = Δts should be added to the accumulated total period in respect of the second measurement period. If a kth measurement period Δtm is classified as voluntary limb movement and the (k+1)th measurement period Δtm is classified as non-voluntary limb movement, the transition between voluntary and non-voluntary limb movement may be assigned as the end of the kth measurement period Δtm (though other approximations of the end time may be used).
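The double-counting adjustment described above can be sketched as follows (illustrative only; the function name and arguments are assumptions, with one boolean classification per interval Δts, each covering a window of length Δtm):

```python
def accumulate_voluntary(classifications, dt_m=5.0, dt_s=1.0):
    """Total voluntary-movement time from overlapping windows.

    `classifications` is a chronological list of booleans (True =
    voluntary), one per classification interval dt_s, each covering a
    window of duration dt_m. A voluntary window that follows a
    non-voluntary one contributes the full dt_m; each further
    consecutive voluntary window contributes only dt_s, so the
    overlapping portion is never counted twice.
    """
    total = 0.0
    prev = False
    for voluntary in classifications:
        if voluntary:
            total += dt_s if prev else dt_m
        prev = voluntary
    return total
```

With Δtm = 5 s and Δts = 1 s, two consecutive voluntary windows accumulate 5 + 1 = 6 s rather than 10 s, matching the worked example in the text.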
[0099] As examples of durations, a measurement period of Δtm = 5 s and a classification interval of Δts = 1 s have been confirmed to work by the inventors.
[0100] Although the preceding explanations have presumed for simplicity that the measurement period Δtm has a fixed duration, this is not essential. In some implementations variable measurement periods Δtm may be used, for example triggered to start and/or end based on applying a threshold amplitude to the accelerometer 6 output, for example a(t). In other words, a measurement period Δtm may be started in response to a measured magnitude of acceleration exceeding a triggering threshold, and may be terminated in response to a measured magnitude of acceleration dropping below an inactivity threshold (which may be the same as or different from the triggering threshold). This is only one example, and different triggers for starting and/or ending a measurement period Δtm may be used.
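A threshold-triggered segmentation of this kind might be sketched as follows; the threshold values, units and function name are purely illustrative assumptions:

```python
def segment_by_threshold(magnitudes, trigger=0.5, inactivity=0.2):
    """Split a stream of acceleration magnitudes into variable-length
    measurement periods. A period starts when the magnitude exceeds
    `trigger` and ends when it drops below `inactivity` (the two
    thresholds may be equal or different). Returns a list of
    (start_index, end_index) pairs, end exclusive."""
    periods = []
    start = None
    for i, m in enumerate(magnitudes):
        if start is None and m > trigger:
            start = i
        elif start is not None and m < inactivity:
            periods.append((start, i))
            start = None
    if start is not None:
        # Stream ended mid-period; close the open period at the end.
        periods.append((start, len(magnitudes)))
    return periods
```

Using a lower inactivity threshold than the triggering threshold (hysteresis) prevents rapid toggling when the signal hovers near a single threshold.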
[0101] Training the data model
Referring also to Figure 4, a schematic overview of both the training and runtime of the method of classifying limb movement as voluntary or non-voluntary is shown.
[0102] This corresponds to an implementation in which the data model 15 takes the form of a trained machine learning model 25.
[0103] There are several distinct phases, shown separated by horizontal chained lines, including a data input phase 26, a training phase 27, an evaluation phase 28, and a runtime phase 29. The runtime phase 29 corresponds to execution of the method of classification explained hereinbefore and illustrated in Figure 2, for example implemented using the system 1 shown in Figure 1.
[0104] The data input phase 26 includes receiving (or retrieving from storage) first and second sets of labelled data 30, 31. The first set of labelled data 30 takes the form of N-axis accelerometer data corresponding to a number of first measurement periods Δtm which are known to correspond to voluntary limb movement. The second set of labelled data 31 takes the form of N-axis accelerometer data corresponding to a number of second measurement periods Δtm which are known to correspond to non-voluntary limb movement.
[0105] The first set of labelled data 30 may optionally be sub-divided into two or more first subsets corresponding to different types of voluntary limb movement. This is only necessary if it is desired to train the machine learning model 25 to sub-classify the corresponding types of voluntary limb movement. Similarly, the second set of labelled data 31 may optionally be sub-divided into two or more second subsets corresponding to different types of non-voluntary limb movement. This is only necessary if it is desired to train the machine learning model 25 to sub-classify the corresponding types of non-voluntary limb movement.
[0106] In the example shown in Figure 4, the first set of labelled data 30 does not include any first subsets, whilst the second set of labelled data 31 is made up of three second subsets 32a, 32b, 32c corresponding respectively to categories of walking (non-voluntary movement of an arm when walking), transport (for example in a bus or car) and non-limb muscle movement (for example movement resulting from breathing).
[0107] It is possible, and sufficient for some applications, to simply train the machine learning model 25 directly using the first 30 and second 31 sets of labelled data. However, this may result in overtraining and reduced accuracy when applied to other datasets.
[0108] Preferably, the first 30 and second 31 sets of labelled data are divided by the same fraction G to form a training set 33 and an evaluation set 34. In this way, the training set 33 includes a fraction G of the first set of labelled data 30 and a fraction G of the second set of labelled data 31, whilst the complements (fraction 1-G) of the first 30 and second 31 sets of labelled data form the evaluation set 34. The value of G may typically be in the range between (inclusive of end-points) 0.5 and 0.8. The inventors have found that a value of 0.6 provides good performance.
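The per-set split by fraction G described above can be sketched as follows (the function name, shuffling and rounding behaviour are assumptions; the patent does not prescribe an implementation):

```python
import random

def split_labelled_data(first_set, second_set, g=0.6, seed=0):
    """Split voluntary (first_set) and non-voluntary (second_set)
    labelled examples so that the training set holds fraction g of
    EACH set and the evaluation set holds the complement (1 - g).
    Splitting each set separately preserves the class balance."""
    rng = random.Random(seed)
    train, evaluation = [], []
    for labelled in (first_set, second_set):
        shuffled = labelled[:]          # copy; leave caller's data intact
        rng.shuffle(shuffled)
        cut = round(len(shuffled) * g)
        train.extend(shuffled[:cut])
        evaluation.extend(shuffled[cut:])
    return train, evaluation
```

Splitting the two labelled sets by the same fraction, rather than pooling them before splitting, guarantees that both the training and evaluation sets contain voluntary and non-voluntary examples in the original proportions.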
[0109] Following the data input phase 26 is the training phase 27. The training phase 27 and the following evaluation phase 28 may be iterated multiple times.
[0110] In the training phase 27, a set of features F1, ..., FM is computed for each measurement period Δtm included in the training set 33 (step S8), referred to as "feature engineering". In some examples, a pre-set list of initial features F1, ..., FM and thresholds may be used, in which case the feature engineering (step S8) is effectively identical to the feature extraction (step S3) performed in the runtime phase 29.
[0111] However, in other examples, it is possible to also use the selection of the particular set of features F1, ..., FM, and/or thresholds involved in computation of such features, as hyperparameters which may be iterated during the training (see the dashed arrow return path from the evaluation at step S12|Fail). When the selection of features will be used as a hyperparameter, for the first iteration a pre-set list of initial features F1, ..., FM and thresholds may be used.
[0112] The machine learning model 25 is then trained for one, or preferably more, epochs (step S9). For example, a machine learning model 25 in the form of a multi-layer perceptron (MLP - see for example Figure 5) may be trained for about 1000 epochs, whilst a convolutional neural network (CNN) might be trained for about 10 epochs.
[0113] Each epoch corresponds to processing the entire training set 33, comparing the classifications made by the machine learning model 25 against the ground truth labels, and adjusting the machine learning model 25, for example in the case of a neural network type model, by back-propagation of errors.
[0114] Control then passes to the evaluation phase 28. Each measurement period Δtm in the evaluation set 34 is processed (step S10) to extract the features F1, ..., FM. When a single set of features F1, ..., FM is used, this may be performed once, and need not be repeated on subsequent iterations. This may be identical to the runtime 29 feature extraction (step S3). When the choice of features Fm is used as a hyperparameter, newly added or changed features Fm may need to be (re)computed on subsequent iterations of the evaluation phase 28.
[0115] The present state of the data model 15, 25, for example the current weights of a neural network, is then applied to classify each measurement period Δtm of the evaluation set 34 (step S11). This is not identical to the runtime 29 application (step S4), because the data model 15 will be updated at each iteration.
[0116] The accuracy of the classifications of the evaluation set 34 is then compared to a threshold fraction thresh (or equivalently a percentage) (step S12). For example, a value of thresh = 0.9 was found to work well for the Activity Monitoring Rehab System protocol described hereinbefore. Of course, the threshold fraction thresh may be set according to the requirements of each particular application.
[0117] If the accuracy exceeds the threshold (step S12|Pass), then the trained machine learning model 25 may be stored, and deployed to provide the data model 15 for the runtime phase 29. The same trained machine learning model 25 may be used in conjunction with a number of different wearable devices 2, being used/worn by a number of different subjects.
[0118] However, if the accuracy does not exceed the threshold (step S12|Fail), then the training phase 27 is repeated.
[0119] Referring also to Figure 5, one example of a machine learning model 25 in the form of an MLP is shown.
[0120] The machine learning model 25 illustrated in Figure 5 corresponds to the example illustrated in Figure 4, using a fixed set of input features: F1 = mean average; F2 = minimum value; F3 = maximum value; F4 = standard deviation; F5 = root of the sum of squares; and F6 = sign change count.
[0121] The exemplary machine learning model 25 shown in Figure 5 is a three-layer MLP having six input nodes, P hidden layer nodes HL(1), ..., HL(p), ..., HL(P), and four output nodes. Each of the input nodes corresponds to one of the six features F1, ..., F6. The four output nodes correspond respectively to voluntary movement (set 30 in Figure 4), walking (subset 32a in Figure 4), transport (subset 32b in Figure 4) and non-limb muscle movement (subset 32c in Figure 4). The connections between nodes represent weighted Sigmoid functions.
[0122] Of course, in general there will be M input nodes, one corresponding to each feature Fm, and the number of output nodes will vary depending on the number (two or more) of classes/sub-classes which it is desired to measure. In addition, the number of hidden layers may also be increased if desired.
[0123] Using a measurement period of Δtm = 5 s and a machine learning model 25 in the form of the MLP shown in Figure 5 with a number P = 100 of hidden layer nodes, the inventors have obtained accuracy rates of 0.95. The MLP was trained for 1000 epochs, using a training set which included 902 samples of voluntary movement and 239 samples of non-voluntary movement, including 97 samples for the walking sub-category 32a, 46 samples for the transport sub-category 32b, and 96 samples for the non-limb movement sub-category 32c.
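For illustration only, a forward pass through an MLP of the shape shown in Figure 5 (six inputs, P hidden nodes, four outputs, sigmoid activations) might look as follows; the weights here are random rather than trained by back-propagation, and all names are assumptions:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class SimpleMLP:
    """Minimal 6-P-4 multi-layer perceptron (forward pass only),
    mirroring the architecture of Figure 5: six feature inputs,
    P hidden nodes with sigmoid activations, four class outputs
    (voluntary, walking, transport, non-limb muscle movement).
    Weights are random here; in practice they would be learned."""

    def __init__(self, n_in=6, n_hidden=100, n_out=4, seed=0):
        rng = random.Random(seed)
        self.w1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)]
                   for _ in range(n_hidden)]
        self.w2 = [[rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]
                   for _ in range(n_out)]

    def forward(self, features):
        hidden = [sigmoid(sum(w * f for w, f in zip(row, features)))
                  for row in self.w1]
        return [sigmoid(sum(w * h for w, h in zip(row, hidden)))
                for row in self.w2]

    def classify(self, features):
        scores = self.forward(features)
        return scores.index(max(scores))  # index of the highest-scoring class
```

A production implementation would typically use an established library and train the weights by back-propagation for the roughly 1000 epochs mentioned above; this sketch only shows the shape of the computation.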
[0124] Secondary/in-use training method
In many applications it will be sufficient to execute the data input 26, training 27 and evaluation 28 phases once, and to then deploy the trained machine learning model 25.
[0125] However, in some applications it may be advantageous to further train the machine learning model 25 in use. For example, to progressively improve the accuracy of classifications provided for each specific subject/user. This is possible with some modifications to the methods.
[0126] Firstly, in the runtime phase 29 a mechanism may be provided to allow the subject to provide additional, positively labelled data. Referring again to Figures 1 and 2, when the wearable device 2 includes user interface device(s) 14, the subject may provide input indicating the type of activity they are undergoing (step S2). For example, the subject may provide input that they are performing (or about to perform) a voluntary limb movement such as a set of exercises. As another example, the subject may provide an input that they are in a transport vehicle. Although shown after the step of receiving accelerometer data (step S1), the input indicating a type of movement (step S2) may instead be received in advance (before step S1).
[0127] Secondly, the subject indication of activity type may be stored as ground truth, along with the corresponding features F1, ..., FM (step S6). Although illustrated after the storage of the classification result (step S5), the data for in-use training may be stored (step S6) at any point after feature extraction (step S3).
[0128] Using the subject-labelled features F1, ..., FM, the machine learning model 25 may then be re-trained, or fine-tuned, to provide improved accuracy for the specific subject.
[0129] In the case of re-training, the subject-labelled features F1, ..., FM may be appended to the stored training set 33 and evaluation set 34, and the training 27 and evaluation 28 phases may be re-run either in response to a request or a trigger such as the user adding a new labelled set of features F1, ..., FM, or once a certain number of new labelled sets have been added.
[0130] However, complete re-training may be computationally expensive, may require storage of the entire stored training set 33 and evaluation set 34, and may require a large number of subject specific labelled sets to change the weightings. This may be mitigated using a loss function which preferentially weights accuracy in classifying the subject-specific labelled feature sets F1, ..., FM.
[0131] Preferably, in-use training would be implemented by training a second, fine-tuning machine learning model (not shown) to map classifications of the first machine learning model 25 (for example the output nodes shown in Figure 5) to final classifications. The fine-tuning machine learning model (not shown) may be trained in a supervised learning process using the features F1, ..., FM corresponding to subject-labelled measurement periods.
[0132] Examples of accelerometer data for measurement periods
Referring also to Figure 6A, an example of a measurement period Δtm for three-axis accelerometer data {a0, a1, ..., ak, ..., aK} is plotted.
[0133] In Figure 6A, the normalised amplitudes of the three components ax(t), ay(t) and az(t) are plotted separately, and in this example correspond to voluntary movement of a subject's arm. The labels x, y and z relate to the local coordinates of the accelerometer 6, and were not calibrated or aligned relative to the subject's arm.
[0134] It may be observed that the offsets vary due to the orientation of the accelerometer 6 local axes x, y, z relative to gravity.
[0135] Referring also to Figure 6B, the same measurement period Δtm as Figure 6A is shown after conversion to single magnitude data {|a0|, ..., |ak|, ..., |aK|} as described hereinbefore. It may be observed that calculation of a sign-change count remains meaningful, because the conversion process includes the subtraction of acceleration due to gravity g from the magnitude of the acceleration vector a.
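By way of illustration only, the conversion and sign-change count described above may be sketched as follows. The function names, the numerical value used for g and the handling of exact zeros are assumptions made for the sketch, not details taken from the embodiment:

```python
import numpy as np

G = 9.81  # illustrative value for acceleration due to gravity, m/s^2

def to_signed_magnitude(ax, ay, az):
    """Convert three-axis samples to single magnitude data.

    Subtracting g from the vector magnitude removes the constant gravity
    component regardless of sensor orientation, so the result swings
    about zero and a sign-change count remains meaningful.
    """
    a = np.sqrt(np.asarray(ax, dtype=float) ** 2
                + np.asarray(ay, dtype=float) ** 2
                + np.asarray(az, dtype=float) ** 2)
    return a - G

def sign_change_count(samples):
    """Count the zero crossings of the gravity-subtracted magnitude."""
    s = np.sign(np.asarray(samples, dtype=float))
    s = s[s != 0]  # ignore samples that are exactly zero
    return int(np.sum(s[:-1] != s[1:]))
```

For a sensor at rest with one axis aligned with gravity the converted signal is approximately zero, whereas voluntary movement produces excursions alternating about zero, which the sign-change count captures.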
[0136] Figures 6C through 6F plot examples of single magnitude data {|a0|, ..., |ak|, ..., |aK|} for measurement periods Δtm corresponding to non-voluntary limb movements.
[0137] Figure 6C corresponds to transport in an automobile, measured from the arm of a subject. Figure 6D corresponds to non-voluntary but natural arm movements when walking. Figure 6E corresponds to movements measured from an arm which is restrained whilst walking (for example held in a sling). Figure 6F corresponds to measurements taken from the arm of a subject during a non-limb movement, in this example breathing normally at rest.
[0138] Modifications
It will be appreciated that various modifications may be made to the embodiments hereinbefore described. Such modifications may involve equivalent and other features which are already known in the field of methods and devices for monitoring and/or classifying limb movements, and which may be used instead of or in addition to features already described herein. Features of one embodiment may be replaced or supplemented by features of another embodiment.
[0139] Specific examples have been described in which the data model 15 takes the form of a trained machine learning model 25 which is a multi-layer perceptron (MLP). However, the trained machine learning model may be of any type suitable for classifying an input vector of M features F1, ..., FM such as, for example, a convolutional neural network (CNN), a support vector machine (SVM), a transformer model, a decision tree, a hidden Markov model (HMM), a long short-term memory (LSTM) model, and so forth.
[0140] Alternatively, the data model 15 need not be a trained machine learning model 25, and instead may take any other suitable form such as, for example, a clustering analysis, a rules-based approach/model, and so forth.
[0141] Although claims have been formulated in this application to particular combinations of features, it should be understood that the scope of the disclosure of the present invention also includes any novel features or any novel combination of features disclosed herein either explicitly or implicitly or any generalization thereof, whether or not it relates to the same invention as presently claimed in any claim and whether or not it mitigates any or all of the same technical problems as does the present invention. The applicants hereby give notice that new claims may be formulated to such features and/or combinations of such features during the prosecution of the present application or of any further application derived therefrom.

Claims (25)

1. A method of measuring voluntary limb movements of a subject, the method comprising: receiving N-axis accelerometer data corresponding to at least one limb of the subject and a measurement period, wherein N is greater than or equal to two; applying a data model to classify the accelerometer data as corresponding to a voluntary limb movement or a non-voluntary limb movement of the subject; storing the classification of the measurement period.
2. The method of claim 1, wherein applying a data model to classify the accelerometer data comprises: determining a plurality of features of the accelerometer data; applying the data model to the plurality of features.
3. The method of claim 2, wherein the plurality of features comprise: a first feature indicative of an average value; a second feature indicative of a lower bound value; a third feature indicative of an upper bound value; a fourth feature which is one of a standard deviation and a variance value; a fifth feature based on a sum of squares; and a sixth feature based on a frequency of sign changes.
4. The method of claims 2 or 3, wherein the plurality of features are calculated separately for each of the N-axes of the accelerometer data.
5. The method of claims 2 or 3, wherein the plurality of features are calculated once for all N-axes of the accelerometer data.
6. The method of any one of claims 1 to 5, wherein the data model is configured to classify voluntary limb movements into one of two or more pre-defined categories of voluntary limb movement.
7. The method of any one of claims 1 to 6, wherein the data model is configured to classify non-voluntary limb movements into one of two or more pre-defined categories of non-voluntary limb movement.
8. The method of any one of claims 1 to 7, wherein the data model comprises a trained machine learning model.
9. The method of any one of claims 1 to 8, wherein receiving N-axis accelerometer data comprises obtaining the N-axis accelerometer data from a wearable device arranged on at least one limb of the subject.
10. A method comprising carrying out the method of any one of claims 1 to 9 for a sequence of two or more measurement periods.
11. The method of claim 10, wherein each measurement period overlaps with previous and or subsequent measurement periods.
12. The method of claims 10 or 11, further comprising determining and storing a total period corresponding to voluntary limb movements.
13. The method of any one of claims 1 to 12, wherein the limb is an arm of the subject.
14. The method of any one of claims 1 to 12, wherein the limb is a leg of the subject.
15. The method of any one of claims 1 to 14, wherein N is greater than or equal to three.
16. A method of training a machine learning model to function as the data model for a method according to any one of claims 1 to 15, the method comprising: receiving a first set of labelled data comprising N-axis accelerometer data corresponding to a plurality of first measurement periods corresponding to voluntary limb movement; receiving a second set of labelled data comprising N-axis accelerometer data corresponding to a plurality of second measurement periods corresponding to non-voluntary limb movement; training the machine learning model in a supervised learning process using the first and second sets of labelled data as ground truth.
17. A method of training a machine learning model to function as the data model for a method according to any one of claims 1 to 15, the method comprising: receiving a first, pre-trained machine learning model trained to classify accelerometer data corresponding to at least one limb of the subject and a measurement period as a voluntary limb movement or a non-voluntary limb movement of the subject; receiving a plurality of labelled measurement periods, each labelled measurement period corresponding to N-axis accelerometer data corresponding to at least one limb of the subject and a measurement period and being associated with a label input by the subject and indicating a voluntary limb movement or a non-voluntary limb movement; training a second, fine-tuning machine learning model to map classifications of the first machine learning model to final classifications in a supervised learning process using the labelled measurement periods as ground truth.
18. Apparatus configured for securing to at least one limb of a subject and comprising an accelerometer, the apparatus configured: to obtain N-axis accelerometer data corresponding to a measurement period from the accelerometer, wherein N is greater than or equal to two; to apply a data model to classify the accelerometer data as corresponding to a voluntary limb movement or a non-voluntary limb movement of the subject; to store the classification of the measurement period.
19. A system comprising: apparatus configured for securing to at least one limb of a subject and comprising an accelerometer; and a data processing device communicatively coupled to the apparatus; wherein the system is configured: to obtain N-axis accelerometer data corresponding to a measurement period from the accelerometer, wherein N is greater than or equal to two; to apply a data model to classify the accelerometer data as corresponding to a voluntary limb movement or a non-voluntary limb movement of the subject; to store the classification of the measurement period.
20. The system of claim 19, wherein the system is configured to apply the data model to classify the accelerometer data by: determining a plurality of features of the accelerometer data; and applying the data model to the plurality of features.
21. The system of claim 20, wherein the apparatus is configured to determine the plurality of features of the accelerometer data.
22. The system of claim 20, wherein the data processing device is configured to determine the plurality of features of the accelerometer data.
23. The system of any one of claims 20 to 22, wherein the apparatus is configured to apply the data model to the plurality of features.
24. The system of any one of claims 20 to 22, wherein the data processing device is configured to apply the data model to the plurality of features.
25. The system of any one of claims 19 to 24, wherein the data processing device is communicatively coupled to the apparatus using a short-range wireless communications protocol.
GB2407239.9A 2024-05-21 2024-05-21 Monitoring limb movement Pending GB2641713A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2407239.9A GB2641713A (en) 2024-05-21 2024-05-21 Monitoring limb movement
PCT/GB2025/051106 WO2025243030A1 (en) 2024-05-21 2025-05-20 Monitoring limb movement

Publications (2)

Publication Number Publication Date
GB202407239D0 GB202407239D0 (en) 2024-07-03
GB2641713A true GB2641713A (en) 2025-12-17

Family

ID=92932082





Also Published As

Publication number Publication date
GB202407239D0 (en) 2024-07-03
WO2025243030A1 (en) 2025-11-27
