
WO2019021315A1 - Motion sense technology system - Google Patents


Info

Publication number
WO2019021315A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
processing unit
user
user device
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IN2018/050489
Other languages
French (fr)
Inventor
Tarun Kumar
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of WO2019021315A1 publication Critical patent/WO2019021315A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/1118Determining activity level
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/112Gait analysis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802Sensor mounted on worn items
    • A61B5/6804Garments; Clothes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient; User input means
    • A61B5/742Details of notification to user or communication with user or patient; User input means using visual displays
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00Teaching not covered by other main groups of this subclass
    • G09B19/003Repetitive work cycles; Sequence of movements
    • G09B19/0038Sports
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02Operational features
    • A61B2560/0204Operational features of power management
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02Operational features
    • A61B2560/0223Operational features of calibration, e.g. protocols for calibrating sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0223Magnetic field sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0247Pressure sensors
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0004Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813Specially adapted to be attached to a specific body part
    • A61B5/6824Arm or wrist
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient; User input means
    • A61B5/7405Details of notification to user or communication with user or patient; User input means using sound

Definitions

  • the present disclosure relates to a motion sense technology system; more specifically, a system and method for rendering captured data of one or more users. Further, the invention discloses a system of hardware embodiments of Motion Sensor units and a Motion Sensor Processing unit for capturing and processing the motion data of any human/animal/object performing any physical activity, and for visualization and analysis of the same through a software application on a mobile/tablet/laptop or any display.
  • the motion data can be captured in the fields of training of physiotherapy patients for their gait improvement, training and coaching of players of sports, as well as in other fields or forms of physical training, which can then be used as training inputs to assist individual(s) / medical professionals / coaches / trainers to train / coach in any activity involving physical movement or sport.
  • the motion of the human or the subject is captured through special cameras equipped with the capability to capture the infrared light emitted onto the visual markers; the light reflected by such markers is then captured by such an expensive camera system (usually a series of 6 to 8 such cameras placed at strategic locations in the activity environment).
  • MEMS IMU sensors such as, Accelerometer, Gyroscope, Magnetometer sensors.
  • all such equipment and their usage require all or some of the following essential environmental constraints and/or attributes: ambient quality of lighting, the right distance between the human or subject whose motion is to be analyzed and the camera systems, and the right placement of the camera systems at specific angles and distances for the correct focus and image/video quality for later processing.
  • Various embodiments herein describe a system for rendering captured data of one or more users.
  • the system comprises one or more sensing units configured for capturing data of one or more users, at least one processing unit connected to the one or more sensing units for processing the captured data received from the one or more sensing units, wherein the at least one processing unit being adapted to communicate with the one or more sensing units for evaluating health of the one or more sensing units and providing one or more predefined instructions, and a user device connected to the at least one processing unit for receiving the processed data and rendering the processed data through one or more predefined user-specific models, thereby providing pervasive professional analysis and training to the one or more users during physical movement or physical training or sport.
  • a cloud server suite connected to one of the at least one processing unit and the user device configured to perform steps comprising communicating with the user device for registering profile of one or more users, receiving pictures and meta-data of the one or more users including dimensions of one or more body-parts as required for physical movement or physical training or sport, and processing the received pictures and meta-data and dimensions of the one or more body-parts of each user of the one or more users for creating one or more predefined models in 2D/3D, said model being proportionately scaled as per the received pictures and data.
  • the one or more sensing units comprising one or more sensors, said sensor is selected from a group comprising accelerometer, gyroscope, magnetometer, pressure sensor, and altimeter sensors.
  • the one or more sensing units are attached to body parts of the user by a means of attachment.
  • the means of attachment includes, but is not limited to, tape, band, adhesive, strap, etc.
  • the one or more sensing unit is adapted to perform steps comprising receiving instruction from at least one of the user device and the at least one processing unit, performing self-diagnostics test to determine health check status, storing the health check status locally, sharing the health check status with the at least one processing unit, receiving the evaluated health check status from the at least one processing unit based on a first set of predefined values; if the health check status failed, shutting down the one or more sensing units to rectify the functioning of said unit by the intervention of the user or system administrator; if the health check status passed, configuring the one or more sensing units based on the body parts of the user and one or more predefined actions to be performed by the user; and capturing the data of the user based on the configuration; and sharing the captured data with the at least one processing unit.
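The start-up and health-check flow claimed above can be sketched in a few lines. This is an illustrative sketch only, not the patent's implementation; every name here (the `BATTERY_MIN` threshold standing in for the "first set of predefined values", the subsystem keys, and the function names) is an assumption made for illustration.

```python
# Hypothetical sketch of the sensing-unit health-check flow described above.
BATTERY_MIN = 20                               # assumed minimum battery percentage
REQUIRED_SUBSYSTEMS = {"imu", "radio", "storage"}  # assumed subsystem keys

def evaluate_health(status: dict) -> bool:
    """Pass only if battery is adequate and every required subsystem reports OK."""
    ok_subsystems = {k for k, ok in status.items() if ok is True}
    return status.get("battery", 0) >= BATTERY_MIN and REQUIRED_SUBSYSTEMS <= ok_subsystems

def startup(status: dict, body_part: str) -> str:
    """Return the next action for the unit after its self-diagnostics test."""
    if not evaluate_health(status):
        return "shutdown"                       # user/admin intervention required
    return f"configured:{body_part}"            # proceed to configure and capture
```

Any richer policy (retry counts, administrator notification) would slot into the `shutdown` branch.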
  • the system administrator who is able to identify and rectify the functioning of units includes, but is not limited to, a human being and/or a smart module/firmware.
  • the one or more sensing units is adapted to perform the steps comprising capturing data, by motion sensors, pressure sensor, and altimeter sensor, of the one or more users in order of time and indices; filtering the captured data using a data filtering module; synchronizing the captured data of the motion sensor of the one or more users; recognizing the data of motion sensors, pressure sensor, and altimeter sensor by a signal processing algorithm; computing quaternions, linear angles, and predefined state functions of the motion sensor data captured by one or more accelerometers, one or more gyroscopes, and magnetometers; computing data received from a piezo pressure sensor; blending the computed data of the piezo pressure sensor with the data of the motion sensor; computing data received from the altimeter sensor; blending the computed data of the altimeter sensor with the data of the motion sensor; storing the blended data in the one or more sensing units until purged; and transmitting the stored data to at least one processing unit.
  • the signal processing algorithm includes a predefined set of instructions to recognize the data of motion sensors, pressure sensor, and altimeter sensor.
  • the accelerometer is a High-G accelerometer.
  • the gyroscope is a High Range gyroscope.
  • the blending of data /computed data is a process whereby data from multiple sensors are merged into a single data or data set.
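As a concrete, hypothetical instance of such blending, a complementary filter is a common way to merge a gyroscope's smooth but drifting angle with an accelerometer's noisy but drift-free tilt estimate into a single value. The patent does not name this particular algorithm; the function names and the 0.98 weight are illustrative assumptions.

```python
import math

def accel_tilt(ax, az):
    """Tilt angle (radians) implied by the accelerometer's gravity reading."""
    return math.atan2(ax, az)

def complementary_blend(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the gyro-integrated angle with the accelerometer angle.

    alpha weights the gyroscope path (smooth, drifts over time); the
    remainder weights the accelerometer path (noisy, but drift-free).
    """
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```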
  • the at least one processing unit is adapted to perform the steps comprising receiving instruction from one of the user device and cloud server suite; performing self-diagnostics test to determine health check status; storing the health check status locally; sharing the health check status with the user device; receiving the evaluated health check status based on a second set of predefined values; if the health check status failed, shutting down the processing unit to rectify the functioning of said unit by the intervention of the user or system administrator; and if the health check status passed, providing instruction to initialize and configure the one or more sensing units.
  • the at least one processing unit is adapted to perform the steps comprising communicating with the one or more sensing units to receive the captured data, wherein the captured data is derived from one or more motions/physical activities of the user; storing the captured data locally in the processing unit until purged; synchronizing captured data received from the one or more sensors; processing the captured data of the one or more sensing units for machine learning; classifying the one or more motions/physical activities of the user based on predefined information stored in a master database; and determining metrics based on the synchronized and processed captured data, thereby providing necessary inputs for rendering and displaying by the user device and sending the same to the cloud server suite.
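A minimal sketch of the classification step, matching a captured window against a "master database" of labelled templates by nearest-neighbour distance, might look like the following. The feature set, activity labels, and template values are invented for illustration and are not taken from the patent.

```python
import math

# Hypothetical master database: label -> (mean |acceleration| in g, dominant frequency in Hz)
MASTER_DB = {
    "walking": (1.2, 2.0),
    "running": (2.8, 3.1),
    "bowling": (6.5, 1.0),
}

def features(accel_mags, dominant_freq_hz):
    """Reduce a window of acceleration magnitudes to a small feature vector."""
    return (sum(accel_mags) / len(accel_mags), dominant_freq_hz)

def classify(accel_mags, dominant_freq_hz):
    """Return the label of the nearest template in the master database."""
    f = features(accel_mags, dominant_freq_hz)
    return min(MASTER_DB, key=lambda label: math.dist(MASTER_DB[label], f))
```

A production system would use many more features and a trained model; this only shows where the master database enters the pipeline.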
  • the necessary inputs include, but are not limited to, starting and ending points of the activity from the above data, and key activity-based marker points within the activity data.
  • the at least one processing unit is adapted to provide one or more instructions along with data to the at least one of one or more sensing units through radio communication, Bluetooth/ Bluetooth Low Energy (BLE)/Wi-Fi.
  • the at least one processing unit is adapted to send or provide one or more instructions along with data to the at least one user device and/or the cloud server suite in an online mode over Wi-Fi complying with the IEEE 802.11 ac/b/g/n protocol and having a minimum data-transmission/receiving range of 150 meters in the line of sight.
  • the at least one processing unit is triggered by a user interface in an offline mode to provide one or more instructions along with data to the one or more sensing units, said user interface includes at least one of a push button, touch screen, and voice user interface (VUI).
  • the user device is adapted to perform the steps comprising determining one or more coordinates from the captured data for 2D/3D rendering of a user model; determining at least one of a trajectory of the motion, pressure and altitude of one or more body-parts of each time-frame from the captured data, thereby enabling micro-monitoring of body-parts individually and as a whole; and rendering the one or more determined coordinates, thereby displaying the metrics of the body-part(s) of the user through the predefined user model.
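One way the user device could turn an orientation quaternion into a renderable 3D coordinate is to rotate a limb-segment vector of known length (from the user's registered body-part dimensions) and add it to the parent joint's position. The patent does not specify this computation; the sketch below assumes, for illustration, a rest pose pointing along +x.

```python
def _cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def quat_rotate(q, v):
    """Rotate 3-vector v by unit quaternion q = (w, x, y, z)."""
    w, qv = q[0], q[1:]
    t = tuple(2.0 * c for c in _cross(qv, v))   # t = 2 (q_vec x v)
    u = _cross(qv, t)                           # u = q_vec x t
    return tuple(v[i] + w * t[i] + u[i] for i in range(3))

def joint_position(origin, segment_length, q):
    """End point of a limb segment attached at origin, rest pose along +x."""
    d = quat_rotate(q, (segment_length, 0.0, 0.0))
    return tuple(origin[i] + d[i] for i in range(3))
```

Chaining `joint_position` from the torso outward yields the per-frame skeleton coordinates that the 2D/3D model renders.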
  • the cloud server suite connected to at least one of the processing unit and the user device configured to perform steps comprising communicating with the at least one of processing unit and user device for receiving data, meta-data, metrics and analysis; storing the received data, meta-data, metrics and analysis in a database of the cloud server suite; determining update on one or more predefined global metrics of one or more users for the corresponding physical movement or physical activity based on the received data, meta-data, metrics and analysis; storing the update on one or more predefined global metrics in the database of the cloud server suite; determining one or more predefined features of the machine learning from the data, meta-data, metrics and analysis; storing the determined one or more predefined features of the machine learning in the database of the cloud server suite; sending the determined one or more predefined features of the machine learning to at least one of processing unit and user device upon receiving the request for said features; storing the received one or more predefined features of the machine learning in the at least one processing unit for future usage; and sending the update on one or more predefined global
  • the cloud server suite connected to at least one of the processing unit and the user device configured to perform steps comprising determining global comparative metrics and analysis of the physical activity of the user based on the received global metrics and analysis by at least one of the processing unit and user devices of more than one users of said system; determining global comparative metrics and analysis of the physical activity of the user based on the received global metrics and analysis by one or more external system(s) or third-party system(s) and/or standards and/or data inputs of physical movement or physical training or sport; and sending global comparative metrics and analysis of the physical activity of the user to the at least of the processing unit and user device.
  • the one or more sensing units comprise at least one of one or more motion sensors, piezo pressure sensor, altimeter, memory unit, microprocessor, Bluetooth, wi-fi, radio frequency communication, power supply, re-chargeable battery, and battery charging unit.
  • the one or more processing units comprise at least one of USB interface, one or more Bluetooth communication, ethernet communication, memory unit, microprocessor, wi-fi, radio frequency communication, user display unit, power supply, re-chargeable battery, and battery charging unit.
  • the cloud server suite comprises at least one of a server application program interface (API), database, and one or more sets of predefined instructions.
  • one or more application program interfaces (APIs) of the processing unit and said system are adapted for enabling sensor data inputs from third-party sensor unit(s) based on predefined specifications.
  • the data is stored locally in the sensor unit, the processing unit, the user device and the cloud server suite, so as not to lose data integrity.
  • the cloud server suite is adapted to initiate and execute Over-The-Air update of firmware/programming module(s) for one or more sensing units through at least one processing unit, said update is executed with or without the means of at least one user device through respective radio- communications among said units.
  • the cloud server suite is adapted to initiate and execute the Over-The-Air update of firmware/programming module(s) for one or more processing units, said update is executed with or without the means of at least one user device through respective radio-communications among said units.
  • Another embodiment of the present invention describes a method of rendering captured data of one or more users.
  • the method comprises attaching one or more sensor units on one or more body parts of one or more users; providing one or more instructions to initialize one or more sensor units and at least one processing unit; configuring one or more sensor units according to the body parts with which the sensor units are attached and based on motions/physical activity to be captured, upon completion of initialization; capturing the motion/physical activity data of the one or more users; synchronizing the captured data received from the one or more users; processing the captured data to derive one or more information based on the predefined parameters; and rendering the processed data based on one or more user specific models, thereby providing pervasive professional analysis and training to the one or more users during physical movement, physical training or sport.
  • the method comprises communicating with the user device for registering profile of one or more users; receiving pictures and meta-data of the one or more users including dimensions of one or more body-parts as required for physical movement or physical training or sport; and processing the received pictures and meta-data and dimensions of the one or more body-parts of each user of the one or more users for creating one or more predefined models in 2D/3D, said model being proportionately scaled as per the received pictures and data.
  • FIG. 1 illustrates a block diagram of a motion sense technology system (MSTS), according to an embodiment as disclosed herein;
  • FIG. 2 illustrates a block diagram of a MSTS sensor unit (101 ) of a motion sense technology system (MSTS), according to an embodiment as disclosed herein;
  • FIG. 3 illustrates a flow chart of a process performed by a sensor unit of a motion sense technology system (MSTS), according to an embodiment as disclosed herein;
  • FIG. 4 illustrates a block diagram of a processing unit of a motion sense technology system (MSTS), according to an embodiment as disclosed herein;
  • FIG. 5 illustrates a flow chart of a process performed by a processing unit of a motion sense technology system (MSTS), according to an embodiment as disclosed herein;
  • FIG. 6 illustrates a flow chart of a process performed by a user device of a motion sense technology system (MSTS), according to an embodiment as disclosed herein;
  • FIG. 7 illustrates a flow chart of a process performed by a cloud server suite of a motion sense technology system (MSTS), according to an embodiment as disclosed herein.
  • the embodiments herein disclose a method and system of hardware embodiments of Sensor units and Sensor processing units for capturing and processing the motion data of any human/animal/object performing any physical activity, and for visualization and analysis of the same through a user device / software application on a mobile/tablet/laptop or any display.
  • the motion data is captured in the fields of training of physiotherapy patients for their gait improvement, training and coaching of players of sports, as well as in other fields or forms of physical training, which is then used as training inputs to assist individual(s) / Medical professionals / coaches / trainers to train / coach in any activity involving physical movement or sport.
  • This invention aims to solve such problems by offering an affordable, easily accessible and pervasive professional analysis and training/coaching system for learning and excelling in the field of physical training or a chosen sport.
  • the invention is used for a holistic physical training or sports by the professional physiotherapists / coaches to train patients to regain and improve their gait and to professional / amateur athletes respectively.
  • the invention also assists individuals in capturing their activity and helps them learn from the visual analysis provided by this system, by adding their own sensor units through special methods and capturing data through those devices.
  • FIG. 1 illustrates a block diagram of a motion sense technology system (MSTS) (100), according to an embodiment as disclosed herein.
  • the system (100) includes, but is not limited to, one or more sensing units (101), one or more processing units (102), one or more user devices (103) and a cloud server suite (104).
  • the motion sensor includes, but is not limited to, one or more accelerometers, one or more gyroscopes, and magnetometers.
  • the Motion Sense Technology System (100) consists of specifically designed hardware, firmware and/or software, and addresses the needs of medical professionals, amateurs as well as professionals of a sport through a real-time visual feedback of the actual physical-activity/play in 2-Dimensional (2D) and 3-Dimensional (3D) rendering, that will be instantly available on their chosen mobile device(s). This means that the feedback is based on a trainee's or player's actual physical/playing actions in the area/field and not under any simulated conditions.
  • This Cloud-based system will provide specific visual analysis and data-points which are necessary for methodical learning and practice of the activity or sport.
  • the MSTS system will help in analyzing and improving performance in specific areas of the physical activity or sport.
  • the aim is to create a solution for the sport of Cricket. Going forward, with the given embodiments of similar nature, containing the hardware equipment and software solution, the system can be applied to physiotherapy, mechanical physical activity in an enterprise or factory environment, dance forms, and other physical learning or performance-based activities, etc.
  • the system will also provide useful visual feedback and analysis as well as a guided learning path in areas of sports such as Tennis, Golf, Badminton, Table-Tennis, Baseball, Football, Running, Cycling, Swimming, Basketball, etc.
  • the MSTS system is adapted to provide users the flexibility to add their own sensor units to capture specific motion data as well as to get the result-feed from the MSTS system for their analysis, as per the guidelines to be provided.
  • the MSTS system is adapted to accept plug-in of third-party sensor units through the use of MSTS Application Programmable Interfaces (APIs).
  • These predefined MSTS system APIs provided by the MSTS system of the present invention offer flexibility for innovation and integration by a large developer community.
  • the MSTS system provides the APIs and guidelines for their usage and to fetch the result-feed. There is no such existing system that allows developers to readily plug-in their own sensors or to use the motion-data from the sensors for developing their applications or solutions.
  • the cloud server suite is accessible via the internet to the processing unit and/or user device for all the operations.
  • the cloud server suite can be accessed via Wi-Fi or other mediums of internet access.
  • FIG. 2 illustrates a block diagram of a MSTS sensor unit (101 ) of a motion sense technology system (MSTS) (100), according to an embodiment as disclosed herein.
  • the MSTS sensing unit includes but not limited to, at least one of one or more motion sensors (201 ), piezo pressure sensor (202), and altimeter sensor (203).
  • the MSTS sensor unit (101) also includes memory unit (204), microprocessor/micro-controller (205), Bluetooth (206), wi-fi / radio frequency communication (207), touch and display unit interface or display unit interface (208), power supply (209), re-chargeable battery (210), and battery charging unit (211).
  • "Bluetooth" means "Bluetooth, Bluetooth Low Energy, Bluetooth 5" for "Sensing units" and "Processing unit".
  • the one or more sensor units (101) being used in the MSTS system (100) depend upon the type of application: the motion sensor (201) is used to capture motion data, the piezo pressure sensor (202) is used for capturing data related to change in pressure, and the altimeter (203) is used to measure the altitude of an object above a fixed level.
  • the MSTS sensor unit (101 ) is able to process the captured data using inbuilt microprocessor (205) and store the same into the memory unit (204).
  • the MSTS sensor unit (101 ) is able to communicate or share the stored data with the MSTS processing unit using Bluetooth (206), wi-fi, radio frequency communication (207).
  • the re-chargeable battery (210) provides appropriate preconfigured power for proper functioning of the MSTS sensor unit (101 ).
  • the re-chargeable battery (210) is charged with the battery charging unit (211) when battery power goes below a predefined value or based on the instruction received from the MSTS processing unit or user device.
  • FIG. 3 illustrates a flow chart of a process performed by a sensor unit (101 ) of a motion sense technology system (MSTS) (100), according to an embodiment as disclosed herein.
  • the sensing unit is configured for capturing the data of the user to which it is attached, based on the configuration.
  • the sensor unit becomes active or wakes from sleep mode on receiving a trigger, such as pressing a start button or shaking the sensor unit within a predefined time period of last usage, such as within a few minutes.
  • a check is performed to determine whether the sensor unit is active or still in sleep mode. If the sensor unit is still in sleep mode, step 301 is repeated. If the sensor unit is active, a self-diagnostics test is initiated at step 303.
  • the self-diagnostic test includes, but is not limited to, battery level, states of one or more sensors, status of radio communication, and status of storage/memory/memory card.
  • the result of self-diagnostic test i.e. health check status
  • the sensor unit is shutdown/closed to rectify the error or shortcoming such as for charging of battery of the sensor unit.
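The self-diagnostics flow described above (initiating the test, storing the health check status, and shutting down on failure) can be illustrated with a short sketch. This is not from the patent: the function name, the input fields, and the `MIN_BATTERY_PCT` threshold are hypothetical, chosen only to show one way such checks could be aggregated into a pass/fail status.

```python
# Hypothetical self-diagnostics for a sensor unit, covering the checks
# named in the description: battery level, sensor states, radio, storage.
MIN_BATTERY_PCT = 20  # assumed minimum charge to proceed (not from the patent)

def run_self_diagnostics(unit):
    """Return a health-check status dict; 'passed' is True only if all checks pass."""
    checks = {
        "battery": unit["battery_pct"] >= MIN_BATTERY_PCT,
        "sensors": all(unit["sensor_states"].values()),
        "radio": unit["radio_ok"],
        "storage": unit["storage_ok"],
    }
    checks["passed"] = all(checks.values())
    return checks
```

A failed status (for example, a low battery) would then trigger the shutdown path described above, while a passed status allows configuration to proceed.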
  • the microprocessor of the sensor unit performs a calibration process for the sensors (at step 306), such as the 9-DoF accelerometer, gyroscope, magnetometer, high-G accelerometer, high-range gyroscope, pressure sensor, and altimeter.
  • the sensors are attached to a predefined part of the user (at step 307) based on the data to be captured.
  • the sensor units are configured for one or more session(s) of the physical activity until any change.
  • the configuration includes, but is not limited to, identifying and associating the sensor unit with the body part.
  • a process of recognition of the captured data is carried out through signal processing and data filters.
  • at step 311, computation of quaternions, linear angles, and other necessary state functions is performed.
  • the captured data such as pressure data is converted into a predefined form or meaningful values, so that it blends with the rest of the captured motion sensor data.
  • the captured data such as altimeter data is likewise converted into a predefined form or meaningful values, so that it blends with the rest of the captured motion sensor data.
  • the sensor data, along with the computed data and metadata, is stored locally in the memory of the sensor unit until purged.
  • at step 315, the stored data is transmitted to the MSTS processing unit.
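As an illustration of the quaternion computation mentioned at step 311, the sketch below integrates a gyroscope's angular rate into an orientation quaternion using first-order integration with renormalization. This is a hypothetical minimal example, not the patent's algorithm; a real 9-DoF pipeline would typically also fuse accelerometer and magnetometer readings to correct drift.

```python
import math

def integrate_gyro(q, gyro, dt):
    """Integrate an angular rate (rad/s) into a unit quaternion q = (w, x, y, z).

    Uses the quaternion derivative q' = 0.5 * q (x) (0, gx, gy, gz),
    followed by renormalization to keep q on the unit sphere.
    """
    w, x, y, z = q
    gx, gy, gz = gyro
    dw = 0.5 * (-x * gx - y * gy - z * gz)
    dx = 0.5 * ( w * gx + y * gz - z * gy)
    dy = 0.5 * ( w * gy - x * gz + z * gx)
    dz = 0.5 * ( w * gz + x * gy - y * gx)
    w, x, y, z = w + dw * dt, x + dx * dt, y + dy * dt, z + dz * dt
    norm = math.sqrt(w * w + x * x + y * y + z * z)
    return (w / norm, x / norm, y / norm, z / norm)
```

For example, integrating a constant rotation of pi rad/s about the z-axis for one second of small time steps brings the identity quaternion close to a 180-degree rotation about z.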
  • the processing unit (102) comprises at least one of a Universal Serial Bus (USB) interface (401), Bluetooth communication (402), Ethernet communication (403), memory unit (404), microprocessor (405), Wi-Fi/radio frequency communication (406), user display unit (407), power supply (408), re-chargeable battery (409), and battery charging unit (410).
  • "USB" means "Universal Serial Bus".
  • the processing unit (102) communicates with the sensor unit (101) and the user device (103) through one or more communication means such as the USB interface, Bluetooth, Ethernet, Wi-Fi, or radio frequency communication.
  • the data received from the sensor unit are processed by the microprocessor of the processing unit and subsequently stored locally in the memory unit of the processing unit.
  • FIG. 5 illustrates a flow chart of a process performed by a processing unit (102) of a motion sense technology system (MSTS) (100), according to an embodiment as disclosed herein.
  • the MSTS processing unit comes to an active state from a power-down stage, or from sleep mode if inactive for a while, by pressing a start button or reset button.
  • a check is performed whether the processing unit is in an active state. If the processing unit is still in sleep mode, the step 501 is repeated.
  • self-diagnostics tests are performed (at step 503), which include, but are not limited to, checks of the battery level, the states of all on-board components, radio communication with the MSTS sensor unit(s), long-range radio communication with the user device (which includes the MSTS MobileApp), the storage/memory unit/memory card, the time function, and time synchronization with the MSTS sensor unit(s).
  • the results of the self-diagnostic tests and the health check status are stored locally in the memory unit for further processing and reported to the MSTS MobileApp and MSTS Cloud suite.
  • a check is performed whether the health check status of the processing unit passed.
  • if the health check status of the processing unit failed, the processing unit is shut down/closed to rectify the error or shortcoming by the user for further action, such as charging of the battery or any other issue. If the health check status passed, the processing unit communicates with the MSTS sensor unit(s) to make them ready to capture motion data at step 506. At step 507, the one or more sensor units are placed on the body part(s) of the user/human/subject as per their respective associated positions. At step 508, a first check is performed whether the processing unit and the sensor unit(s) are ready for performing necessary actions. If the first check at step 508 failed, a second check at step 509 is performed whether the communication from the processing unit and the process to make the devices ready have already been repeated twice.
  • if the second check at step 509 failed, the step 506 is repeated. If the second check at step 509 passed, the user is provided, at step 510, an alert to check with the user device (more specifically the MSTS MobileApp) to fix it, or to report it if not fixed. If the first check at step 508 passed, a check is performed, at step 511, whether the MSTS processing unit is running in an online mode. If no at step 511, the processing unit performs the function in an offline mode. In the offline mode at step 512, the processing unit is triggered by a user interface to provide one or more instructions along with data to the one or more sensing units and/or cloud server suite.
  • the user interface includes at least one of a push button, touch screen, and voice user interface (VUI).
  • the processing unit performs the function in an online mode.
  • the processing unit sends command(s) to the sensor unit to initiate the process of capturing sensor data after communication from the MSTS MobileApp or user device.
  • the processing unit communicates with the sensor unit and receives captured data or sensor data over radio-communication.
  • the sensor data received from the one or more sensor units are stored locally in the memory unit until purged.
  • the processing unit is adapted to process the received sensor data.
  • the processing of the sensor data includes, but is not limited to, synchronizing the sensor data received from all the MSTS sensor unit(s), computing features of all sensor data for machine learning, running the computed features against the master database to classify the activity/motion of the user/human/subject, and computing metrics based on the synchronized data set.
  • the processing unit stores all such sensor data, along with the computed data and meta-data, locally in the memory unit until purged.
  • the processing unit transmits sensor data to the MSTS MobileApp/user device and/or MSTS Cloud suite when connected with them through radio communication.
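The feature-computation and classification steps above can be sketched as follows. This is a hypothetical minimal example, not the patent's method: the features (mean, standard deviation, peak) and the nearest-neighbour match against a labelled "master database" are stand-ins for whatever machine-learning pipeline an implementation would actually use.

```python
import math

def window_features(samples):
    """Simple features for one window of sensor samples: mean, std deviation, peak."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    return (mean, math.sqrt(var), max(abs(s) for s in samples))

def classify(features, master_db):
    """Nearest-neighbour match of a feature vector against labelled templates.

    master_db is a list of (label, feature_tuple) pairs; returns the label
    whose template is closest in squared Euclidean distance.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(master_db, key=lambda item: dist(features, item[1]))[0]
```

The resulting label (the classified activity/motion) would then feed the metric computation described above.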
  • FIG. 6 illustrates a flow chart of a process performed by a user device (103) of a motion sense technology system (MSTS) (100), according to an embodiment as disclosed herein.
  • the user device is configured for processing data received from the processing unit for a particular physical activity of the user, according to an embodiment of the present invention.
  • the physical activity includes, but is not limited to, sport, physiotherapy, gait, etc.
  • the MSTS MobileApp is installed in the user device.
  • the user device includes, but is not limited to, a mobile phone, tablet, laptop, or other computing device.
  • a process of registration of the MSTS sensor unit and MSTS processing unit is performed in the user device. This is a one-time process in the MSTS system for the same set of devices.
  • a process of user/profile registration is performed for each new user/human/subject.
  • a check is performed for the availability of a 3D rendering of the human/subject for the user. If no at step 604, a process is performed (at step 605) to capture the user's physical profile pictures, including dimensions of the body parts, as required by the physical activity application installed in the user device.
  • the user device sends the captured pictures (from step 605), along with the calculated dimensions of the human/subject's body parts, to the MSTS cloud suite for the purpose of creating a proportionately scaled 2D/3D rendering of the human/subject model for the user.
  • the user device fetches the proportionate 2D/3D rendering of the human/subject's model received from the MSTS Cloud Suite.
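The proportional scaling of a template model to the measured body-part dimensions could look like the sketch below. This is illustrative only: the template segment names and lengths are hypothetical, and the patent does not specify how the cloud suite derives its scaled 2D/3D model.

```python
# Hypothetical reference skeleton: segment name -> template length in cm.
REFERENCE_MODEL = {"upper_arm": 30.0, "forearm": 25.0, "thigh": 45.0}

def scale_model(reference, measured):
    """Per-segment scale factors so a template model matches the user's dimensions.

    Both arguments map segment name -> length; the result maps segment
    name -> multiplier to apply to the template geometry.
    """
    return {part: measured[part] / length for part, length in reference.items()}
```

Applying these factors segment by segment yields a model proportioned to the pictures and dimensions uploaded by the user device.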
  • the one or more sensor units and processing unit are triggered by switching on the power.
  • a check is performed whether the sensor units and the processing unit are functional. If no at step 609, the step 608 is repeated. If yes at step 609, self-diagnostics tests are initiated for the sensor unit(s), processing unit, and user device to derive the health check status at step 610.
  • the user device also checks the availability of radio communication and server cloud suite at step 610.
  • the results of the self-diagnostic tests and the health check status are stored locally in the memory unit.
  • the status is also shared with the processing unit for further processing.
  • a check is performed whether the MSTS system is ready, i.e. whether all checks are passed. If no at step 612, the sensor unit is shut down/closed for further action by the user, e.g. charging. If yes at step 612, calibration of the one or more sensor units is initiated, along with syncing and association with the user, at step 613.
  • one or more sensor units are attached on one or more body-parts of the user/ human/subject.
  • the one or more sensor units are identified and associated with the one or more body-parts through automated identification process for one or more sessions of the physical movement/ physical activity until any change.
  • the user device sends command(s) to the one or more sensor units through the processing unit to initiate capturing of sensor Data in online mode.
  • the user device receives the captured data or sensor data of the one or more sensor units from one or more processing units.
  • the user device computes the coordinates from the received data for the 2D/3D rendering of the human/subject model, along with the trajectory of the motion, pressure, and altitude of the body part(s) for each time frame. Rendering is then performed for all such coordinates in 2D/3D.
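One plausible way to compute such coordinates is to rotate each limb-segment vector by its sensor's orientation quaternion and chain the segments from a root joint. The sketch below is a hypothetical illustration, not the patent's rendering method; the segment chaining and function names are assumptions.

```python
def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z).

    Uses v' = v + w*t + r x t, where r = (x, y, z) and t = 2 * (r x v).
    """
    w, x, y, z = q
    vx, vy, vz = v
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    return (vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))

def joint_positions(origin, segments):
    """Chain limb segments: each (quaternion, local_vector) yields the next joint."""
    pts = [origin]
    for q, v in segments:
        dx, dy, dz = rotate(q, v)
        p = pts[-1]
        pts.append((p[0] + dx, p[1] + dy, p[2] + dz))
    return pts
```

Repeating this per time frame produces the per-frame joint coordinates from which a trajectory can be drawn and movement angles measured.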
  • the user device computes and displays the metrics and angles of movements of the body-part(s) of the human/subject.
  • based on the computation and rendering of coordinates in 2D/3D, the user device prepares, computes, and renders a comparative analysis of the motion(s) of each time frame for at least two data sets of the same user or of multiple users.
  • the user device receives a user-selected data set for the same human/subject or for one or more other human(s)/subject(s) from the cloud server suite. The user device then renders the advanced comparative analysis using the said method of computation of coordinates, metrics, and analysis for time frames and indices. At step 622, the user device stores all said data, computed metrics, and meta-data locally until purged. At step 623, the user device transmits all stored data to the MSTS Cloud Suite.
  • FIG. 7 illustrates a flow chart of a process performed by a cloud server suite (104) of a motion sense technology system (MSTS) (100), according to an embodiment as disclosed herein.
  • a process of registration of the one or more sensor units (i.e. sensing units) and one or more processing units is performed by the cloud server suite. This process is performed once if the complete MSTS system, including the MSTS hardware, is to be used.
  • a process of registration of one or more users/profiles is performed. This registration process is carried out for each new user/human/subject.
  • the cloud server suite receives the pictures and metadata of the user/human/subject including dimensions of the body-parts as required by the physical activity installed in the user device/mobile application.
  • the cloud server suite processes the pictures, meta-data, and dimensions of the body part(s) of the user/human/subject to create a proportionately scaled 2D/3D rendering of the user/human/subject model.
  • the cloud server suite sends the correctly proportioned 2D/3D rendering of the user/human/subject model to the user device/MobileApp.
  • the cloud server suite also runs methods to propagate sharing functionality and send notifications of multiple kinds to the user(s), for example for sensor-data and meta-data uploads.
  • the cloud server suite receives and shares sensor-data bookmarks from other users, and software/firmware updates.
  • at step 707, methods of housekeeping of the MSTS Cloud Suite repository are performed.
  • at step 708, methods of managing the MSTS sensor units applicable to the one or more physical activities being monitored are performed by the MSTS Cloud Suite.
  • at step 709, methods of managing the MSTS processing unit being used are performed by the MSTS Cloud Suite.
  • the cloud server suite receives and stores one or more Sensor Data and meta-data being uploaded from MSTS processing unit and/or user device/MSTS MobileApp.
  • the cloud server suite computes and updates additional global metrics for all users (humans/subjects) of the corresponding physical activity, from the received sensor data, meta-data, metrics, and analysis.
  • the Resulting Global Metrics is stored in the MSTS Cloud Suite Repository.
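Updating a global metric as new session data arrives can be done incrementally, without reprocessing the whole repository. The sketch below is a hypothetical illustration of one such fold (running mean and best value); the patent does not specify which global metrics are computed or how.

```python
def update_global_metric(agg, new_value):
    """Fold one user's session metric into a global aggregate.

    agg is {"count": n, "mean": running_mean, "best": best_so_far};
    the mean uses the standard incremental formula m + (x - m) / (n + 1).
    """
    n = agg["count"] + 1
    mean = agg["mean"] + (new_value - agg["mean"]) / n
    return {"count": n, "mean": mean, "best": max(agg["best"], new_value)}
```

Each upload from a processing unit or user device would trigger one such update, and the resulting aggregate is what gets stored back into the repository.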
  • the cloud server suite computes additional features for the machine learning from the received sensor data, meta-data, metrics, and analysis.
  • the Resulting Global Machine Learning Training Database is also stored in the MSTS Cloud Suite Repository.
  • the cloud server suite determines global comparative metrics and analysis of the physical activity of the user based on the received global metrics and analysis by one or more external system(s) or third-party system(s) and/or standards and/or data inputs of physical movement or physical training or sport.
  • the cloud server suite sends the said Machine Learning Database updates on demand to the user device/MSTS MobileApp and the processing unit.
  • the processing unit stores the received updates of the Machine Learning Database in its repository for future usage.
  • the cloud server suite sends the said comparative metrics updates on demand to the user device/ MobileApp and the processing unit.
  • the processing unit and the MobileApp use the said updates of the metrics to provide global comparative metrics and analysis of the physical activity of the user/human/subject.

Abstract

The embodiments herein disclose a system and method for rendering captured data of one or more users. The system comprises one or more sensing units, at least one processing unit, a user device and a cloud server suite. The sensing unit is configured for capturing data of one or more users. The processing unit is connected to the sensing unit for processing the captured data received from the sensing unit(s). The processing unit is adapted to communicate with the sensing unit(s) for evaluating health of the sensing unit(s) and providing one or more predefined instructions. The user device is connected to the processing unit for receiving the processed data and rendering the processed data through one or more predefined user-specific models, thereby providing pervasive professional analysis and training to the one or more users during physical movement or physical training or sport.

Description

MOTION SENSE TECHNOLOGY SYSTEM
FIELD OF INVENTION
[001] The present disclosure relates to a motion sense technology system, and more specifically to a system and method for rendering captured data of one or more users. Further, the invention discloses a system of hardware embodiments of motion sensor units and a motion sensor processing unit for capturing and processing the motion data of any human/animal/object performing any physical activity, and for visualization and analysis of the same through a software application on a mobile/tablet/laptop or any display. The motion data can be captured in the fields of training of physiotherapy patients for their gait improvement, training and coaching of players of sports, as well as in other fields or forms of physical training, and can then be used as training inputs to assist individual(s)/medical professionals/coaches/trainers to train/coach in any activity involving physical movement or sport.
BACKGROUND
[002] Over the years, there has been a growing demand for gait analysis and physiotherapy/training for patients recovering from any kind of orthopedic condition/surgery. Still, in the process of recovering from any such orthopedic condition, there is a lack of scientifically measured data and analysis of the patient's condition and rate of improvement before and after going through such a procedure. Similarly, sports and physical training have become very competitive over the years, demanding a lot of detailed and continuous performance analysis. Though driven by an individual's mental and physical abilities and passion, learning and excelling in a sport is possible only through a process of methodical learning that encompasses continuous assessment of actual playing actions, SWOT analysis, as well as overcoming correctly spotted weaknesses in physical movements of the player's body and playing instruments in the actual playing conditions over a period of time. In this process, amongst several inhibiting factors, there are two main inhibiting factors.
[003] Firstly, in professional training/coaching for physical training/sports, as well as in personalized learning, good and intensive training activities are concentrated only in limited geographical pockets. More often than not, such training is not technically advanced, and yet it is becoming increasingly expensive. [004] Secondly, in terms of training and learning of a physical activity/sport, the technology currently used is mainly based on assessment through the human eye, capturing of still images, or basic videography, with the footage subsequently played back for analysis and understanding of the areas for improvement. Alternatively, advanced motion-capture systems use visual markers that are placed on one or more body parts of the human or subject whose activity/motion is to be analyzed. The motion of the human or subject is captured through special cameras equipped with the capability to capture the infrared light emitted onto the visual markers; the light reflected by such markers is then captured by this expensive camera system (usually a series of 6 to 8 such cameras placed at strategic locations in the activity environment). A few solutions mix these visual-marker and infrared-light camera systems with MEMS IMU sensors (such as accelerometer, gyroscope, and magnetometer sensors). However, all such equipment and its usage require all or some of the following essential environmental constraints and/or attributes: ambient quality of lighting, the right distance between the camera systems and the human or subject whose motion is to be analyzed, and the right placement of the camera systems at specific angles and distances for the correct focus and image/video quality for later processing.
All this requires placing the equipment in an intrusive manner in the field where the action is taking place, often in a lab-like environment that is not equivalent to the actual field of activity or play, thus affecting the conditions of real activity/play to some extent, as well as giving the individual/player a trigger/sense of acting/playing under simulated conditions. Some of this equipment can be used only under training conditions, which then does not reflect the actual playing/physical conditions needed for the player to play/exercise naturally without any inhibitions. Above all, besides being expensive, adequate human resources and intervention are needed to record physical/sport actions in such simulated/artificial settings. Additionally, given the nature of such equipment as well as the requirements of its placement and operating conditions, it is difficult to use in real-life conditions of body movements or professional match-playing conditions.
[005] Alternatively, there is videography equipment which can be strategically placed farther away from the actual field of play. Such equipment captures the right details of real-life match play. However, this approach uses a combination of very expensive equipment (e.g. a number of high-speed broadcast-quality cameras, and analysis and playback equipment) to capture views from various points in the field, and then to play back and analyze the physical sporting actions. The analysis of physical actions also requires special workstations and computing power. As a result, its use is restricted to only high-end professional matches/training activities which have a way to monetize it.
[006] Hence, there is truly a need for a solution that is inexpensive, portable, and usable in simulated as well as real activity/play conditions, which encompasses equipment as well as method(s) to analyze and learn the physical activity in actual settings.
SUMMARY
[007] Various embodiments herein describe a system for rendering captured data of one or more users. The system comprises one or more sensing units configured for capturing data of one or more users; at least one processing unit connected to the one or more sensing units for processing the captured data received from the one or more sensing units, wherein the at least one processing unit is adapted to communicate with the one or more sensing units for evaluating the health of the one or more sensing units and providing one or more predefined instructions; and a user device connected to the at least one processing unit for receiving the processed data and rendering the processed data through one or more predefined user-specific models, thereby providing pervasive professional analysis and training to the one or more users during physical movement or physical training or sport. [008] According to one embodiment, a cloud server suite connected to one of the at least one processing unit and the user device is configured to perform steps comprising communicating with the user device for registering the profile of one or more users, receiving pictures and meta-data of the one or more users including dimensions of one or more body parts as required for the physical movement or physical training or sport, and processing the received pictures, meta-data, and dimensions of the one or more body parts of each of the one or more users for creating one or more predefined models in 2D/3D, said model being proportionately scaled as per the received pictures and data.
[009] According to another embodiment, the one or more sensing units comprise one or more sensors, said sensor being selected from a group comprising accelerometer, gyroscope, magnetometer, pressure sensor, and altimeter sensors. [0010] According to yet another embodiment, the one or more sensing units are attached to one or more body parts of the user by a means of attachment. The means of attachment includes, but is not limited to, tape, band, adhesive, strap, etc.
[0011] According to yet another embodiment, the one or more sensing units are adapted to perform steps comprising receiving an instruction from at least one of the user device and the at least one processing unit; performing a self-diagnostics test to determine the health check status; storing the health check status locally; sharing the health check status with the at least one processing unit; receiving the evaluated health check status from the at least one processing unit based on a first set of predefined values; if the health check status failed, shutting down the one or more sensing units to rectify the functioning of said units by the intervention of the user or a system administrator; if the health check status passed, configuring the one or more sensing units based on the body parts of the user and one or more predefined actions to be performed by the user; capturing the data of the user based on the configuration; and sharing the captured data with the at least one processing unit. The system administrator, who is able to identify and rectify the functioning of the units, includes, but is not limited to, a human being and/or a smart module/firmware.
[0012] According to yet another embodiment, the one or more sensing units are adapted to perform the steps comprising capturing data of the one or more users, by motion sensors, a pressure sensor, and an altimeter sensor, in order of time and indices; filtering the captured data using a data filtering module; synchronizing the captured motion sensor data of the one or more users; recognizing the data of the motion sensors, pressure sensor, and altimeter sensor by a signal processing algorithm; computing quaternions, linear angles, and predefined state functions of the motion sensor data captured by one or more accelerometers, one or more gyroscopes, and magnetometers; computing data received from a piezo pressure sensor; blending the computed data of the piezo pressure sensor with the data of the motion sensor; computing data received from the altimeter sensor; blending the computed data of the altimeter sensor with the data of the motion sensor; storing the blended data in the one or more sensing units until purged; and transmitting the stored data to the at least one processing unit. In one embodiment, the signal processing algorithm includes a predefined set of instructions to recognize the data of the motion sensors, pressure sensor, and altimeter sensor. In one embodiment, the accelerometer is a high-G accelerometer. In one embodiment, the gyroscope is a high-range gyroscope. In one embodiment, the blending of data/computed data is a process whereby data from multiple sensors are merged into a single data set.
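The blending step above (merging readings from multiple sensors into a single data set) can be sketched as a merge of time-aligned streams. This is a hypothetical illustration, not the patent's blending process: the record layout and the choice to key records on motion-sensor timestamps are assumptions.

```python
def blend_streams(motion, pressure, altitude):
    """Merge time-aligned samples from multiple sensors into single records.

    Each argument maps timestamp -> reading. One record is emitted per
    motion-sensor timestamp; pressure/altitude readings without a matching
    timestamp are left as None.
    """
    blended = []
    for t, m in sorted(motion.items()):
        blended.append({
            "t": t,
            "motion": m,
            "pressure": pressure.get(t),
            "altitude": altitude.get(t),
        })
    return blended
```

The blended records are what would then be stored locally until purged and transmitted to the processing unit.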
[0013] According to yet another embodiment, the at least one processing unit is adapted to perform the steps comprising receiving instruction from one of the user device and cloud server suite; performing self-diagnostics test to determine health check status; storing the health check status locally; sharing the health check status with the user device; receiving the evaluated health check status based on a second set of predefined values; if the health check status failed, shutting down the processing unit to rectify the functioning of said unit by the intervention of the user or system administrator; and if the health check status passed, providing instruction to initialize and configure the one or more sensing units.
[0014] According to yet another embodiment, the at least one processing unit is adapted to perform the steps comprising communicating with the one or more sensing units to receive the captured data, wherein the captured data is derived from one or more motions/physical activities of the user; storing the captured data locally in the processing unit until purged; synchronizing the captured data received from the one or more sensors; processing the captured data of the one or more sensing units for machine learning; classifying the one or more motions/physical activities of the user based on predefined information stored in a master database; and determining metrics based on the synchronized and processed captured data, thereby providing the necessary inputs for rendering and displaying by the user device and for sending the same to the cloud server suite. In one embodiment, the necessary inputs include, but are not limited to, the starting and ending points of the activity derived from the above data, and key activity-based marker points within the activity data.
[0015] According to yet another embodiment, the at least one processing unit is adapted to provide one or more instructions along with data to the at least one of one or more sensing units through radio communication, Bluetooth/ Bluetooth Low Energy (BLE)/Wi-Fi.
[0016] According to yet another embodiment, the at least one processing unit is adapted to send or provide one or more instructions along with data to the at least one user device and/or the cloud server suite in an online mode over Wi-Fi complying with the IEEE 802.11 ac/b/g/n protocols and having a minimum data transmission/receiving range of 150 meters in the line of sight.
[0017] According to yet another embodiment, the at least one processing unit is triggered by a user interface in an offline mode to provide one or more instructions along with data to the one or more sensing units, said user interface includes at least one of a push button, touch screen, and voice user interface (VUI). [0018] According to yet another embodiment, the user device is adapted to perform the steps comprising determining one or more coordinates from the captured data for 2D/3D rendering of a user model; determining at least one of a trajectory of the motion, pressure and altitude of one or more body-parts of each time-frame from the captured data, thereby enabling micro-monitoring of body-parts individually and as a whole; and rendering the one or more determined coordinates, thereby displaying the metrics of the body-part(s) of the user through the predefined user model.
[0019] According to yet another embodiment, the cloud server suite connected to at least one of the processing unit and the user device configured to perform steps comprising communicating with the at least one of processing unit and user device for receiving data, meta-data, metrics and analysis; storing the received data, meta-data, metrics and analysis in a database of the cloud server suite; determining update on one or more predefined global metrics of one or more users for the corresponding physical movement or physical activity based on the received data, meta-data, metrics and analysis; storing the update on one or more predefined global metrics in the database of the cloud server suite; determining one or more predefined features of the machine learning from the data, meta-data, metrics and analysis; storing the determined one or more predefined features of the machine learning in the database of the cloud server suite; sending the determined one or more predefined features of the machine learning to at least one of processing unit and user device upon receiving the request for said features; storing the received one or more predefined features of the machine learning in the at least one processing unit for future usage; and sending the update on one or more predefined global metrics to at least one of the processing unit and user device, upon receiving the request for said global metrics. 
[0020] According to yet another embodiment, the cloud server suite connected to at least one of the processing unit and the user device is configured to perform steps comprising determining global comparative metrics and analysis of the physical activity of the user based on the global metrics and analysis received from the processing units and user devices of more than one user of said system; determining global comparative metrics and analysis of the physical activity of the user based on the global metrics and analysis received from one or more external system(s) or third-party system(s) and/or standards and/or data inputs of physical movement or physical training or sport; and sending the global comparative metrics and analysis of the physical activity of the user to the at least one of the processing unit and user device.
[0021 ] According to yet another embodiment, the one or more sensing units comprise at least one of one or more motion sensors, piezo pressure sensor, altimeter, memory unit, microprocessor, Bluetooth, wi-fi, radio frequency communication, power supply, re-chargeable battery, and battery charging unit.
[0022] According to yet another embodiment, the one or more processing units comprise at least one of USB interface, one or more Bluetooth communication, ethernet communication, memory unit, microprocessor, wi-fi, radio frequency communication, user display unit, power supply, re-chargeable battery, and battery charging unit.
[0023] According to yet another embodiment, the cloud server suite comprises at least one of a server application program interface (API), database, and one or more sets of predefined instructions.
[0024] According to yet another embodiment, one or more application program interfaces (APIs) of the processing unit and said system are adapted for enabling sensor data inputs from third-party Sensor Unit(s) based on predefined specifications.

[0025] According to yet another embodiment, the data is stored locally in the sensor unit, the processing unit, the user device and the cloud server suite, so as not to lose data integrity.
[0026] According to yet another embodiment, the cloud server suite is adapted to initiate and execute an Over-The-Air update of firmware/programming module(s) for one or more sensing units through at least one processing unit, wherein said update is executed with or without the mediation of at least one user device, through respective radio-communications among said units.
[0027] According to yet another embodiment, the cloud server suite is adapted to initiate and execute the Over-The-Air update of firmware/programming module(s) for one or more processing units, wherein said update is executed with or without the mediation of at least one user device, through respective radio-communications among said units.
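An Over-The-Air rollout of this kind would typically verify the staged firmware image before any unit flashes it. The Python sketch below illustrates one plausible checksum-gated relay from a processing unit to its sensor units; the function names, status strings, and manifest format are illustrative assumptions, not part of the disclosed system.

```python
import hashlib

def verify_firmware(image: bytes, expected_sha256: str) -> bool:
    """Reject an OTA image whose digest does not match the manifest entry."""
    return hashlib.sha256(image).hexdigest() == expected_sha256

def stage_ota_update(image: bytes, expected_sha256: str, targets: list) -> dict:
    """Relay a verified image to each target sensor unit.

    A failed checksum aborts the whole rollout, so no unit ever
    flashes a corrupted image; returns a per-target status map.
    """
    if not verify_firmware(image, expected_sha256):
        return {t: "aborted: checksum mismatch" for t in targets}
    return {t: "flashed" for t in targets}
```

In practice the processing unit would pull the image and manifest from the cloud server suite and forward the verified payload over Bluetooth or Wi-Fi; the verification gate is the essential step sketched here.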
[0028] Another embodiment of the present invention describes a method of rendering captured data of one or more users. The method comprises attaching one or more sensor units on one or more body parts of one or more users; providing one or more instructions to initialize one or more sensor units and at least one processing unit; configuring one or more sensor units according to the body parts to which the sensor units are attached and based on motions/physical activity to be captured, upon completion of initialization; capturing the motion/physical activity data of the one or more users; synchronizing the captured data received from the one or more users; processing the captured data to derive one or more items of information based on predefined parameters; and rendering the processed data based on one or more user-specific models, thereby providing pervasive professional analysis and training to the one or more users during physical movement, physical training or sport.
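The synchronization step in the method above must align samples captured independently by several sensor units. One simple alignment policy, sketched below in Python, trims every timestamped stream to the window covered by all streams; this trim-to-overlap rule is an illustrative assumption, not the claimed method.

```python
def synchronize(streams):
    """Trim each (timestamp, value) stream to the time window covered by
    all streams, so downstream processing sees a common time base."""
    start = max(s[0][0] for s in streams)   # latest first sample
    end = min(s[-1][0] for s in streams)    # earliest last sample
    return [[(t, v) for (t, v) in s if start <= t <= end] for s in streams]
```

A real implementation would also resample or interpolate within the window, since independent units rarely sample at identical instants.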
[0029] According to another embodiment, the method comprises communicating with the user device for registering profile of one or more users; receiving pictures and meta-data of the one or more users including dimensions of one or more body-parts as required for physical movement or physical training or sport; and processing the received pictures and meta-data and dimensions of the one or more body-parts of each of the one or more users for creating one or more predefined models in 2D/3D, wherein said model is proportionately scaled as per the received pictures and data.
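Proportionate scaling of a template model from measured body-part dimensions can be expressed as per-segment scale factors. The sketch below assumes a generic template of reference segment lengths; the segment names and values are illustrative only, not taken from the disclosure.

```python
# Reference segment lengths (cm) of a generic template model (illustrative).
TEMPLATE = {"upper_arm": 30.0, "forearm": 25.0, "thigh": 45.0}

def scale_factors(template, measured):
    """Per-segment scale factors for a proportionately scaled 2D/3D model.

    Segments the user did not measure keep the template proportion (1.0).
    """
    return {seg: measured[seg] / ref if seg in measured else 1.0
            for seg, ref in template.items()}
```

Applying these factors to the template skeleton yields a model scaled to the subject, as described for the 2D/3D rendering.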
[0030] These and other aspects of the example embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating example embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the example embodiments herein without departing from the spirit thereof, and the example embodiments herein include all such modifications.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
[0031] Embodiments herein are illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:

[0032] FIG. 1 illustrates a block diagram of a motion sense technology system (MSTS), according to an embodiment as disclosed herein;
[0033] FIG. 2 illustrates a block diagram of an MSTS sensor unit (101) of a motion sense technology system (MSTS), according to an embodiment as disclosed herein;
[0034] FIG. 3 illustrates a flow chart of a process performed by a sensor unit of a motion sense technology system (MSTS), according to an embodiment as disclosed herein;
[0035] FIG. 4 illustrates a block diagram of a processing unit of a motion sense technology system (MSTS), according to an embodiment as disclosed herein;
[0036] FIG. 5 illustrates a flow chart of a process performed by a processing unit of a motion sense technology system (MSTS), according to an embodiment as disclosed herein;
[0037] FIG. 6 illustrates a flow chart of a process performed by a user device of a motion sense technology system (MSTS), according to an embodiment as disclosed herein; and
[0038] FIG. 7 illustrates a flow chart of a process performed by a cloud server suite of a motion sense technology system (MSTS), according to an embodiment as disclosed herein.
DETAILED DESCRIPTION OF THE DRAWINGS
[0039] The example embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The description herein is intended merely to facilitate an understanding of ways in which the example embodiments herein can be practiced and to further enable those of skill in the art to practice the example embodiments herein. Accordingly, this disclosure should not be construed as limiting the scope of the example embodiments herein.

[0040] The embodiments herein disclose a method and system of hardware embodiments of Sensor units and Sensor processing units for capturing and processing of the motion data of any human/animal/object doing any physical activity, and for visualization and analysis of the same through a user device/software application on a mobile/tablet/laptop or any display. The motion data is captured in the fields of training of physiotherapy patients for their gait improvement, training and coaching of players of sports, as well as in other fields or forms of physical training, and is then used as training inputs to assist individual(s)/medical professionals/coaches/trainers to train/coach in any activity involving physical movement or sport.
[0041] This invention aims to solve such problems by offering an affordable, easily accessible and pervasive professional analysis and training/coaching system for learning and excelling in the field of physical training or a chosen sport. The invention is used for holistic physical training or sports: by professional physiotherapists to train patients to regain and improve their gait, and by coaches to train professional or amateur athletes. The invention also assists an individual to capture their activity and helps them learn from the visual analysis provided by this system, by adding their own sensor units through special methods and capturing data through those devices.
[0042] FIG. 1 illustrates a block diagram of a motion sense technology system (MSTS) (100), according to an embodiment as disclosed herein. The system (100) includes, but is not limited to, one or more sensing units (101), one or more processing units (102), one or more user devices (103) and a cloud server suite (104). In one embodiment, the motion sensor includes, but is not limited to, one or more accelerometers, one or more gyroscopes, and magnetometers.

[0043] The Motion Sense Technology System (MSTS) (100) consists of specifically designed hardware, firmware and/or software, and addresses the needs of medical professionals, amateurs as well as professionals of a sport through real-time visual feedback of the actual physical activity/play in 2-Dimensional (2D) and 3-Dimensional (3D) rendering, which will be instantly available on their chosen mobile device(s). This means that the feedback is based on a trainee's or player's actual physical/playing actions in the area/field and not under any simulated conditions. This cloud-based system will provide specific visual analysis and data-points which are necessary for methodical learning and practice of the activity or sport. Through its unique Artificial Intelligence (AI) solution in a continuous assessment lifecycle and achievement of guided goals, the MSTS system will help in analyzing and improving performance in specific areas of the physical activity or sport.

[0044] In the beginning, the aim is to create a solution for the sport of Cricket. Going forward, with the given embodiments of similar nature, containing the hardware equipment and software solution, the system can be applied to Physiotherapy, mechanical physical activity in an enterprise or factory environment, dance forms, and other physical learning or performance-based activities, etc.
The system will also provide useful visual feedback and analysis, as well as a guided learning path, in areas of sports such as Tennis, Golf, Badminton, Table-Tennis, Baseball, Football, Running, Cycling, Swimming, and Basketball.
[0045] In one embodiment, for third-party developers, the MSTS system is adapted to provide flexibility to add their own sensor units that are used to capture specific motion data, as well as to get the result-feed from the MSTS system for their analysis, as per the guidelines to be provided. The MSTS system is adapted to accept plug-in of third-party sensor units through the use of MSTS Application Programmable Interfaces (APIs). These predefined MSTS system APIs provided by the MSTS system of the present invention offer flexibility for innovation and integration by a large developer community. The MSTS system provides the APIs and guidelines for their usage and for fetching the result-feed. There is no such existing system that allows developers to readily plug in their own sensors or to use the motion-data from the sensors for developing their applications or solutions.
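A third-party plug-in interface of the kind described would need, at minimum, a way for a sensor unit to declare the fields it emits and for the system to validate incoming samples against that declaration. The class below is a hypothetical sketch of such a registry; none of these names or behaviors are taken from the actual MSTS APIs.

```python
class SensorRegistry:
    """Hypothetical plug-in registry: third-party units declare the fields
    they emit, and incoming samples are validated against that declaration
    before entering the processing pipeline."""

    def __init__(self):
        self._specs = {}

    def register(self, sensor_id, fields):
        """Declare the set of fields a sensor unit will emit."""
        self._specs[sensor_id] = set(fields)

    def accept(self, sensor_id, sample):
        """Accept a sample only from a registered unit that supplies
        every declared field."""
        spec = self._specs.get(sensor_id)
        return spec is not None and spec <= set(sample)
```

A published guideline document, as mentioned above, would fix the real field names, units, and timing requirements that such a registry enforces.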
[0046] In one embodiment, the cloud server suite is accessible via the internet to the processing unit and/or user device for all operations. The processing unit can connect to the cloud server suite via Wi-Fi communication. The user device can access the cloud server suite via Wi-Fi or other means of internet access.
[0047] FIG. 2 illustrates a block diagram of an MSTS sensor unit (101) of a motion sense technology system (MSTS) (100), according to an embodiment as disclosed herein. The MSTS sensing unit includes, but is not limited to, at least one of one or more motion sensors (201), piezo pressure sensor (202), and altimeter sensor (203). The MSTS sensor unit (101) also includes memory unit (204), microprocessor/micro-controller (205), Bluetooth (206), wi-fi/radio frequency communication (207), touch and display unit interface or display unit interface (208), power supply (209), re-chargeable battery (210), and battery charging unit (211). In one embodiment, "Bluetooth" means "Bluetooth, Bluetooth Low Energy, Bluetooth 5" for "Sensing units" and "Processing Unit".
[0048] The choice of the one or more sensor units (101) being used in the MSTS system (100) depends upon the type of application: the motion sensor (201) is used to capture motion data, the piezo pressure sensor (202) is used for capturing data related to change in pressure, and the altimeter is used to measure the altitude of an object above a fixed level. The MSTS sensor unit (101) is able to process the captured data using the inbuilt microprocessor (205) and store the same into the memory unit (204). The MSTS sensor unit (101) is able to communicate or share the stored data with the MSTS processing unit using Bluetooth (206) or wi-fi/radio frequency communication (207). The re-chargeable battery (210) provides appropriate preconfigured power for proper functioning of the MSTS sensor unit (101). The re-chargeable battery (210) is charged with the battery charging unit (211) when battery power goes below a predefined value or based on the instruction received from the MSTS processing unit or user device.
[0049] FIG. 3 illustrates a flow chart of a process performed by a sensor unit (101) of a motion sense technology system (MSTS) (100), according to an embodiment as disclosed herein. The sensing unit is configured for capturing the data of a user to which it is attached, based on the configuration. At step 301, the sensor unit wakes or gets activated from sleep mode on receiving a trigger, such as by pressing a start button or by shaking the sensor unit within a predefined time period of last usage, such as within a few minutes. At step 302, a check is performed to determine whether the sensor unit is active or still in sleep mode. If the sensor unit is still in sleep mode, step 301 is repeated. If the sensor unit is active, a self-diagnostics test is initiated at step 303. The self-diagnostic test includes, but is not limited to, battery level, states of one or more sensors, status of radio communication, and status of storage/memory/memory card. At step 304, the result of the self-diagnostic test (i.e. health check status) is stored locally in the memory unit and at the same time the health check status is shared with the processing unit for further processing. At step 305, a check is performed whether the health check status passed. If the health check status failed, the sensor unit is shut down/closed to rectify the error or shortcoming, such as for charging the battery of the sensor unit. If the health check status passed, the microprocessor of the sensor unit performs a calibration process for the sensors (at step 306), such as 9-DoF Accelerometer, Gyroscope, Magnetometer, High-G Accelerometer, High Range Gyroscope, Pressure Sensor, and Altimeter. Once calibration is performed, the sensors are attached on a predefined part of the user (at step 307) based on the data to be captured. At step 308, the sensor units are configured for one or more session(s) of the physical activity until any change.
The configuration includes, but is not limited to, identifying and associating the Sensor unit with the body-part. At step 309, the motion sensors, pressure sensor, and altimeter sensor play and capture data in order of time and indices. At step 310, a process of recognition of the captured data is carried out through signal processing and data filters. At step 311, computation of Quaternions, Linear Angles, and other necessary state functions is performed. At step 312, the captured data such as pressure data is converted into a predefined form or meaningful values, so as to blend with the rest of the captured motion sensor data. At step 313, the captured data such as altimeter data is converted into a predefined form or meaningful values, so as to blend with the rest of the captured motion sensor data. At step 314, the sensor data along with computed data and meta-data is stored locally in the memory of the sensor unit, until getting purged. At step 315, the stored data are transmitted to the MSTS Processing unit.

[0050] FIG. 4 illustrates a block diagram of a processing unit (102) of a motion sense technology system (MSTS) (100), according to an embodiment as disclosed herein. The processing unit (102) comprises at least one of Universal Serial Bus (USB) interface (401), Bluetooth communication (402), ethernet communication (403), memory unit (404), microprocessor (405), wi-fi/radio frequency communication (406), user display unit (407), power supply (408), re-chargeable battery (409), and battery charging unit (410).
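The quaternion computation at step 311 of FIG. 3 can be illustrated with a first-order integration of gyroscope rates. This is a generic sensor-fusion sketch, assuming absolute angular rates in rad/s and a unit quaternion state; it is not the system's actual filter.

```python
import math

def integrate_gyro(q, w, dt):
    """One first-order step of quaternion integration from gyroscope rates.

    q = (qw, qx, qy, qz) unit quaternion, w = (wx, wy, wz) in rad/s.
    Computes q <- normalize(q + 0.5 * dt * (q quaternion-product (0, w))).
    """
    qw, qx, qy, qz = q
    wx, wy, wz = w
    dq = (0.5 * dt * (-qx * wx - qy * wy - qz * wz),
          0.5 * dt * ( qw * wx + qy * wz - qz * wy),
          0.5 * dt * ( qw * wy - qx * wz + qz * wx),
          0.5 * dt * ( qw * wz + qx * wy - qy * wx))
    q2 = tuple(a + b for a, b in zip(q, dq))
    n = math.sqrt(sum(c * c for c in q2))   # renormalize to a unit quaternion
    return tuple(c / n for c in q2)
```

Real firmware would fuse accelerometer and magnetometer corrections into this gyro prediction (e.g. a complementary or Kalman filter) to bound drift.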
[0051] The processing unit (102) communicates with the sensor unit (101) and the user device (103) through one or more communication means such as USB interface, Bluetooth, ethernet, Wi-Fi, and radio frequency communication. The data received from the sensor unit are processed by the microprocessor of the processing unit and subsequently stored locally in the memory unit of the processing unit.
[0052] FIG. 5 illustrates a flow chart of a process performed by a processing unit (102) of a motion sense technology system (MSTS) (100), according to an embodiment as disclosed herein. At step 501, the MSTS processing unit comes to an active state from a power-down stage, or from sleep mode after a period of inactivity, by pressing a start button or reset button. At step 502, a check is performed whether the processing unit is in an active state. If the processing unit is still in sleep mode, step 501 is repeated. If the processing unit is found to be in an active state, self-diagnostics tests are performed (at step 503), which include, but are not limited to, checks for battery level, states of all the on-board components, radio-communication with the MSTS Sensor unit(s), long-range radio-communication with the user device (which includes the MSTS MobileApp), storage/memory unit/memory card, time-function, and time-sync with the MSTS Sensor unit(s). At step 504, the results of the self-diagnostic tests, health check status, and reports to the MSTS MobileApp and MSTS Cloud suite are stored locally in the memory unit for further processing. At step 505, a check is performed whether the health check status of the processing unit passed. If the health check status of the processing unit failed, the processing unit is shut down/closed so that the user can rectify the error or shortcoming, such as charging of the battery or any other issue. If the health check status passed, the processing unit communicates with the MSTS Sensor unit(s) to make them ready to capture motion data at step 506. At step 507, the one or more sensor units are placed on the body-part(s) of the user/human/subject as per their respective associated positions. At step 508, a first check is performed whether the processing unit and the sensor unit(s) are ready for performing necessary actions.
If the first check at step 508 fails, a second check at step 509 is performed whether the communication from the processing unit and the process to make the devices ready have already been repeated twice. If the second check at step 509 failed, step 506 is repeated. If the second check at step 509 passed, the user is provided, at step 510, an alert to check with the user device (more specifically the MSTS MobileApp) to fix the issue, or to report it if not fixed. If the first check at step 508 passed, a check is performed, at step 511, whether the MSTS processing unit is running in an online mode. If no at step 511, the processing unit performs the function in an offline mode. In the offline mode, at step 512, the processing unit is triggered by a user interface to provide one or more instructions along with data to the one or more sensing units and/or cloud server suite. The user interface includes at least one of a push button, touch screen, and voice user interface (VUI). If yes at step 511, the processing unit performs the function in an online mode. In the online mode, at step 513, the processing unit sends command(s) to the sensor unit to initiate the process of capturing sensor data after communication from the MSTS MobileApp or user device. At step 514, the processing unit communicates with the sensor unit and receives captured data or sensor data over radio-communication. At step 515, the sensor data received from the one or more sensor units are stored locally in the memory unit until purged. At step 516, the processing unit is adapted to process the received sensor data. The processing of sensor data includes, but is not limited to, synchronizing the sensor-data received from all the MSTS Sensor unit(s), computing features of all sensor data for machine learning, running the computed features to match against the master database to classify the activity/motion for the user/human/subject, and computing metrics based on the synchronized data-set.
At step 517, the processing unit stores all such sensor data along with computed data and the meta-data in the memory unit locally until purged. At step 518, the processing unit transmits sensor data to the MSTS MobileApp/user device and/or MSTS Cloud suite, when connected with them through radio-communication.
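The feature computation and master-database matching described for step 516 can be sketched with simple per-window statistics and a nearest-prototype match. The features (mean and RMS of acceleration magnitude), labels, and prototype values below are illustrative assumptions, not the system's actual machine-learning pipeline.

```python
import math

def extract_features(samples):
    """Per-window features from (x, y, z) samples: mean and RMS of the
    magnitude signal."""
    mags = [math.sqrt(x * x + y * y + z * z) for (x, y, z) in samples]
    mean = sum(mags) / len(mags)
    rms = math.sqrt(sum(m * m for m in mags) / len(mags))
    return (mean, rms)

def classify(features, master_db):
    """Match a feature vector against labelled prototypes by nearest
    Euclidean distance (a minimal stand-in for the master database)."""
    return min(master_db, key=lambda label: math.dist(features, master_db[label]))
```

A production classifier would use a trained model over many more features, but the classify-against-stored-prototypes structure mirrors the step described above.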
[0053] FIG. 6 illustrates a flow chart of a process performed by a user device (103) of a motion sense technology system (MSTS) (100), according to an embodiment as disclosed herein. At step 601, the user device is configured for processing data received from the processing unit for a particular physical activity by the user, according to an embodiment of the present invention. The physical activity includes, but is not limited to, sport, physiotherapy, gait, etc. In another embodiment, the MSTS MobileApp is installed in the user device. The user device includes, but is not limited to, a mobile phone, tablet, laptop, and computing device. At step 602, a process of registration of the MSTS Sensor unit and MSTS Processing unit is performed in the user device. This is a one-time process in the MSTS system for the same set of devices. At step 603, a process of user/profile registration is performed for each new user/human/subject. At step 604, a check is performed for availability of a 3D rendering of the human/subject for the user. If no at step 604, a process is performed (at step 605) to capture the user's physical profile pictures, including dimensions of the body-parts as required by the physical activity, through the user device. At step 606, the user device sends the captured pictures (from step 605), along with the calculated dimensions of the human/subject's body-parts, to the MSTS cloud suite for the purpose of creating a proportionately scaled 2D/3D rendering of the human/subject model for the user. At step 607, the user device fetches the proportionate 2D/3D rendering of the human/subject's model received from the MSTS Cloud Suite. At step 608, the one or more sensor units and the processing unit are triggered by switching on the power. At step 609, a check is performed whether the sensor units and the processing unit are functional.
If no at step 609, step 608 is repeated. If yes at step 609, self-diagnostics tests are initiated for the sensor unit(s), processing unit, and user device to derive health check status at step 610. The user device also checks the availability of radio communication and the cloud server suite at step 610. At step 611, the results of the self-diagnostic tests and health check status are stored locally in the memory unit. The status is also shared with the processing unit for further processing. At step 612, a check is performed whether the MSTS system is ready, i.e. whether all checks are passed. If no at step 612, the sensor unit is shut down/closed for further action by the user, e.g. charging. If yes at step 612, calibration of the one or more sensor units is initiated, along with syncing and association with the user, at step 613.
[0054] At step 614, one or more sensor units are attached on one or more body-parts of the user/human/subject. At step 615, the one or more sensor units are identified and associated with the one or more body-parts through an automated identification process for one or more sessions of the physical movement/physical activity until any change. At step 616, the user device sends command(s) to the one or more sensor units through the processing unit to initiate capturing of sensor data in online mode. At step 617, the user device receives the captured data or sensor data of the one or more sensor units from one or more processing units. At step 618, the user device computes the coordinates from the received data for the 2D/3D rendering of the human/subject model, along with the trajectory of the motion, pressure and altitude of body-part(s) of each time-frame. Then rendering is performed for all such coordinates in 2D/3D. At step 619, the user device computes and displays the metrics and angles of movements of the body-part(s) of the human/subject. At step 620, based on the computation and rendering of coordinates in 2D/3D, the user device prepares, computes and renders a comparative analysis of motion(s) of each time-frame for at least two or more data-sets of the same user or multiple users. At step 621, the user device receives a user-selected data-set for the same human/subject or one or more other human(s)/subject(s) from the cloud server suite. Then, the user device renders the advanced comparative analysis using said method of computation of coordinates, metrics and analysis for time-frames and indices. At step 622, the user device stores all said data, computed metrics and meta-data locally until getting purged. At step 623, the user device transmits all stored data to the MSTS Cloud Suite.
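The coordinate computation of step 618 amounts to forward kinematics: chaining scaled segment lengths and per-frame joint angles from a root joint to obtain each joint's render position. A minimal 2D Python sketch follows; the segment list and the absolute-angle convention are illustrative assumptions, not the system's actual renderer.

```python
import math

def limb_coordinates(origin, segments):
    """2D forward kinematics: chain (length, absolute angle in radians)
    pairs from a root joint and return each joint's (x, y) coordinates."""
    points = [origin]
    x, y = origin
    for length, angle in segments:
        x += length * math.cos(angle)
        y += length * math.sin(angle)
        points.append((x, y))
    return points
```

Repeating this per time-frame over the quaternion-derived angles yields the trajectory that the user device renders and compares across data-sets.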
[0055] FIG. 7 illustrates a flow chart of a process performed by a cloud server suite (104) of a motion sense technology system (MSTS) (100), according to an embodiment as disclosed herein. At step 701, a process of registration of the one or more sensing units and the one or more processing units is performed by the cloud server suite. This process is performed one time if the complete MSTS system is to be used, including the MSTS hardware. At step 702, a process of registration of one or more users/profiles is performed. This registration process is for each new user/human/subject. At step 703, once the connection is established with the user device based on the demand/request, the cloud server suite receives the pictures and meta-data of the user/human/subject, including dimensions of the body-parts as required by the physical activity, from the user device/mobile application. At step 704, the cloud server suite processes the pictures and meta-data and dimensions of the body-part(s) of the user/human/subject to create a proportionately scaled 2D/3D rendering of the user/human/subject model. Then, at step 705, the cloud server suite sends the correctly proportioned 2D/3D rendering of the user/human/subject model to the user device/MobileApp.

[0056] At step 706, the cloud server suite also runs methods to propagate sharing-functionality and send notifications of multiple kinds to the user(s), for example for sensor-data and meta-data uploads. The cloud server suite receives and shares sensor-data bookmarks from other users, and software/firmware updates.
[0057] At step 707, methods of housekeeping of the MSTS Cloud Suite Repository are performed.
[0058] At step 708, methods of managing the MSTS Sensor units, applicable for the one or more physical activities being monitored, are performed by the MSTS Cloud Suite.

[0059] At step 709, methods of managing the MSTS Processing unit being used are performed by the MSTS Cloud Suite.
[0060] At step 710, the cloud server suite receives and stores one or more sensor data and meta-data being uploaded from the MSTS processing unit and/or user device/MSTS MobileApp. At step 711, the cloud server suite computes and updates additional global metrics for all users (humans/subjects) of the corresponding physical activity from the received sensor-data and meta-data, metrics and analysis. The resulting global metrics are stored in the MSTS Cloud Suite Repository. At step 712, the cloud server suite computes additional features of the machine learning from the received sensor-data and meta-data, metrics and analysis. The resulting global machine learning training database is also stored in the MSTS Cloud Suite Repository. At step 713, the cloud server suite determines global comparative metrics and analysis of the physical activity of the user based on the received global metrics and analysis by one or more external system(s) or third-party system(s) and/or standards and/or data inputs of physical movement or physical training or sport. At step 714, the cloud server suite sends the said machine learning database updates on demand to the user device/MSTS MobileApp and the processing unit. The processing unit stores the received updates of the machine learning database in its repository for future usage. At step 715, the cloud server suite sends the said comparative metrics updates on demand to the user device/MobileApp and the processing unit. The processing unit and MobileApp use the said updates of the metrics to provide global comparative metrics and analysis of the physical activity of the user/human/subject.
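Updating a global metric as each session uploads, without rescanning the whole repository, can be done incrementally. The running-mean state below is a sketch of one such aggregation; the choice of metric and the (count, mean) state layout are illustrative assumptions.

```python
def update_global_metric(state, value):
    """Incremental (Welford-style) mean update so the cloud suite can fold
    each uploaded session metric into the global figure without rescanning
    all stored sessions. state is (count, mean)."""
    count, mean = state
    count += 1
    mean += (value - mean) / count
    return (count, mean)
```

The same fold-in pattern extends to variance, minima/maxima, or percentile sketches, which is what makes per-upload global-metric updates cheap at scale.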
[0061 ] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.

Claims

We claim:
1. A system for rendering captured data of one or more users, comprising:
one or more sensing units configured for capturing data of one or more users;
at least one processing unit connected to the one or more sensing units for processing the captured data received from the one or more sensing units, wherein the at least one processing unit is adapted to communicate with the one or more sensing units for evaluating health of the one or more sensing units and providing one or more predefined instructions; and
a user device connected to the at least one processing unit for receiving the processed data and rendering the processed data through one or more predefined user-specific models, thereby providing pervasive professional analysis and training to the one or more users during physical movement or physical training or sport.
2. The system as claimed in claim 1 further comprising a cloud server suite connected to one of the at least one processing unit and the user device configured to perform steps comprising:
communicating with the user device for registering profile of one or more users;
receiving pictures and meta-data of the one or more users including dimensions of one or more body-parts as required for physical movement or physical training or sport; and
processing the received pictures and meta-data and dimensions of the one or more body-parts of each of the one or more users for creating one or more predefined models in 2D/3D, wherein said model is proportionately scaled as per the received pictures and data.
3. The system as claimed in claim 1, wherein the one or more sensing units comprise one or more sensors, said sensor being selected from a group comprising accelerometer, gyroscope, magnetometer, pressure sensor, and altimeter sensors.
4. The system as claimed in claim 1, wherein the one or more sensing units are attached to body parts of the user by means of attachment.
5. The system as claimed in claim 1, wherein the one or more sensing units are adapted to perform steps comprising:
receiving instruction from at least one of the user device and the at least one processing unit;
performing a self-diagnostics test to determine health check status;
storing the health check status locally;
sharing the health check status with the at least one processing unit;
receiving the evaluated health check status from the at least one processing unit based on a first set of predefined values;
if the health check status failed, shutting down the one or more sensing units to rectify the functioning of said unit by the intervention of the user or system administrator;
if the health check status passed, configuring the one or more sensing units based on the body parts of the user and one or more predefined actions to be performed by the user;
capturing the data of the user based on the configuration; and
sharing the captured data with the at least one processing unit.
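A minimal sketch of the health-check branching in the claim above. The status fields, thresholds (standing in for the "first set of predefined values"), and sampling rates are hypothetical; the claim only fixes the pass/fail behaviour.

```python
# Hedged sketch of the sensing-unit lifecycle: evaluate health status,
# shut down on failure, otherwise configure for the attached body part.
# All keys, thresholds, and rates below are illustrative assumptions.

FIRST_SET = {"battery_pct": 20, "imu_self_test": 1}  # assumed pass thresholds

def evaluate_health(status, thresholds=FIRST_SET):
    """Compare a reported health-check status against predefined values."""
    ok = all(status.get(key, 0) >= floor for key, floor in thresholds.items())
    return "PASS" if ok else "FAIL"

def sensing_unit_flow(status, body_part):
    """Shut down on a failed check; otherwise configure and start capture."""
    if evaluate_health(status) == "FAIL":
        return {"state": "SHUTDOWN", "needs_intervention": True}
    rate_hz = 200 if body_part == "wrist" else 100  # illustrative rates
    return {"state": "CAPTURING",
            "config": {"body_part": body_part, "rate_hz": rate_hz}}
```
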
6. The system as claimed in claim 1, wherein the one or more sensing units are adapted to perform the steps comprising:
capturing data, by motion sensors, a pressure sensor, and an altimeter sensor, of the one or more users in order of time and indices;
filtering the captured data using a data filtering module;
synchronizing the captured data of the motion sensor of the one or more users;
recognizing the data of the motion sensors, pressure sensor, and altimeter sensor by a signal processing algorithm;
computing quaternions, linear angles, and predefined state functions of the motion sensor data captured by one or more accelerometers, one or more gyroscopes, and one or more magnetometers;
computing data received from a piezo pressure sensor;
blending the computed data of piezo pressure sensor with the data of the motion sensor;
computing data received from altimeter sensor;
blending the computed data of altimeter sensor with the data of the motion sensor;
storing the blended data in the one or more sensing units until purged; and
transmitting the stored data to the at least one processing unit.
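The claim names quaternion computation and altimeter blending without fixing an algorithm. A minimal sketch under stated assumptions: plain gyroscope integration stands in for the quaternion step (production fusion typically uses a filter such as Madgwick or Mahony that also blends accelerometer and magnetometer data), and the international-standard-atmosphere barometric formula converts altimeter pressure to height.

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def integrate_gyro(q, gyro_rad_s, dt):
    """One orientation step: rotate q by the body-frame angular rate over dt."""
    wx, wy, wz = gyro_rad_s
    mag = math.sqrt(wx*wx + wy*wy + wz*wz)
    if mag < 1e-12:
        return q  # no measurable rotation this step
    half = 0.5 * mag * dt
    s = math.sin(half) / mag
    return quat_mul(q, (math.cos(half), wx*s, wy*s, wz*s))

def pressure_to_altitude(p_pa, p0_pa=101325.0):
    """Standard-atmosphere conversion from pressure (Pa) to altitude (m)."""
    return 44330.0 * (1.0 - (p_pa / p0_pa) ** 0.1903)
```

Blending, as claimed, would then attach the pressure- and altitude-derived channels to each time-indexed orientation sample before storage.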
7. The system as claimed in claim 1, wherein the at least one processing unit is adapted to perform the steps comprising:
receiving instruction from one of the user device and cloud server suite;
performing a self-diagnostics test to determine health check status;
storing the health check status locally;
sharing the health check status with the user device;
receiving the evaluated health check status based on a second set of predefined values;
if the health check status failed, shutting down the processing unit to rectify the functioning of said unit by the intervention of the user or system administrator; and
if the health check status passed, providing instruction to initialize and configure the one or more sensing units.
8. The system as claimed in claim 1, wherein the at least one processing unit is adapted to perform the steps comprising:
communicating with the one or more sensing units to receive the captured data, wherein the captured data is derived from one or more motions/physical activities of the user;
storing the captured data locally in the processing unit until purged;
synchronizing captured data received from the one or more sensors;
processing the captured data of the one or more sensing units for machine learning;
classifying the one or more motions/physical activities of the user based on predefined information stored in a master database; and
determining metrics based on the synchronized and processed captured data, thereby providing necessary inputs for rendering and displaying by the user device and sending the same to the cloud server suite.
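The claim leaves the classification algorithm open. As one hedged illustration, a nearest-centroid classifier over simple statistical features can stand in for the "predefined information stored in a master database"; the activity names, features, and centroid values below are all hypothetical.

```python
# Sketch: classify a window of accelerometer-magnitude samples by the
# nearest per-activity feature centroid. Centroids are assumptions, not
# values from the patent.
import math
import statistics

def features(window):
    """Feature vector for one window: (mean, population std deviation)."""
    return (statistics.mean(window), statistics.pstdev(window))

MASTER_DB = {  # hypothetical per-activity centroids in feature space
    "rest":   (1.0, 0.02),
    "walk":   (1.1, 0.25),
    "sprint": (1.6, 0.80),
}

def classify(window):
    """Return the activity whose centroid is closest to the window's features."""
    f = features(window)
    return min(MASTER_DB, key=lambda activity: math.dist(f, MASTER_DB[activity]))
```

A deployed system would learn the centroids (or a richer model) from labelled training data rather than hard-code them.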
9. The system as claimed in claim 1, wherein the at least one processing unit is adapted to provide one or more instructions along with data to the at least one of one or more sensing units through radio communication, Bluetooth/Bluetooth Low Energy (BLE)/Wi-Fi.
10. The system as claimed in claim 1, wherein the at least one processing unit is adapted to send or provide one or more instructions along with data to the at least one user device and/or the cloud server suite in an online mode over Wi-Fi complying with the IEEE 802.11 ac/b/g/n protocol and having a minimum data-transmission/receiving range of 150 meters in the line of sight.
11. The system as claimed in claim 1, wherein the at least one processing unit is triggered by a user interface in an offline mode to provide one or more instructions along with data to the one or more sensing units, said user interface includes at least one of a push button, touch screen, and voice user interface (VUI).
12. The system as claimed in claim 1, wherein the user device is adapted to perform the steps comprising:
determining one or more coordinates from the captured data for 2D/3D rendering of a user model;
determining at least one of a trajectory of the motion, pressure and altitude of one or more body-parts of each time-frame from the captured data, thereby enabling micro-monitoring of body-parts individually and as a whole; and
rendering the one or more determined coordinates, thereby displaying the metrics of the body-part(s) of the user through the predefined user model.
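The coordinate-determination step of claim 12 can be illustrated with a two-dimensional forward-kinematics chain: per-segment angles (as could be derived from the sensing units' orientations) are turned into joint coordinates for rendering. The chain layout and angle convention here are assumptions; the claim covers 2D/3D rendering generally.

```python
# Sketch: walk a kinematic chain of (length, absolute angle) segments
# from an origin and return each joint's (x, y) coordinate.
import math

def chain_coordinates(origin, segments):
    """segments: list of (length_m, absolute_angle_rad). Returns the joint
    coordinates of the chain, starting at origin."""
    pts = [origin]
    x, y = origin
    for length, ang in segments:
        x += length * math.cos(ang)
        y += length * math.sin(ang)
        pts.append((x, y))
    return pts
```

Tracking these coordinates frame by frame yields the per-body-part trajectories the claim uses for micro-monitoring.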
13. The system as claimed in claim 2, wherein the cloud server suite connected to at least one of the processing unit and the user device configured to perform steps comprising:
communicating with the at least one of processing unit and user device for receiving data, meta-data, metrics and analysis;
storing the received data, meta-data, metrics and analysis in a database of the cloud server suite;
determining an update on one or more predefined global metrics of one or more users for the corresponding physical movement or physical activity based on the received data, meta-data, metrics and analysis;
storing the update on one or more predefined global metrics in the database of the cloud server suite;
determining one or more predefined features of the machine learning from the data, meta-data, metrics and analysis;
storing the determined one or more predefined features of the machine learning in the database of the cloud server suite;
sending the determined one or more predefined features of the machine learning to at least one of processing unit and user device upon receiving the request for said features;
storing the received one or more predefined features of the machine learning in the at least one processing unit for future usage; and
sending the update on one or more predefined global metrics to at least one of the processing unit and user device, upon receiving the request for said global metrics.
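The cloud-server-suite behaviour in claim 13 can be sketched with an in-memory stand-in for its database; a real deployment would sit behind the server API with durable storage. The class name, metric shape, and "best score" global-metric rule are hypothetical.

```python
# Minimal sketch of the claim-13 flow: ingest metrics, maintain a global
# per-user metric, and serve stored machine-learning features on request.

class CloudSuite:
    def __init__(self):
        self.db = {"metrics": {}, "ml_features": {}}  # in-memory stand-in

    def ingest(self, user_id, metrics):
        """Store received metrics and update a global per-user best score."""
        best = self.db["metrics"].get(user_id, float("-inf"))
        self.db["metrics"][user_id] = max(best, metrics["score"])

    def store_features(self, name, payload):
        """Persist determined machine-learning features."""
        self.db["ml_features"][name] = payload

    def fetch_features(self, name):
        """Serve features to a processing unit or user device on request."""
        return self.db["ml_features"].get(name)
```
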
14. The system as claimed in claim 2, wherein the cloud server suite connected to at least one of the processing unit and the user device configured to perform steps comprising:
determining global comparative metrics and analysis of the physical activity of the user based on the received global metrics and analysis by at least one of the processing unit and user devices of more than one user of said system; and
determining global comparative metrics and analysis of the physical activity of the user based on the received global metrics and analysis by one or more external system(s) or third-party system(s) and/or standards and/or data inputs of physical movement or physical training or sport; and
sending the global comparative metrics and analysis of the physical activity of the user to the at least one of the processing unit and user device.
15. The system as claimed in claim 1, wherein the one or more sensing units comprise at least one of one or more motion sensors, piezo pressure sensor, altimeter, memory unit, microprocessor, Bluetooth, Wi-Fi, radio frequency communication, power supply, re-chargeable battery, and battery charging unit.
16. The system as claimed in claim 1, wherein the one or more processing units comprise at least one of USB interface, one or more Bluetooth communication, ethernet communication, memory unit, microprocessor, Wi-Fi, radio frequency communication, user display unit, power supply, re-chargeable battery, and battery charging unit.
17. The system as claimed in claim 2, wherein the cloud server suite comprises at least one of a server application program interface (API), database, and one or more sets of predefined instructions.
18. The system as claimed in claim 1, wherein one or more application program interfaces (APIs) of the processing unit and said system are adapted to enable sensor data inputs from third-party sensor unit(s), as listed in predefined specifications.
19. The system as claimed in claim 1, wherein the data is stored locally in the sensor unit, the processing unit, the user device and the cloud server suite, so as not to lose data integrity.
20. The system as claimed in claim 2, wherein the cloud server suite is adapted to initiate and execute Over-The-Air update of firmware/programming module(s) for one or more sensing units through at least one processing unit,
said update is executed with or without the means of at least one user device through respective radio-communications among said units.
21. The system as claimed in claim 2, wherein the cloud server suite is adapted to initiate and execute the Over-The-Air update of firmware/programming module(s) for one or more processing units, said update is executed with or without the means of at least one user device through respective radio-communications among said units.
22. The system as claimed in claim 2, wherein the cloud server suite is connected to at least one of the processing unit and the user device via the internet.
23. The system as claimed in claim 2, wherein the cloud server suite is connected to the processing unit via Wi-Fi communication, and to the user device via Wi-Fi or other mediums of internet-access.
24. A method of rendering captured data of one or more users, the method comprising:
attaching one or more sensor units on one or more body parts of one or more users;
providing one or more instructions to initialize one or more sensor units and at least one processing unit;
configuring one or more sensor units according to the body parts with which the sensor units are attached and based on motions/physical activity to be captured, upon completion of initialization;
capturing the motion/physical activity data of the one or more users;
synchronizing the captured data received from the one or more users;
processing the captured data to derive one or more items of information based on the predefined parameters; and
rendering the processed data based on one or more user specific models, thereby providing pervasive professional analysis and training to the one or more users during physical movement, physical training or sport.
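The synchronization step of the method above (aligning samples arriving from multiple sensor units before processing) can be sketched as time-binning on a shared clock. The bin width and the latest-sample-wins rule are assumptions; the claim only requires that captured data be synchronized.

```python
# Sketch: align timestamped samples from several sensor units into common
# time bins, keeping the latest sample per unit per bin.

def synchronize(streams, bin_ms=10):
    """streams: {unit_id: [(t_ms, value), ...]} with samples in time order.
    Returns {bin_start_ms: {unit_id: value}}."""
    bins = {}
    for unit, samples in streams.items():
        for t, v in samples:
            b = (t // bin_ms) * bin_ms
            bins.setdefault(b, {})[unit] = v  # later samples overwrite
    return bins
```

After this step, each bin holds one value per unit, ready for the joint processing and rendering steps of the method.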
25. The method as claimed in claim 24 further comprising:
communicating with the user device for registering profile of one or more users;
receiving pictures and meta-data of the one or more users including dimensions of one or more body-parts as required for physical movement or physical training or sport; and
processing the received pictures and meta-data and dimensions of the one or more body-parts of each user of the one or more users for creating one or more predefined models in 2D/3D, said model being proportionately scaled as per the received pictures and data.
PCT/IN2018/050489 2017-07-27 2018-07-25 Motion sense technology system Ceased WO2019021315A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201711026818 2017-07-27
IN201711026818 2017-07-27

Publications (1)

Publication Number Publication Date
WO2019021315A1 (en) 2019-01-31

Family

ID=65041016

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2018/050489 Ceased WO2019021315A1 (en) 2017-07-27 2018-07-25 Motion sense technology system

Country Status (1)

Country Link
WO (1) WO2019021315A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110502107A (en) * 2019-07-26 2019-11-26 森博迪(深圳)科技有限公司 Wearable real-time action instructs system and method
US20250022209A1 (en) * 2021-12-01 2025-01-16 Sony Group Corporation Image production system, image production method, and program

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7825815B2 (en) * 2006-01-09 2010-11-02 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20130321677A1 (en) * 2012-05-31 2013-12-05 Apple Inc. Systems and methods for raw image processing


Similar Documents

Publication Publication Date Title
AU2017331639B2 (en) A system and method to analyze and improve sports performance using monitoring devices
US20250246303A1 (en) Flight Time
US10438415B2 (en) Systems and methods for mixed reality medical training
US10121065B2 (en) Athletic attribute determinations from image data
JP6814196B2 (en) Integrated sensor and video motion analysis method
JP6273364B2 (en) Energy consuming equipment
US11030918B2 (en) Identification and analysis of movement using sensor devices
US20250307866A1 (en) Multi-Factor Authentication and Post-Authentication Processing System
US20160098941A1 (en) Methods and apparatus for goaltending applications including collecting performance metrics, video and sensor analysis
JP2018138167A (en) Sessions and groups
US11482126B2 (en) Augmented reality system for providing movement sequences and monitoring performance
US11806579B2 (en) Sports operating system
US9248361B1 (en) Motion capture and analysis systems for use in training athletes
CN107213619A (en) Sports Training Evaluation System
CN103312957A (en) Information processing apparatus, information processing method, and recording medium, for displaying information of object
US10306687B2 (en) Transmitting athletic data using non-connected state of discovery signal
US20210280082A1 (en) Providing Workout Recap
JP2017000481A (en) Analysis system and analysis method
JP2018522618A (en) Frameworks, devices and methodologies configured to enable gamification through sensor-based monitoring of physically executed skills, including location-specific gamification
JP2018522361A (en) Framework, device and methodology configured to allow automated classification and / or retrieval of media data based on user performance attributes obtained from a performance sensor unit
KR102095647B1 (en) Comparison of operation using smart devices Comparison device and operation Comparison method through dance comparison method
US20200111376A1 (en) Augmented reality training devices and methods
US11049321B2 (en) Sensor-based object tracking and monitoring
US20110137213A1 (en) Method and system for therapeutic exergaming
WO2019021315A1 (en) Motion sense technology system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18838283

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18838283

Country of ref document: EP

Kind code of ref document: A1