
CN108815804B - VR upper limb rehabilitation training platform and method based on MYO arm ring and mobile terminal - Google Patents


Info

Publication number
CN108815804B
CN108815804B
Authority
CN
China
Prior art keywords
gesture
myo
arm
upper limb
virtual environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810602719.0A
Other languages
Chinese (zh)
Other versions
CN108815804A (en)
Inventor
王晶
任诗媛
乐赞
郭晓辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Medical Technology Co., Ltd.
Original Assignee
Shenzhen Rhb Medical Tech Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Rhb Medical Tech Co ltd filed Critical Shenzhen Rhb Medical Tech Co ltd
Priority to CN201810602719.0A priority Critical patent/CN108815804B/en
Publication of CN108815804A publication Critical patent/CN108815804A/en
Application granted granted Critical
Publication of CN108815804B publication Critical patent/CN108815804B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B23/00 Exercising apparatus specially adapted for particular parts of the body
    • A63B23/035 Exercising apparatus specially adapted for limbs, i.e. upper or lower limbs, e.g. simultaneously
    • A63B23/12 Exercising apparatus for upper limbs or related muscles, e.g. chest, upper back or shoulder muscles
    • A63B23/14 Exercising apparatus for wrist joints
    • A63B23/16 Exercising apparatus for hands or fingers
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0638 Displaying moving images of recorded environment, e.g. virtual environment
    • A63B2071/0647 Visualisation of executed movements
    • A63B2071/0658 Position or arrangement of display
    • A63B2071/0661 Position or arrangement of display arranged on the user
    • A63B2071/0666 Position or arrangement of display worn on the head or face, e.g. combined with goggles or glasses
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/803 Motion sensors
    • A63B2230/00 Measuring physiological parameters of the user
    • A63B2230/08 Measuring physiological parameters of the user: other bio-electrical signals
    • A63B2230/085 Measuring physiological parameters of the user: other bio-electrical signals used as a control parameter for the apparatus
    • A63B2230/62 Measuring physiological parameters of the user: posture
    • A63B2230/625 Measuring physiological parameters of the user: posture used as a control parameter for the apparatus

Landscapes

  • Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Rehabilitation Tools (AREA)

Abstract

The invention discloses a VR upper limb rehabilitation training platform and method based on a MYO armband and a mobile terminal. The platform comprises a MYO armband, a Unity virtual environment, a mobile terminal platform and a VR module. The MYO armband comprises a nine-axis inertial sensor unit, eight surface electromyographic (sEMG) sensors and a Bluetooth transceiver. The Unity virtual environment comprises a 3d upper limb model, real-time visual feedback of myoelectric information, and rehabilitation training game scenes with their training modes. The mobile terminal platform is a mobile phone or tablet that carries the MYO-based virtual environment, maps the gesture and arm posture information acquired by the armband into the VR environment, and displays it visually to the user, making the platform easy to carry and use. The VR module uses VR glasses to feed the virtual scene constructed in the Unity environment back to the user's hand action state, enhancing immersion and human-computer interaction. Compared with the prior art, the invention has the advantages of low cost, strong operability, good practicability, portability, strong immersion and good interactivity.

Description

VR upper limb rehabilitation training platform and method based on MYO arm ring and mobile terminal
Technical Field
The invention relates to a low-cost 3d virtual arm training platform developed to assist upper limb rehabilitation of stroke patients, and in particular to a VR upper limb rehabilitation training platform based on a MYO armband and a mobile terminal.
Background
The prevalence of stroke in China has risen in recent years, making it a leading threat to public health. About 85% of stroke patients present with upper limb functional impairment at onset, and roughly 55%-75% still suffer upper limb dysfunction six months later, which is also a main cause of hand function loss. The treatment and rehabilitation of limb hemiplegia after stroke has therefore become a research hotspot of modern rehabilitation medicine and rehabilitation engineering.
The MYO armband, an innovative device from the Canadian startup Thalmic Labs, reads sEMG signals. It can be worn above the elbow joint of either arm to collect the sEMG signals generated by the arm muscles, and has eight equally spaced channels. The armband acquires the raw sEMG signal and transmits it over low-power Bluetooth, with little interference, good signal quality and a low price; as a control source it is low-cost, comfortable to wear and practical.
The patent with application number 201610379614.4 discloses a prosthetic hand control method based on the MYO armband, and the patent with application number 201710168821.X discloses a MYO-armband-based prosthetic hand control system comprising a signal acquisition module, an STM32 module, a fuzzy controller module, a prosthetic hand module, a grip force feedback module and a PC for offline training. Such prosthetic hand control suffers from high system cost, poor gripping flexibility, and poor operability and practicality, and may be uncomfortable for a patient to wear.
To address the high cost and poor comfort of physical myoelectric prosthetic hand equipment, the patent with application number 201611073067.3 discloses a virtual prosthetic hand training platform based on the MYO armband and gaze tracking, together with its training method.
Therefore, the invention uses the MYO armband to analyze the user's motion and gesture information and maps it onto a 3d upper limb model in the Unity environment, realizing the interaction between myoelectric gesture information and the virtual environment in software. The surface electromyographic signals collected by the armband are processed and recognized, an observation-imitation training scene is added in the Unity environment, the 3d arm is controlled in real time to complete actions such as grasping, and myoelectricity-related information is displayed on a GUI (graphical user interface). In hardware, communication between the MYO armband and the mobile platform is realized through a Bluetooth module, and VR glasses are finally used to enhance the user's sense of immersion.
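The gesture-recognition chain described above (window the eight-channel sEMG stream, extract features, classify the gesture) can be illustrated with a minimal sketch. The mean-absolute-value feature, the window length and the nearest-centroid classifier below are illustrative assumptions, not the algorithm disclosed by the patent:

```python
import numpy as np

def mav_features(window):
    """Mean absolute value per channel for one analysis window.

    window: (n_samples, 8) array of raw sEMG from the eight MYO channels.
    Returns an 8-element feature vector.
    """
    return np.mean(np.abs(window), axis=0)

class NearestCentroidGesture:
    """Toy gesture recognizer: compare MAV features to per-gesture centroids."""

    def __init__(self):
        self.centroids = {}  # gesture name -> 8-element centroid

    def fit(self, labelled_windows):
        """labelled_windows: iterable of (gesture_name, window) pairs."""
        feats = {}
        for name, w in labelled_windows:
            feats.setdefault(name, []).append(mav_features(w))
        self.centroids = {n: np.mean(f, axis=0) for n, f in feats.items()}

    def predict(self, window):
        f = mav_features(window)
        return min(self.centroids,
                   key=lambda n: np.linalg.norm(f - self.centroids[n]))
```

In practice the recognized label (fist, spread, wrist inversion, etc.) is what gets forwarded over Bluetooth to drive the 3d hand.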
Disclosure of Invention
To overcome the above defects in the prior art, the invention aims to provide a VR upper limb rehabilitation training platform and method based on a MYO armband and a mobile terminal. The platform is low in cost, strong in operability, practical, portable, immersive and interactive, providing convenience for the rehabilitation training and daily life of patients with upper limb disabilities.
The invention is realized by the following technical scheme.
The invention relates to a VR upper limb rehabilitation training platform based on a MYO armband and a mobile terminal, which comprises:
a computer, which uses Unity3D software to construct a Unity virtual environment comprising a 3d upper limb model, real-time visual feedback of myoelectric information, a therapeutic virtual environment, and rehabilitation training game scenes with their training modes;
the MYO armband, which comprises a nine-axis inertial sensor, eight surface electromyographic sensors and a Bluetooth transceiver; the nine-axis inertial sensor detects the motion track, direction and posture information of the arm, and the surface electromyographic sensors detect the electromyographic signals and gesture information of different gestures; communication with the Unity virtual environment and the mobile terminal platform is realized through the Bluetooth transceiver;
a mobile device, namely a mobile phone or tablet carrying the Unity virtual environment, which maps the gesture and arm posture information acquired by the MYO armband into the VR glasses and displays it visually to the user;
the VR glasses, which serve as the wearable holder for the mobile device; the user's gesture and arm posture information in the Unity virtual environment is fed back through the VR glasses for human-computer interaction;
the user wears the MYO armband and the VR glasses; the arm posture information of the user's hand is collected, the gesture information is analyzed and recognized from the myoelectric signals and converted into the corresponding hand movement in the Unity virtual environment, and the user performs upper limb rehabilitation training by operating the mobile device.
The invention further provides a VR upper limb rehabilitation training method based on the MYO armband and mobile terminal of the above platform, which comprises the following steps:
the first step is as follows: a user wears MYO armlet and VR glasses and determines whether the communication between the user and the Unity virtual environment in the computer is successful or not and the Bluetooth connection between the user and the mobile terminal equipment is successful;
the second step is that: viewing the initial position of an arm in a 3d upper limb model in the Unity virtual environment from the mobile equipment through a gesture of inward wrist turning of a user, and calibrating the initial position to a screen of the mobile equipment which is just opposite to VR glasses;
the third step: writing a data reading and posture synchronization algorithm program related to a MYO arm ring in a Unity environment, synchronizing the spatial position information of an arm in an upper limb model of a user according to gyroscope data of a nine-axis inertial sensor on the MYO arm ring, and analyzing a current gesture according to a hand myoelectric signal of the user;
the fourth step: in a basic stage, taking treatment as a scene, sending a gesture signal to a 3d upper limb model in a Unity virtual environment, executing different hand actions, and displaying myoelectric information acquired by a MYO arm ring in real time on a mobile equipment interface;
the fifth step: in the training stage, a game is taken as a scene, a virtual interactive object is added, an observation-simulation training mode is adopted in a Unity virtual environment scene, animation is set to finish grabbing the observation scene placed in a moving way, then the upper limbs are actively moved to simulate the observation animation scene, and the movement of the limbs is finished;
if the hand is detected to have properly grasped the virtual interactive object, the color of its image changes from A to B; if the color does not change, the user has not made contact with the object;
if the hand is detected to have opened and released the virtual interactive object at the specified position, the color of its image changes from B to C;
the above actions are repeated, and the user's training effect is obtained through feedback on the mobile device.
Furthermore, when displaying myoelectric information acquired by the MYO armband in real time on the mobile device interface, a GUI (graphical user interface) is added in the Unity virtual environment; in the basic stage, the myoelectric information is displayed directly using the arm rotation angle, acceleration and position data acquired by the armband, while in the training stage it is displayed indirectly through the score, action accuracy and speed of the training process, so as to feed back to the user in real time.
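One hedged way to aggregate the training-stage feedback (score, accuracy, speed) into numbers for the GUI is sketched below. The patent does not specify any weighting; the composite score here is purely illustrative:

```python
def session_feedback(attempts, successes, duration_s):
    """Summarize one training session for the GUI.

    attempts:   number of gestures attempted
    successes:  number of gestures completed correctly
    duration_s: session length in seconds
    Returns (accuracy in percent, pace in successes/minute, composite score).
    """
    accuracy = 100.0 * successes / attempts if attempts else 0.0
    pace = 60.0 * successes / duration_s if duration_s else 0.0
    # illustrative composite: accuracy dominates, pace breaks ties
    score = round(accuracy + pace)
    return accuracy, pace, score
```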
Furthermore, the Unity virtual environment scene realizes action-observation and action-imitation rehabilitation training on the mobile device: observing the virtual action scene induces the patient's active movement intention, while actively imitating that scene during game training carries out the rehabilitation training itself.
Furthermore, the MAC address codes corresponding to the MYO armband and the mobile device are added into the script of the computer's Unity virtual environment, so that the Bluetooth on the armband connects successfully to the Bluetooth of the mobile device, and the Unity virtual environment is transferred from the computer to the mobile device for portable rehabilitation training.
Further, when the VR glasses are used, a built-in gyroscope senses the dynamic change of the eye position, and a binocular virtual reality camera is adaptively constructed in the scene according to the VR lens spacing, the interpupillary distance and the screen size of the mobile device, realizing a 3D immersive effect; a gaze module in the Unity virtual environment on the computer monitors the currently faced object.
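Constructing the binocular camera reduces to a little geometry: split the phone screen into two viewports whose optical centres sit one interpupillary distance apart. The helper below and its example dimensions are illustrative assumptions, not values from the patent:

```python
def eye_viewports(screen_w_mm, screen_h_mm, ipd_mm):
    """Centres of the left/right eye viewports on the phone screen.

    Each eye gets half the screen; the image centres are shifted so the
    optical centres sit one interpupillary distance apart, mirroring how
    a Cardboard-style stereo camera pair is laid out.
    Returns ((lx, ly), (rx, ry)) in mm from the screen's bottom-left corner.
    """
    cx = screen_w_mm / 2.0
    cy = screen_h_mm / 2.0
    return ((cx - ipd_mm / 2.0, cy), (cx + ipd_mm / 2.0, cy))
```

The small horizontal offset between the two rendered views is what produces the parallax, and hence the stereoscopic effect, described later in the text.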
Further, in the second step, the 3d upper limb model is imported into the Unity virtual environment, the initialization script of the MYO armband is attached to it, and gesture changes are detected. The wrist-inversion gesture is then taken as the starting gesture of each calibration, and the corrected posture, with the forearm of the 3d upper limb directly facing the screen of the mobile device, is taken as the initial posture. At the start of calibration the normal vector and rotation direction of the forearm are compensated, the rotation direction of the forearm is adjusted to be consistent with the coordinate axes of the MYO armband on the mobile device, and the upper arm of the 3d upper limb is fixed while the forearm rotates, until at the end of calibration the forearm faces into the screen.
Furthermore, during arm calibration, Euler angles describe the rotation of the space in which the arm lies. The Euler angles of the upper arm of the 3d upper limb are fixed in an absolute coordinate system and its forearm is aligned with the center of the picture. Then, in the relative coordinate system of the forearm, the Z axis is defined perpendicular to the plane of the forearm, the Y axis lies in the horizontal plane along the forearm direction, and the X axis lies in the horizontal plane perpendicular to the forearm. Yaw rotation about the Z axis adjusts the angle between the forearm and the upper arm to a normal physiological position, Roll rotation about the Y axis adjusts the Euler angle of the forearm, and Pitch rotation about the X axis adjusts the Euler angle of the wrist of the 3d upper limb. After the measurement device locates the limb orientation, the orientation is reset, synchronizing the initial coordinates of the virtual and real world coordinate systems.
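The calibration step can be sketched as capturing a reference orientation at the moment the wrist-inversion gesture fires, then expressing every later sample relative to it, so the virtual forearm starts out facing the screen. The function name, the degree-based yaw/pitch/roll representation and the wrapping convention are assumptions for illustration:

```python
def calibrate(reference_ypr, sample_ypr):
    """Express a raw (yaw, pitch, roll) sample relative to the pose
    captured when the wrist-inversion calibration gesture was detected.

    Subtracting the reference per axis re-zeros the forearm so that its
    initial pose faces the screen, synchronizing virtual and real frames.
    Angles are in degrees and wrapped to (-180, 180].
    """
    def wrap(a):
        a = (a + 180.0) % 360.0 - 180.0
        return 180.0 if a == -180.0 else a
    return tuple(wrap(s - r) for s, r in zip(sample_ypr, reference_ypr))
```

Real Euler-angle arithmetic must compose rotations rather than subtract per axis once poses leave a single plane; this sketch only shows the re-zeroing idea.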
Furthermore, in the fourth step, a preset part (prefab) of the MYO armband is assigned to the virtual 3d upper limb so that it inherits the armband's attributes, after which the target object of each joint of the 3d upper limb, the action flag bit, the gesture completion state, the finger speed and the duration are defined. Each time a frame of the Unity virtual environment is updated, it is detected whether the gesture has changed from the previous one; if so, the gesture is set to the currently detected gesture, and if not, it is set to the relaxed pose the hand naturally assumes.
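The per-frame rule above is a one-screen function: adopt a detection when it differs from the previous frame, otherwise fall back to a relaxed rest pose. `REST` and the function signature are assumptions for illustration:

```python
REST = "rest"  # the pose the hand falls back to when the user relaxes

def update_gesture(last_gesture, detected):
    """Per-frame gesture update mirroring the platform's rule:
    if the detection changed since the previous frame, adopt it;
    otherwise treat the hand as relaxed (rest pose).
    """
    if detected is not None and detected != last_gesture:
        return detected
    return REST
```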
Compared with the prior art, the invention has the following beneficial effects:
compared with a PC terminal, the movable terminal has the characteristics of convenience in carrying and good practicability, and Cardboard VR glasses have the characteristic of strong immersion; the game in the training mode in the scene is easy to operate and interesting, the observation simulated game training can improve the active participation degree of the patient, the visualization of myoelectric information can enable the patient to observe the training data more intuitively, and the man-machine interaction is enhanced, so that convenience is provided for the rehabilitation training and daily life of the patient with disabled upper limbs.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the principles of the invention:
FIG. 1 is a general block diagram of a VR upper limb rehabilitation training platform based on MYO arm rings and a mobile terminal according to the present invention;
FIG. 2 is a flow chart of the algorithm of the present invention;
FIG. 3 is a schematic diagram of the interactive mode of rehabilitation training according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in FIG. 1, the VR upper limb rehabilitation training platform based on the MYO armband and mobile terminal includes a computer, the MYO armband, a mobile device and VR glasses. Wherein:
The computer uses Unity3D software to construct a Unity virtual environment comprising a 3d upper limb model, real-time visual feedback of myoelectric information, a therapeutic virtual environment, and rehabilitation training game scenes with their training modes.
The Unity virtual environment is generated by the Unity3D software on a computer, and the rehabilitation training platform relies on this environment for the related work, including: device acquisition for the MYO armband; writing the data-reading and posture-synchronization programs; writing the program that adaptively constructs the binocular virtual reality camera in the Cardboard scene; building the 3d upper limb model; GUI display of myoelectric information; building the therapeutic Unity virtual environment of the basic stage; building the rehabilitation training game scenes of the training stage; and programming the training game algorithms.
in myoelectricity visualization feedback of the Unity virtual environment, a GUI (graphical user interface) is added in the Unity virtual environment, the basic stage is visually displayed by utilizing the arm rotation angle, acceleration information and position data acquired by the MYO arm ring, and the training stage is indirectly displayed by utilizing the score condition, action accuracy and speed of the training process, so that myoelectricity information is fed back to a user in real time.
In the Unity virtual environment scene, action-observation and action-imitation rehabilitation training is realized on the mobile terminal: observing the virtual action scene induces the user's active movement intention, while actively imitating that scene during game training carries out the rehabilitation training.
The MYO armband comprises a nine-axis inertial sensor, eight surface electromyographic sensors and a Bluetooth transceiver. The nine-axis inertial sensor unit comprises a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer, and detects the motion track, direction and posture information of the arm; the surface electromyographic sensors detect the electromyographic signals and gesture information of different gestures; and communication with the Unity virtual environment and the mobile terminal platform is realized through the Bluetooth transceiver. The MYO armband is an innovative device produced and promoted by the Canadian startup Thalmic Labs for reading surface electromyographic signals. Since hand signals are generated by the forearm muscles, the armband can be worn above the elbow joint of either arm to collect the bioelectric signals generated by the arm muscles. It has eight equally spaced channels with a maximum data output rate of 200 Hz.
The mobile device, a mobile phone or tablet, carries the Unity virtual environment, maps the gesture and arm posture information collected by the MYO armband into the VR glasses, and displays it visually to the user.
The mobile device is an Android device used to carry the MYO-based Unity virtual environment; it maps the gesture and arm posture information collected by the armband into the Unity virtual environment and is placed in the Cardboard VR glasses box, displaying the result visually to the user and making the platform convenient to carry and use.
The VR glasses serve as the wearable holder for the mobile device; when the Unity virtual environment is observed on the mobile device, the user's gesture and arm posture information is fed back through the VR glasses for human-computer interaction.
The VR glasses here are mainly Cardboard VR glasses: a cardboard box containing a paperboard body, lenses, a magnet, hook-and-loop fasteners, a head strap and an NFC tag. It is an inexpensive virtual reality device through which a user can experience virtual reality with a mobile phone. After the relevant algorithm is added, the content on the phone is displayed in split screen within the virtual scene built in the Unity virtual environment; the two eyes see content with parallax, producing a stereoscopic effect, and feeding back the user's hand action state through the Cardboard VR glasses enhances immersion and human-computer interaction.
The user wears the MYO armband and the VR glasses; the arm posture information of the user's hand is collected, the gesture information is analyzed and recognized from the myoelectric signals and converted into the corresponding hand movement in the Unity virtual environment, and the user performs upper limb rehabilitation training by operating the mobile device.
FIG. 2 is a flowchart of the method of the VR upper limb rehabilitation training platform based on the MYO armband and mobile terminal. The user wears the MYO armband and Cardboard VR glasses; the hand myoelectric signals are acquired by the device, gesture actions are processed and analyzed by the data-reading and gesture-synchronization algorithms, the recognition results are transmitted to the Unity virtual environment over Bluetooth 4.0 LE and converted into the corresponding hand actions there, and game training is performed while wearing the VR glasses fitted with the Android phone. The method specifically includes the following steps:
the first step is as follows: the user wears the MYO armlet and the Cardboard VR glasses, and determines whether the communication between the user and the Unity virtual environment in the computer is successful or not and the Bluetooth connection between the user and the Android device is successful.
The MYO armband is worn at the elbow joint of the left arm:
the MYO arm ring is communicated with a Unity virtual environment through a Bluetooth receiver, the MYO arm ring is connected with the Android mobile phone, the mac address code of the Android mobile phone is added into a Unity virtual environment algorithm, a virtual reality training scene is led out from a computer end to the Android mobile phone to generate an App, the Bluetooth of the mobile phone is started, the App can be opened to be immediately connected with the MYO arm ring through the Bluetooth, and the hand can see the motion track of the hand in the screen of the mobile phone when the hand is swung.
The mobile phone is placed in the Cardboard VR glasses box, which is fixed to the head with a strap.
The Cardboard VR glasses require a VR module to be added in the Unity virtual environment. According to the Cardboard lens spacing and the interpupillary distance, and matching the screen size of the Android device, a binocular virtual reality camera is adaptively constructed in the scene, so that the flat environment appears as two side-by-side views on the phone screen. The gyroscope in the VR glasses senses the dynamic change of the eye position, serving as the gaze function in the Unity virtual environment, so that the currently faced object can be observed; wearing the VR glasses gives the scene a stereoscopic 3D effect and enhances the user's sense of immersion.
The second step: the initial position of the forearm of the 3d upper limb in the virtual environment is calibrated to point directly into the screen by a wrist-inversion gesture.
To make the 3d upper limb appear in the field of view, the initial position of its forearm must be calibrated: the user first points the left arm toward the inside of the screen and then turns the wrist inward, which places the forearm of the 3d upper limb directly ahead:
A 3d upper limb model is imported into the Unity virtual environment, the initialization script of the MYO arm ring is attached to the model, and whether the gesture has changed is judged by comparing the recognized gesture with the previous one. The wrist-inversion gesture then serves as the start gesture for each calibration, and the position in which the forearm of the 3d upper limb faces directly into the screen is taken as the initial posture after calibration;
When the forearm of the 3d upper limb is first calibrated, its normal vector and rotation direction are compensated so that its rotation is essentially consistent with the motion of the forearm on the Android device, i.e. the coordinate axes of the forearm are kept consistent with the MYO coordinates in the Android device; this fixes the upper arm of the 3d upper limb while the forearm rotates. Euler angles describe the rotational change of the frame in which the forearm lies: the Euler angles of the upper arm are fixed in an absolute coordinate system so that the forearm is aligned with the center of the picture. In the relative coordinate system of the forearm, the Z axis is defined as the coordinate axis of the plane perpendicular to the forearm, the Y axis lies along the forearm in the horizontal plane, and the X axis is perpendicular to the forearm in the horizontal plane. A Yaw rotation about the Z axis adjusts the angle between the forearm and the upper arm to a normal physiological position, a Roll rotation about the Y axis adjusts the Euler angle of the forearm, and a Pitch rotation about the X axis adjusts the Euler angle of the wrist of the 3d upper limb. After the limb is positioned, the measuring device is repositioned so that the initial coordinates of the virtual and real world frames are synchronized, which strengthens the proprioception produced by synchronizing the virtual limb with the patient's limb. Calibration ends when the forearm faces directly into the screen.
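The recalibration step can be sketched as a small state holder: when the wrist-inversion gesture is seen, the current armband orientation is stored as the reference, and later readings are reported relative to it, so the virtual forearm starts facing the screen. This is an illustrative Python sketch rather than the platform's Unity script; the class name and the gesture string `"wrist_in"` are assumptions, not MYO SDK identifiers.

```python
class ForearmCalibrator:
    """Wrist-inversion calibration sketch: the pose at calibration time
    becomes the zero reference for later orientation readings."""
    CALIBRATION_GESTURE = "wrist_in"  # hypothetical pose name

    def __init__(self):
        self.reference = (0.0, 0.0, 0.0)  # yaw, pitch, roll at calibration
        self.calibrated = False

    def update(self, gesture, yaw, pitch, roll):
        if gesture == self.CALIBRATION_GESTURE:
            # Wrist-in marks the "forearm facing the screen" pose.
            self.reference = (yaw, pitch, roll)
            self.calibrated = True
        # Report angles relative to the calibration pose, wrapped to [-180, 180).
        wrap = lambda a: (a + 180.0) % 360.0 - 180.0
        ry, rp, rr = self.reference
        return (wrap(yaw - ry), wrap(pitch - rp), wrap(roll - rr))
```

After calibration, a reading identical to the reference pose comes back as zero in all three angles, which corresponds to the forearm pointing straight into the screen.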
The third step: algorithm programs for device acquisition, data reading and posture synchronization of the MYO arm ring are written in the Unity virtual environment; the spatial position information of the patient's hand and limb is synchronized from the gyroscope data of the nine-axis inertial sensor in the MYO arm ring, and the current gesture is analyzed from the myoelectric signals.
The built-in gesture signals of the MYO arm ring comprise five gestures: wrist inward turning, wrist outward turning, fist making, hand opening (fingers spread), and double-tapping of the thumb and middle finger.
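As a rough illustration, the five built-in gestures can be mapped to the platform's actions with a small lookup table. The pose strings and action names below are hypothetical placeholders, not the official MYO SDK pose identifiers:

```python
# Hypothetical mapping from built-in armband poses to platform actions.
POSE_ACTIONS = {
    "wave_in": "calibrate",      # wrist turned inward: recalibrate the forearm
    "wave_out": "none",          # wrist turned outward: unused in this sketch
    "fist": "grasp",             # close the virtual hand
    "fingers_spread": "open",    # open the virtual hand
    "double_tap": "unlock",      # thumb/middle-finger double tap
}

def action_for(pose: str) -> str:
    # Unknown poses and the relaxed hand map to no action.
    return POSE_ACTIONS.get(pose, "none")
```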
The fourth step: in the basic stage, with hospital treatment as the scene, the gesture signals are sent to the 3d upper limb model, different hand actions such as grasping and opening are executed, and the data collected by the MYO arm ring are displayed in real time on the Android interface.
Against the hospital-treatment background, when the left hand opens its five fingers and the action is detected, the MYO arm ring produces a short vibration and the fingers of the 3d upper limb slowly open; when the left hand then makes a fist, the MYO arm ring again vibrates briefly and the fingers of the 3d upper limb slowly close. Other actions cause no change in either.
In the Unity virtual environment on the computer, the preset components of the MYO arm ring are first attached to the virtual 3d upper limb model so that the model carries the attributes of the MYO arm ring; then the target object of each joint of the 3d upper limb, an action flag, a gesture completion state, the finger speed, and the time length are defined. Each time the environment picture is updated (once per frame), the Unity virtual environment checks whether the gesture has changed from the previous one; if so, the gesture is set to the currently detected gesture, and if no new gesture is detected, it is set to the relaxed, freely placed hand gesture.
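The per-frame gesture check amounts to a small tracker: a changed pose replaces the current one, and an unrecognized reading falls back to the relaxed hand. A minimal Python sketch with illustrative names, not the Unity script itself:

```python
class GestureTracker:
    """Per-frame gesture update: only a change replaces the current
    gesture; no detection falls back to the relaxed hand ("rest")."""
    def __init__(self):
        self.current = "rest"

    def on_frame(self, recognized):
        if recognized is None:            # nothing detected: relaxed hand
            self.current = "rest"
        elif recognized != self.current:  # gesture changed since last frame
            self.current = recognized
        return self.current
```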
The joint target objects include the phalanges, such as the proximal and distal phalanges of the thumb, index finger, middle finger, ring finger and little finger in the 3d upper limb model; their rotation about the joints is the controlled motion, and different objects are selected when performing different gestures.
The action flag distinguishes the two grasp-related actions: a flag of 1 represents the fist-making gesture, and a flag of 2 represents the opening gesture.
The gesture completion state represents three different states: 0 indicates the gesture is at the initial position, 1 indicates the gesture is executing, and 2 indicates the gesture motion is completed.
The finger speed adjusts how quickly the fingers of the 3d upper limb move, and tuning it keeps the rhythm of the grasping training; the time length determines the duration of finger movement, and together with the finger speed it bounds the range of finger motion. A mapping from myoelectric signal intensity to movement speed is established, with the normal physiological joint range of motion as the constraint.
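The mapping from EMG intensity to finger speed, constrained by the physiological range of motion, might be sketched as follows; the numeric thresholds (`v_min`, `v_max`, `emg_max`, the 90° range) are illustrative assumptions, not values from the patent:

```python
def finger_speed(emg_rms, v_min=10.0, v_max=90.0, emg_max=1.0):
    """Map EMG intensity to finger joint speed (deg/s): a linear sketch
    of the text's 'EMG strength to movement speed' mapping."""
    x = max(0.0, min(emg_rms / emg_max, 1.0))  # normalize intensity to [0, 1]
    return v_min + x * (v_max - v_min)

def joint_angle_after(duration_s, speed_deg_s, rom_deg=90.0):
    """Angle reached after `duration_s`, clamped to a normal physiological
    range of motion so the virtual finger never over-rotates."""
    return min(speed_deg_s * duration_s, rom_deg)
```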
If a fist-making gesture is detected in the Unity virtual environment, the last action was not a fist, and the gesture is at the initial position or completed (i.e. the action flag is not 1 and the completion state is not 1), then the current gesture is set to the fist state and the completion state to executing — the action flag becomes 1, the completion state becomes 1 — and timing starts from that moment. Likewise, if an opening gesture is detected, the last action was not opening, and the gesture is at the initial position or completed (i.e. the action flag is not 2 and the completion state is not 1), then the current gesture is set to the open state and the completion state to executing — the action flag becomes 2, the completion state becomes 1 — and timing starts from that moment.
If the gesture is in the fist state and not yet finished (action flag 1, completion state not 2), then when the timer reaches the specified 4 seconds the gesture is marked completed (completion state 2) and each finger joint of the 3d upper limb moves according to its speed ratio. If the gesture is in the open state and not yet finished (action flag 2, completion state not 2), the same applies: after the specified 4 seconds the completion state becomes 2 and each finger joint moves according to its speed ratio. The speed ratio of the three phalanges of the thumb in the 3d upper limb is 1:4:6, and the joint speeds of the other fingers match the fastest thumb phalanx. The picture is then updated and the cycle repeats.
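The flag/state logic of the two preceding steps can be condensed into a small state machine. This Python sketch mirrors the described values (flags 1/2, states 0/1/2, the 4-second hold, the 1:4:6 thumb ratio), but the class and function names are illustrative, not the Unity script:

```python
FIST, OPEN = 1, 2                    # action flags from the text
INITIAL, EXECUTING, DONE = 0, 1, 2   # gesture completion states
HOLD_SECONDS = 4.0                   # specified time before completion

class GraspStateMachine:
    """Sketch of the fist/open gesture state machine described above."""
    def __init__(self):
        self.flag = 0
        self.state = INITIAL
        self.t_start = None

    def on_gesture(self, flag, now):
        # Start a new action only if it differs from the last one and the
        # previous action is not still executing.
        if flag in (FIST, OPEN) and self.flag != flag and self.state != EXECUTING:
            self.flag = flag
            self.state = EXECUTING
            self.t_start = now

    def on_update(self, now):
        # After the 4-second hold, mark the gesture complete.
        if self.state == EXECUTING and now - self.t_start >= HOLD_SECONDS:
            self.state = DONE
        return self.state

# Thumb phalanges rotate at a 1:4:6 speed ratio; the other fingers follow
# the fastest thumb phalanx.
THUMB_RATIO = (1, 4, 6)

def joint_speeds(base):
    thumb = [base * r for r in THUMB_RATIO]
    other = base * max(THUMB_RATIO)
    return thumb, other
```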
The Android interface is designed with a GUI, to which text descriptions, the rotation angles from the gyroscope, the acceleration information from the accelerometer, and position data in three directions are added to display the data collected by the MYO arm ring, giving the Android device real-time, visualized feedback during the basic stage.
The fifth step: in the training stage, with a ball-throwing game as the scene, objects such as small balls are added. If the hand grasps a ball normally, the ball changes from gray (A) to blue (B); the ball is then moved above the basket, and when an opening of the hand releasing the ball is detected, the ball changes from blue (B) to green (C). By repeating these actions, the Android device feeds the user's training effect back through the score (see fig. 3).
A rehabilitation training scene of motion observation and motion imitation in the Unity virtual environment is realized on the Android device: observing the virtual motion scene induces the patient's active motion intention, while actively imitating that scene during game training performs the rehabilitation. In the ball-throwing game, an animation first presents an observation scene of grasping, moving and placing; the player then observes the animation and imitates it with limb motion — grasping the ball with the left hand, moving it, and finally releasing it — so that the user's fist-making and hand-opening actions can be observed and the score for throwing balls into the basket is seen in real time:
In the Unity virtual environment on the computer, the boundary collision detection of the Unity3D physics system is added, and the ball material is defined with three colors: gray (A), blue (B) and green (C). The initial color of the ball is gray. When the 3d upper limb is moved over the ball and a fist-making action of its hand in contact with the ball boundary is detected, the ball color is set to blue; if no fist is detected or the ball boundary is not touched, the ball remains gray. When the hand grasping the ball moves it directly above the basket, an opening action is detected, and the ball collides with the basket bottom within the specified time, the ball color is set to green and the ball is then destroyed; if the hand opens but the ball does not contact the basket bottom, the ball remains blue and is still destroyed.
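The ball's color transitions can be sketched as a tiny state holder, replacing Unity's collision callbacks with plain booleans; the class and method names are an illustration, not the Unity3D script:

```python
GRAY, BLUE, GREEN = "A", "B", "C"   # the three ball colors from the text

class Ball:
    """Gray -> blue on a valid grasp; blue -> green (and destroy) on a
    valid release into the basket; a missed release stays blue."""
    def __init__(self):
        self.color = GRAY
        self.destroyed = False

    def on_grasp(self, fist_detected, touching_ball):
        # Gray -> blue only when a fist is made while touching the ball
        # boundary; otherwise the ball stays gray.
        if self.color == GRAY and fist_detected and touching_ball:
            self.color = BLUE

    def on_release(self, open_detected, hit_basket_bottom):
        # A released ball is destroyed either way, but only a ball that
        # reaches the basket bottom in time turns green.
        if self.color == BLUE and open_detected:
            if hit_basket_bottom:
                self.color = GREEN
            self.destroyed = True
```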
A scoring module, an accuracy module and a speed module are added to the GUI of the Unity virtual environment. Scoring module: a ball that turns green scores five points, and a ball that only turns blue scores two; a group consists of 20 repetitions, training continues for 20 groups, and the percentage score compares the training effect of different users, or of the same user across training periods. Accuracy module: within a group of 20 actions, the number of balls that finally turn green divided by 20 represents completion of the whole set, and the percentage compares the grasping accuracy of different users. Speed module: the number of balls grasped within 3 minutes is counted — only green balls are considered — in units of balls per minute, to compare the grasping response speed of different users. The results are also saved as the user's training log, recording each user's training process, dynamically monitoring the rehabilitation progress, and judging the effectiveness of the training system.
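Under the stated rules (green = 5 points, blue only = 2, accuracy over groups of 20, speed as green balls per minute over 3 minutes), the three modules reduce to a few lines; the function names are illustrative:

```python
def score(colors):
    """Score one group: green ('C') = 5 points, blue-only ('B') = 2."""
    return sum(5 if c == "C" else 2 if c == "B" else 0 for c in colors)

def accuracy(colors, group_size=20):
    """Fraction of throws in a group that ended green, as a percentage."""
    return 100.0 * sum(c == "C" for c in colors) / group_size

def speed(green_count, minutes=3.0):
    """Green balls grasped per minute over a 3-minute run."""
    return green_count / minutes
```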
The present invention is not limited to the above embodiments. Based on the technical solutions disclosed herein, those skilled in the art can, without creative effort, substitute or modify some technical features according to the disclosed content, and such substitutions and modifications all fall within the protection scope of the present invention.

Claims (7)

1. A VR upper limb rehabilitation training platform based on a MYO arm ring and a mobile terminal, characterized in that the VR upper limb rehabilitation training platform of the MYO arm ring and the mobile terminal comprises:
the computer adopts the Unity3D software to construct a Unity virtual environment, and the Unity virtual environment comprises real-time visual feedback of myoelectric information of the 3d upper limb model, a therapeutic virtual environment, a scene of a rehabilitation training game and a training mode of the rehabilitation training game;
the MYO arm ring, comprising a nine-axis inertial sensor, eight surface myoelectric sensors and a Bluetooth receiver, wherein the nine-axis inertial sensor detects the motion track, direction and posture information of the arm, the surface myoelectric sensors detect the myoelectric signals and gesture information of different gestures, and the Bluetooth receiver communicates with the Unity virtual environment and the mobile-end platform;
the mobile device, a mobile phone or tablet computer running the Unity virtual environment, which maps the gesture information and arm posture information collected by the MYO arm ring into the VR glasses and displays them visually to the user;
the VR glasses, serving as the wearable carrier of the mobile device, through which the user's gesture information and arm posture information in the Unity virtual environment are fed back via the mobile device for human-computer interaction;
wherein the user wears the MYO arm ring and the VR glasses; the posture information of the user's arm is collected, the gesture information is analyzed and recognized from the myoelectric signals and converted into corresponding hand actions in the Unity virtual environment, and the user performs upper limb rehabilitation training by operating the mobile device;
the training method of the MYO arm ring and VR upper limb rehabilitation training platform at the mobile end comprises the following steps:
the first step: the user wears the MYO arm ring and the VR glasses, and it is determined whether communication with the Unity virtual environment in the computer and the Bluetooth connection with the mobile terminal device have succeeded;
the second step: through the user's wrist-inversion gesture, the initial position of the arm of the 3d upper limb model in the Unity virtual environment is viewed on the mobile device and calibrated so that it faces the screen of the mobile device in the VR glasses;
in the second step, a 3d upper limb model is imported into the Unity virtual environment, the initialization script of the MYO arm ring is attached to the 3d upper limb model, and whether the gesture has changed is judged; the wrist-inversion gesture is then used as the start gesture for each calibration, and the forearm of the 3d upper limb facing directly into the screen of the mobile device is taken as the initial posture after calibration; when the forearm is first calibrated, its normal vector and rotation direction are compensated and its rotation direction is adjusted to be consistent with the coordinate axes of the MYO arm ring on the mobile device, so that the upper arm of the 3d upper limb is fixed while the forearm rotates, until at the final calibration the forearm faces the inside of the screen;
the third step: writing a data reading and posture synchronization algorithm program related to a MYO arm ring in a Unity environment, synchronizing the spatial position information of an arm in an upper limb model of a user according to gyroscope data of a nine-axis inertial sensor on the MYO arm ring, and analyzing a current gesture according to a hand myoelectric signal of the user;
the fourth step: in a basic stage, taking treatment as a scene, sending a gesture signal to a 3d upper limb model in a Unity virtual environment, executing different hand actions, and displaying myoelectric information acquired by a MYO arm ring in real time on a mobile equipment interface;
the fifth step: in the training stage, with a game as the scene, a virtual interactive object is added, and an observation-imitation training mode is adopted in the Unity virtual environment scene: an animation first presents the observation scene of grasping, moving and placing, and the user then actively moves the upper limb to imitate the observed animation scene and complete the limb movement;
if the hand is detected to normally hold the virtual interactive object, the color of the image of the virtual interactive object is changed from A to B, and if the color of the image of the virtual interactive object is not changed, the user is not in contact with the virtual interactive object;
if the hand is detected to be opened to release the virtual interactive object, and the virtual interactive object is placed at a specified position, changing the color of the image of the virtual interactive object from B to C;
repeating the above actions, and obtaining the training effect of the user through the feedback of the mobile equipment.
2. The platform of claim 1, wherein, in displaying the myoelectric information collected by the MYO arm ring in real time on the mobile device interface, a GUI (graphical user interface) is added in the Unity virtual environment; in the basic stage the arm rotation angle, acceleration information and position data collected by the MYO arm ring display the myoelectric information visually, and in the training stage the myoelectric information is displayed indirectly through the score, action accuracy and speed of the training process, as real-time feedback to the user.
3. The platform of claim 1, wherein the Unity virtual environment comprises a rehabilitation training scene of motion observation and motion imitation realized on the mobile device, that is, observing the virtual motion scene induces the patient's active motion intention, while actively imitating the virtual motion scene during game training performs the rehabilitation.
4. The platform of claim 1, wherein the MAC address codes corresponding to the MYO arm ring and the mobile device are added to a script of the Unity virtual environment on the computer so that the Bluetooth on the MYO arm ring connects successfully to the Bluetooth of the mobile device, and the Unity virtual environment is transferred from the computer to the mobile device so that it can be carried around for rehabilitation training.
5. The platform of claim 1, wherein, in use of the VR glasses, a gyroscope built into the VR glasses senses the dynamic change of the eye positions, and a binocular virtual reality camera is adaptively constructed in the scene according to the VR lens spacing, the interpupillary distance and the screen size of the mobile device, realizing a 3D immersion effect; a gaze module in the Unity virtual environment on the computer monitors the currently faced object.
6. The platform of claim 1, wherein, in the arm calibration, Euler angles describe the rotational change of the frame in which the arm lies: the Euler angle of the upper arm of the 3d upper limb is fixed in an absolute coordinate system and the forearm of the 3d upper limb is aligned with the center of the picture; then, in the relative coordinate system of the forearm, the Z axis is defined as the coordinate axis of the plane perpendicular to the forearm, the Y axis lies along the forearm in the horizontal plane, and the X axis is perpendicular to the forearm in the horizontal plane; a Yaw rotation about the Z axis adjusts the angle between the forearm and the upper arm to a normal physiological position, a Roll rotation about the Y axis first adjusts the Euler angle of the forearm, and a Pitch rotation about the X axis then adjusts the Euler angle of the wrist of the 3d upper limb; after the limb is positioned, the measuring device is repositioned so that the initial coordinates of the virtual and real world frames are synchronized.
7. The platform of claim 1, wherein, in the fourth step, the preset components of the MYO arm ring are first attached to the virtual 3d upper limb so that it carries the attributes of the MYO arm ring, and then the target object of each joint of the 3d upper limb, an action flag, a gesture completion state, the finger speed and the time length are defined; when each frame of the Unity virtual environment is updated, whether the gesture has changed from the previous one is detected; if so, the gesture is set to the currently detected gesture, and if not, it is set to the relaxed, freely placed hand gesture.
CN201810602719.0A 2018-06-12 2018-06-12 VR upper limb rehabilitation training platform and method based on MYO arm ring and mobile terminal Active CN108815804B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810602719.0A CN108815804B (en) 2018-06-12 2018-06-12 VR upper limb rehabilitation training platform and method based on MYO arm ring and mobile terminal

Publications (2)

Publication Number Publication Date
CN108815804A CN108815804A (en) 2018-11-16
CN108815804B true CN108815804B (en) 2020-06-09

Family

ID=64144900

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810602719.0A Active CN108815804B (en) 2018-06-12 2018-06-12 VR upper limb rehabilitation training platform and method based on MYO arm ring and mobile terminal

Country Status (1)

Country Link
CN (1) CN108815804B (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109701224B (en) * 2019-02-22 2024-02-23 重庆市北碚区中医院 Augmented reality AR wrist joint rehabilitation evaluation and training system
CN110232963B (en) * 2019-05-06 2021-09-07 中山大学附属第一医院 A system and method for evaluating upper limb motor function based on stereoscopic display technology
CN110227243A (en) * 2019-06-11 2019-09-13 刘简 Table tennis practice intelligent correcting system and its working method
CN110298286B (en) * 2019-06-24 2021-04-30 中国科学院深圳先进技术研究院 A virtual reality rehabilitation training method and system based on surface electromyography and depth images
CN110706776A (en) * 2019-09-20 2020-01-17 广东技术师范大学 Apoplexy rehabilitation training system based on virtual reality technology and using method thereof
CN110624217A (en) * 2019-09-23 2019-12-31 孙孟雯 Rehabilitation glove based on multi-sensor fusion and implementation method thereof
CN111840920A (en) * 2020-07-06 2020-10-30 暨南大学 A virtual reality-based upper limb intelligent rehabilitation system
CN111714334B (en) * 2020-07-13 2022-08-05 厦门威恩科技有限公司 Upper limb rehabilitation training robot and control method
CN111991762A (en) * 2020-09-02 2020-11-27 冼鹏全 Psychotherapy-based wearable upper limb rehabilitation device for stroke patient and cooperative working method
CN113101137B (en) * 2021-04-06 2023-06-02 合肥工业大学 A robot for upper limb rehabilitation based on motion mapping and virtual reality
CN113181621B (en) * 2021-06-09 2024-03-08 张彤 Auxiliary training equipment using VR and force feedback mechanical arm
CN114469465B (en) * 2021-12-28 2024-11-19 浪潮工业互联网股份有限公司 A control method, device and medium based on intelligent prosthesis
CN114637395A (en) * 2022-02-14 2022-06-17 上海诠视传感技术有限公司 A method for training hand-eye coordination through AR glasses
CN114377358B (en) * 2022-02-22 2025-01-28 南京医科大学 An upper limb home rehabilitation system based on Sphero spherical robot
CN115047979B (en) * 2022-08-15 2022-11-01 歌尔股份有限公司 Head-mounted display equipment control system and interaction method
CN115691756A (en) * 2022-10-18 2023-02-03 中国人民解放军陆军军医大学 Remote home treatment real-time monitoring system
CN116312947B (en) * 2023-03-13 2023-11-24 北京航空航天大学 Immersive ankle and foot rehabilitation training method based on upper limb movement signals and electronic equipment
CN116510249A (en) * 2023-05-09 2023-08-01 福州大学 A hand virtual rehabilitation training system and training method based on electromyographic signals

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103230664A (en) * 2013-04-17 2013-08-07 南通大学 Upper limb movement rehabilitation training system and method based on Kinect sensor
CA2933053A1 (en) * 2013-12-20 2015-06-25 Integrum Ab System for neuromuscular rehabilitation
CN106530926A (en) * 2016-11-29 2017-03-22 东南大学 Virtual hand prosthesis training platform and training method thereof based on Myo armband and eye tracking
CN106621287A (en) * 2017-02-07 2017-05-10 西安交通大学 Upper limb rehabilitation training method based on brain-computer interface and virtual reality technology
CN107544675A (en) * 2017-09-08 2018-01-05 天津大学 Brain control formula virtual reality method
CN107626040A (en) * 2017-10-24 2018-01-26 杭州易脑复苏科技有限公司 It is a kind of based on the rehabilitation system and method that can interact virtual reality and nerve electric stimulation
CN107694034A (en) * 2017-12-04 2018-02-16 陈林 Neck trainer based on virtual reality



Legal Events

Date Code Title Description
PB01 Publication
TA01 Transfer of patent application right

Effective date of registration: 20181112

Address after: 518103 Fuhai Street Ocean Development Zone, Baoan District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen Medical Technology Co., Ltd.

Address before: 710049 Department of Instrument Science and Precision Manufacturing, School of Machinery, Xi'an Jiaotong University, 28 Xianning Road, Beilin District, Xi'an City, Shaanxi Province

Applicant before: Wang Jing

SE01 Entry into force of request for substantive examination
GR01 Patent grant