
WO2018135692A1 - Wearable device for motion recognition and control, and motion recognition control method using the same - Google Patents

Info

Publication number
WO2018135692A1
WO2018135692A1 (PCT Application No. PCT/KR2017/001568)
Authority
WO
WIPO (PCT)
Prior art keywords: control, signal, EMG, gesture, motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2017/001568
Other languages
English (en)
Korean (ko)
Inventor
김범준
이분진
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industry Academic Cooperation Foundation of Keimyung University
Original Assignee
Industry Academic Cooperation Foundation of Keimyung University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industry Academic Cooperation Foundation of Keimyung University
Publication of WO2018135692A1
Anticipated expiration
Current legal status: Ceased

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/389Electromyography [EMG]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813Specially adapted to be attached to a specific body part
    • A61B5/6824Arm or wrist
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7203Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • A61B5/7207Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal of noise induced by motion artifacts
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7225Details of analogue processing, e.g. isolation amplifier, gain or sensitivity adjustment, filtering, baseline or drift compensation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/70Multimodal biometrics, e.g. combining information from different biometric modalities
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/011Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Definitions

  • The present invention relates to a wearable device for motion recognition and control and a motion recognition control method using the same, and more specifically to a wearable, EMG-based control interface realized with a single-channel EMG sensor located in an arm band and an IMU sensor located on the forearm, together with a gesture recognition control method using it.
  • In the field of human-computer interaction (HCI), a variety of research is being conducted on interactions that can enhance users' cognitive, emotional, and everyday experiences.
  • HCI research is devoting great effort to developing user-friendly interfaces based on voice, vision, gestures, and other innovative input/output channels.
  • One of the most challenging approaches in this area is connecting human nerve signals to computers by exploiting the electrical properties of the human nervous system.
  • A variety of biomedical signals can be used to connect nerves and computers, most of which can be obtained from specific body tissues, organs, or the nervous system.
  • A biosignal is an electric or magnetic signal generated in the human body, and typically includes signals such as the electromyogram (EMG), electrocardiogram (ECG), electroencephalogram (EEG), electrooculogram (EOG), and galvanic skin response (GSR).
  • Biosignals have mainly been used for treatment or rehabilitation in the medical field, but recently they are also used to control the operation of computers, machines, robots, and the like by inferring the user's intention, and their range of application in the human-computer interface (HCI) field is widening.
  • To this end, a technology for accurately detecting a biosignal and a technology for accurately inferring the user's intended operation from the detected signal are both very important.
  • A typical biosignal detection apparatus determines that the user has started an operation when a specific parameter of the biosignal exceeds a set threshold, and then performs the control action corresponding to that operation.
  • However, biosignals contain complex noise caused by device-specific noise, electromagnetic radiation, motion noise, and interaction with other tissues.
  • Moreover, the noise contained in a biosignal such as the EMG constantly changes with the surrounding environment, the user's physical condition, and the contact state between the sensor and the body, making it difficult to set an accurate threshold.
  • Accordingly, a method that can accurately determine whether a user operation has occurred is required. Korean Patent Application Publication No. 10-2016-0133306 and Korean Patent No. 10-1000869 are disclosed as related prior art literature.
  • The present invention has been proposed to solve the above problems of previously proposed methods. An object of the present invention is to provide a wearable device comprising an EMG sensor module for measuring EMG signals, an IMU sensor module for measuring motion signals, a control module for performing noise-removal preprocessing, and a computer device unit for recognizing gestures and performing control based on the EMG signal, together with a motion recognition control method using the same, so that the noise contained in the EMG signal used for recognition and control is removed and motion recognition and control become more accurate.
  • According to a feature of the present invention, a wearable device for motion recognition and control comprises: an electromyogram (EMG) sensor module for measuring an EMG signal according to the user's muscle movement and converting the measured EMG signal to digital for output;
  • an IMU (Inertial Motion Unit) sensor module for measuring the user's motion signal and converting it to digital for output;
  • a control (MCU) module configured to receive the EMG signal from the EMG sensor module and the motion signal from the IMU sensor module, and to perform filtering that removes dynamic noise contained in the EMG signal based on the motion signal;
  • and a computer device unit that recognizes the gesture and performs the corresponding control based on the EMG signal.
  • Preferably, the EMG sensor module may include a single-channel EMG sensor located in an arm band worn by the user.
  • More preferably, the EMG sensor module may use a differential amplifier as the amplifier for amplifying the rectified signal.
  • Preferably, the IMU sensor module may be configured with any one of an accelerometer, a gyroscope, and a magnetometer.
  • More preferably, the motion signal may be used as a reference noise signal for correcting the motion artifacts contained in the EMG signal measured by the EMG sensor module.
  • Preferably, the control module may further include a Bluetooth module for transmitting, to the computer device unit, the EMG signal and the motion signal after the preprocessing that removes unnecessary noise, including the dynamic noise contained in the EMG signal.
  • More preferably, the control module may be implemented as a LilyPad Simblee BLE board with integrated low-power Bluetooth 4.0 for low-power communication with the computer device unit.
  • Even more preferably, the EMG sensor module, the IMU sensor module, and the control (MCU) module may be housed in a cylindrical case fastened to the user's forearm so as to be wearable.
  • More preferably, the cylindrical case may be fabricated with a 3D printer.
  • More preferably, the cylindrical case may further include a fastening member that attaches to and detaches from the user's forearm.
  • More preferably, the computer device unit may perform data selection and mapping by referring to a preset data sheet.
  • Even more preferably, the computer device unit may determine and recognize the gesture of the EMG signal based on the data classified through gesture feature extraction and the result obtained through the data selection and mapping process that refers to the preset data sheet, and may execute, on the control application, the operation corresponding to the recognized gesture.
  • Even more preferably, the computer device unit may combine the EMG signal and the IMU signal to detect isometric muscle activity that does not appear as actual motion, classify subtle gestures involving no visible movement, and thus enable control without disturbing the surrounding environment.
  • Even more preferably, the computer device unit may apply the wavelet transform, a time-frequency analysis, for extracting features of the gesture corresponding to the user's muscle movement from the EMG signal.
  • The gesture classification may consist of three classes: a hand-close gesture, a hand-open gesture, and a hand-extension gesture.
  • Even more preferably, the computer device unit may perform the classification using the K-nearest neighbor (KNN) algorithm.
  • According to another feature of the present invention, a motion recognition control method uses a wearable device comprising an EMG sensor module, an IMU sensor module, a control (MCU) module, and a computer device unit, and comprises the steps of:
  • (1) the control module receiving the EMG signal measured by the EMG sensor module;
  • (2) the control module receiving the motion (IMU) signal measured by the IMU sensor module;
  • (3) the control module performing preprocessing that removes unnecessary noise, including the dynamic noise contained in the EMG signal, based on the motion signal, and transmitting the preprocessed EMG signal and motion signal to the computer device unit; and
  • (4) the computer device unit receiving, via short-range wireless communication, the EMG signal and motion signal from which the dynamic noise has been removed in step (3), and recognizing the gesture of the EMG signal and performing the corresponding control through feature extraction, classification, and the data selection and mapping process.
  • Preferably, in step (4), the computer device unit extracts and classifies gesture features corresponding to the user's muscle movement using the EMG signal received from the control module via Bluetooth communication, and, using the motion (IMU) signal, performs data selection and mapping by referring to a preset data sheet in order to distinguish the gesture taken by the user.
  • The gesture of the EMG signal may then be determined and recognized from the classified feature data and the result of the data selection and mapping process, and the operation corresponding to the recognized gesture may be executed on the control application.
  • More preferably, the computer device unit applies the wavelet transform, a time-frequency analysis, for extracting gesture features from the EMG signal, and may perform the classification using the K-nearest neighbor (KNN) algorithm.
  • According to the wearable device for motion recognition and control and the motion recognition control method proposed in the present invention, the device comprises an EMG sensor module for measuring the EMG signal, an IMU sensor module for measuring the motion signal, a control module for performing noise-removal preprocessing, and a computer device unit for recognizing gestures and performing control based on the EMG signal, so that the noise contained in the EMG signal used for recognition and control is removed and motion recognition becomes more accurate.
  • In addition, since the single-channel EMG sensor is located in the user's arm band and the IMU sensor is placed on the forearm, the EMG-based control interface is wearable and can be conveniently attached and detached, and the combination of the EMG signal and the IMU signal makes it possible to accurately recognize subtle gestures involving no movement.
  • FIG. 1 is a block diagram illustrating a configuration of a wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a system structure of a wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 3 is a view illustrating the hand gestures recognized by a wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 4 is a view showing an operating state of the arm in a wearing state of the wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 5 is a view showing another operating state of the arm in a wearing state of the wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 6 is a view showing another operating state of the arm in a wearing state of the wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 7 illustrates training data of a wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 8 is a graph illustrating a recognition score for each gesture of a wearable device for gesture recognition and control according to an embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating an operation flow of a method for controlling motion recognition using a wearable device according to an embodiment of the present invention.
  • S140: the computer device unit recognizing the gesture of the EMG signal and performing the corresponding control through feature extraction, classification, and data selection and mapping, using the EMG signal and the motion signal from which the dynamic noise has been removed.
  • FIG. 1 is a block diagram showing the configuration of a wearable device for gesture recognition and control according to an embodiment of the present invention, FIG. 2 illustrates its system structure, FIG. 3 illustrates the hand gestures it recognizes, and FIGS. 4 to 6 illustrate various operating states of the arm while the device is worn.
  • Referring to FIGS. 1 and 2, the wearable device 100 for motion recognition and control according to an embodiment of the present invention may be configured with an EMG sensor module 110, an IMU sensor module 120, a control (MCU) module 130, and a computer device unit 140.
  • The electromyogram (EMG) sensor module 110 measures an EMG signal according to the user's muscle movement, converts the measured EMG signal into a digital signal, and outputs the digital EMG signal to the control module 130.
  • Such an electromyogram (EMG) sensor module 110 may include a single-channel EMG sensor located in an arm band on the user's arm, as shown in FIGS. 4 to 6.
  • The EMG sensor module 110 is attached to the user's arm; it rectifies the raw EMG signal measured according to the user's muscle movement, amplifies the rectified signal, and filters the amplified signal, after which an A/D converter can be used to convert the filtered EMG signal into a digital signal.
  • Here, the EMG sensor module 110 may use a differential amplifier as the amplifier for amplifying the rectified signal.
  • The IMU sensor module 120 measures the user's motion signal, converts the measured motion signal into a digital signal, and outputs it to the control module 130.
  • The IMU (Inertial Motion Unit) sensor module 120 is an IMU sensor positioned on the user's forearm to measure the user's movement, as shown in FIGS. 4 to 6, and may be configured with any one of an accelerometer, a gyroscope, and a magnetometer.
  • Here, the motion signal may be used as a reference noise signal for correcting the motion artifacts contained in the EMG signal measured by the EMG sensor module 110.
  • The control (MCU) module 130 is a processing module that receives the EMG signal from the EMG sensor module 110 and the motion signal from the IMU sensor module 120, and performs filtering that removes the dynamic motion artifacts contained in the EMG signal based on the motion signal.
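The patent names the motion signal as the reference noise input for this filtering but does not name the filtering algorithm itself. A standard choice for reference-based motion-artifact removal is a least-mean-squares (LMS) adaptive canceller, sketched below under that assumption; `imu_ref` would in practice be, for example, one accelerometer axis resampled to the EMG sampling rate.

```python
import numpy as np

def lms_cancel(emg, imu_ref, order=8, mu=0.01):
    """Adaptive noise cancellation (assumed LMS realization): estimate the
    motion artifact in `emg` from the IMU reference channel and subtract it.
    `emg` and `imu_ref` are float arrays sampled at the same rate."""
    w = np.zeros(order)                  # adaptive filter taps
    clean = np.zeros(len(emg))
    for n in range(order, len(emg)):
        x = imu_ref[n - order:n][::-1]   # most recent reference samples
        artifact_est = w @ x             # estimated motion artifact
        e = emg[n] - artifact_est        # error = cleaned EMG sample
        w += 2 * mu * e * x              # LMS weight update
        clean[n] = e
    return clean
```

The step size `mu` trades convergence speed against stability; both it and the filter order are illustrative defaults.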
  • The control module 130 may further include a Bluetooth module 131 for transmitting, to the computer device unit 140, the EMG signal and the motion signal after the preprocessing that removes unnecessary noise, including the dynamic noise contained in the EMG signal.
  • The control module 130 may be implemented as a LilyPad Simblee BLE board with integrated low-power Bluetooth 4.0 for low-power communication with the computer device unit 140.
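The patent does not define a wire format for this Bluetooth link. Purely as a hypothetical illustration, one preprocessed EMG sample plus a 6-axis IMU reading could be framed as a fixed-size little-endian packet:

```python
import struct

# Hypothetical frame layout (not from the patent): 7 little-endian floats,
# one preprocessed EMG sample plus 3-axis accelerometer and gyroscope readings.
FRAME_FMT = "<7f"
FRAME_SIZE = struct.calcsize(FRAME_FMT)  # 28 bytes per frame

def pack_frame(emg, ax, ay, az, gx, gy, gz):
    return struct.pack(FRAME_FMT, emg, ax, ay, az, gx, gy, gz)

def unpack_frame(buf):
    emg, ax, ay, az, gx, gy, gz = struct.unpack(FRAME_FMT, buf)
    return emg, (ax, ay, az), (gx, gy, gz)

frame = pack_frame(0.12, 0.0, 0.0, 9.81, 0.01, -0.02, 0.0)
print(FRAME_SIZE, unpack_frame(frame))
```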
  • The EMG sensor module 110, the IMU sensor module 120, and the control (MCU) module 130 are housed in a cylindrical case 101, as shown in FIGS. 4 to 6, and can be configured as a wearable unit fastened to the user's forearm.
  • Here, the cylindrical case 101 may be manufactured with a 3D printer, and may further include a fastening member 102 that attaches to and detaches from the user's forearm.
  • The computer device unit 140 receives, via short-range wireless communication, the EMG signal and motion signal from which the dynamic noise has been removed by the control module 130, recognizes the gesture of the EMG signal through feature extraction, classification, and data selection and mapping, and performs the corresponding control.
  • The computer device unit 140 extracts and classifies gesture features corresponding to the user's muscle movements using the EMG signals received from the control module 130 via Bluetooth communication, and, using the motion (IMU) signal, may perform data selection and mapping by referring to a preset data sheet in order to distinguish the gesture taken by the user.
  • The computer device unit 140 determines and recognizes the gesture of the EMG signal based on the classified feature data and the result obtained through the data selection and mapping process referring to the preset data sheet, and the operation corresponding to the recognized gesture may be executed on the control application.
  • In addition, the computer device unit 140 combines the EMG and IMU signals to detect isometric muscle activity that does not appear as actual movement, so that subtle gestures involving no movement can be classified and control can be performed without disturbing the surroundings.
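One plausible way to realize this combination, sketched below, is to flag analysis windows in which the EMG envelope is high while the gyroscope magnitude stays low, i.e. the muscle is active but the limb is still. The window length and both thresholds are illustrative assumptions, not values from the patent.

```python
import numpy as np

def isometric_windows(emg_env, gyro_mag, emg_thresh=0.1, motion_thresh=0.05,
                      fs=1000, win_ms=200):
    """Flag windows where muscles are active (high EMG envelope) while the
    limb is still (low gyroscope magnitude): an isometric contraction."""
    win = int(fs * win_ms / 1000)
    flags = []
    for start in range(0, len(emg_env) - win + 1, win):
        e = emg_env[start:start + win].mean()
        g = gyro_mag[start:start + win].mean()
        flags.append(e > emg_thresh and g < motion_thresh)
    return np.array(flags)
```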
  • The computer device unit 140 may apply the wavelet transform, a time-frequency analysis, for extracting features of the gesture corresponding to the user's muscle movement from the EMG signal.
  • Here, the gesture classification may consist of three classes, a hand-close gesture, a hand-open gesture, and a hand-extension gesture, as shown in FIG. 3.
  • The wavelet transform (WT) is a time-frequency analysis method well suited to analyzing non-stationary signals such as the EMG.
  • An advantage of using the wavelet transform is that the dimension of the feature vector is easy to set, and a low-dimensional feature vector yields nearly the same result as a higher-dimensional one.
  • The wavelet transform was developed from Fourier analysis; it is applied over short time intervals, and the size of the analysis window can be adapted across time and frequency.
  • As a result, the analysis of a high-frequency signal can produce results as reliable as the analysis of a low-frequency band.
  • Since distinct features must be taken from each pattern in order to distinguish the gestures taken by the user, several features can be extracted; among them are general statistical features such as the maximum, minimum, and root mean square.
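As a concrete sketch of this stage, the statistical features listed in FIG. 7 (MAX, MIN, MAV, RMS, STD) can be computed over the sub-bands of a discrete wavelet decomposition. PyWavelets and the 'db4' mother wavelet are assumed choices here; the patent names neither a library nor a specific wavelet.

```python
import numpy as np
import pywt  # PyWavelets; an assumed choice, the patent names no library

def gesture_features(emg_window, wavelet="db4", level=3):
    """Wavelet-domain statistical features for one EMG window:
    MAX, MIN, MAV, RMS, STD per sub-band, concatenated into one vector."""
    coeffs = pywt.wavedec(emg_window, wavelet, level=level)
    feats = []
    for c in coeffs:                          # approximation + detail bands
        feats += [c.max(), c.min(),
                  np.abs(c).mean(),           # MAV: mean absolute value
                  np.sqrt(np.mean(c ** 2)),   # RMS
                  c.std()]                    # STD
    return np.array(feats)

# A 3-level decomposition yields 4 sub-bands, hence a 20-dimensional vector.
print(gesture_features(np.random.randn(256)).shape)  # (20,)
```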
  • The computer device unit 140 extracts gesture features corresponding to the user's muscle movement from the EMG signal and may classify them using the K-nearest neighbor (KNN) algorithm.
  • The support vector machine (SVM) classification algorithm can also be used, in a linear or nonlinear manner via a kernel, but it requires a large amount of training data for accurate results.
  • Comparing the results of the KNN algorithm with those of SVM, experiments show that KNN yields better results than SVM. The reason is that KNN is itself nonlinear, so it can readily handle data distributed either linearly or nonlinearly, and it requires fewer data points to achieve good results.
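A minimal scikit-learn sketch of this KNN-versus-SVM comparison is shown below. The synthetic feature matrix, k = 5, and the RBF kernel are assumptions for illustration; the patent specifies none of them.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# X: one feature vector per EMG window; y: gesture labels 0/1/2
# (hand close / hand open / hand extension). Random data stands in
# for real wavelet features here.
rng = np.random.default_rng(0)
X = rng.normal(size=(90, 20))
y = np.repeat([0, 1, 2], 30)

knn = KNeighborsClassifier(n_neighbors=5)   # k = 5 is an assumed choice
svm = SVC(kernel="rbf")
print("KNN accuracy:", cross_val_score(knn, X, y, cv=5).mean())
print("SVM accuracy:", cross_val_score(svm, X, y, cv=5).mean())
```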
  • FIG. 7 is a diagram illustrating training data of a wearable device for gesture recognition and control according to an embodiment of the present invention, and FIG. 8 is a graph showing the recognition score for each gesture.
  • FIG. 7 is a table showing each gesture group after preprocessing and feature extraction, with the hand-close, hand-open, and hand-extension gestures labeled 0, 1, and 2. After training on the feature data (MAX, MIN, MAV, RMS, STD), the feature that yields the best results is chosen. FIG. 8 reports the score of the model, confirming the accuracy of gesture recognition, and shows the variance together with the score.
  • As shown in the flowchart of FIG. 9, the motion recognition control method may be implemented with the following steps: the control module 130 receives the EMG signal (S110); the control module 130 receives the motion (IMU) signal (S120); the control module 130 transmits the preprocessed EMG signal and motion signal to the computer device unit 140 (S130); and the computer device unit 140 recognizes the gesture of the EMG signal and performs the corresponding control through feature extraction, classification, and data selection and mapping, using the EMG signal and the motion signal from which the dynamic noise has been removed (S140).
  • Here, the wearable device 100 used in the motion recognition control method of the present invention, as shown in FIGS. 1 to 6, comprises the EMG sensor module 110 and the IMU sensor module 120 for recognizing hand gestures, together with the control (MCU) module 130 and the computer device unit 140.
  • Hereinafter, each step of the motion recognition control method using the above-described wearable device 100 will be described in detail.
  • In step S110, the control module 130 receives the EMG signal that is measured by the EMG sensor module 110 from the muscle movement of the user's forearm and converted into a digital signal.
  • Here, the electromyogram sensor module 110 may include a single-channel EMG sensor located in an arm band on the user's arm, as shown in FIGS. 4 to 6.
  • The EMG sensor module 110 is attached to the user's arm; it rectifies the raw EMG signal measured according to the user's muscle movement, amplifies the rectified signal, and filters the amplified signal, after which an A/D converter can be used to convert the filtered EMG signal into a digital signal.
  • Here, the EMG sensor module 110 may use a differential amplifier as the amplifier for amplifying the rectified signal.
  • In step S120, the control module 130 receives the motion (IMU) signal that is measured by the IMU sensor module 120 from the user's movement and converted into a digital signal.
  • The IMU (Inertial Motion Unit) sensor module 120 is an IMU sensor located on the user's forearm to measure the degree of the user's movement, as shown in FIGS. 4 to 6, and may be configured with any one of an accelerometer, a gyroscope, and a magnetometer.
  • In step S130, the control module 130 performs the preprocessing that removes unnecessary noise, including the dynamic noise contained in the EMG signal, based on the motion signal, and transmits the preprocessed EMG signal and motion signal to the computer device unit 140.
  • Here, the control module 130 may further include the Bluetooth module 131 for transmitting the preprocessed EMG signal and motion signal to the computer device unit 140.
  • The control module 130 may be implemented as a LilyPad Simblee BLE board with integrated low-power Bluetooth 4.0 for low-power communication with the computer device unit 140.
  • In step S140, the computer device unit 140 receives, via short-range wireless communication, the EMG signal and motion signal from which the dynamic noise was removed in step S130, recognizes the gesture of the EMG signal through feature extraction, classification, and data selection and mapping, and performs the corresponding control.
  • In step S140, the computer device unit 140 extracts and classifies gesture features corresponding to the user's muscle movements using the EMG signals received from the control module 130 via Bluetooth communication, performs data selection and mapping by referring to a preset data sheet in order to distinguish the gesture taken by the user from the motion (IMU) signal, determines and recognizes the gesture of the EMG signal based on the classified feature data and the result of the data selection and mapping process, and executes the operation corresponding to the recognized gesture on the control application.
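As a toy illustration of the final dispatch step, the recognized class label can be mapped to a control-application action through a lookup table standing in for the preset data sheet. The labels follow FIG. 7; the action names are invented for illustration.

```python
# Hypothetical stand-in for the preset data sheet: gesture class -> action.
GESTURE_ACTIONS = {0: "select",      # hand close
                   1: "release",     # hand open
                   2: "next_item"}   # hand extension

def dispatch(label):
    """Return the control-application action for a recognized gesture."""
    return GESTURE_ACTIONS.get(label, "no-op")

print(dispatch(2))  # -> next_item
```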
  • Here, the computer device unit 140 applies the wavelet transform, a time-frequency analysis, for extracting gesture features corresponding to the user's muscle movement from the EMG signal, and the classification may be performed using the K-nearest neighbor (KNN) algorithm.
  • As described above, the wearable device for motion recognition and control and the motion recognition control method using the same according to embodiments of the present invention comprise an EMG sensor module measuring the EMG signal, an IMU sensor module measuring the motion signal, a control module performing noise-removal preprocessing, and a computer device unit recognizing gestures and performing control based on the EMG signal, thereby removing the noise contained in the EMG signal used for motion recognition and control and enabling more accurate motion recognition.
  • This is achieved by combining the EMG sensor module, the IMU sensor module, and the control module in one case, with the single-channel EMG sensor placed in the user's arm band and the IMU sensor on the forearm.
  • The EMG-based control interface is thus wearable and detachable for convenient use, and by combining the EMG and IMU signals, subtle gestures involving no motion can be accurately recognized.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Signal Processing (AREA)
  • Physiology (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Computer Hardware Design (AREA)
  • Power Engineering (AREA)
  • Dermatology (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

According to the present invention, a wearable device for motion recognition and control and a motion recognition control method using the device comprise: an electromyogram (EMG) sensor module for measuring an EMG signal; an IMU sensor module for measuring a motion signal; a control module performing noise-removal preprocessing; and a computer device unit recognizing and controlling the motion of a gesture based on the EMG signal. Consequently, the noise contained in the EMG signal used for motion recognition and control is removed, which enables more accurate motion recognition.
PCT/KR2017/001568 2017-01-22 2017-02-13 Wearable device for motion recognition and control, and motion recognition control method using the same Ceased WO2018135692A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020170010103A KR101963694B1 (ko) 2017-01-22 2017-01-22 Wearable device for motion recognition and control, and motion recognition control method using the same
KR10-2017-0010103 2017-01-22

Publications (1)

Publication Number Publication Date
WO2018135692A1 (fr) 2018-07-26

Family

ID=62908520

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/001568 Ceased WO2018135692A1 (fr) 2017-01-22 2017-02-13 Dispositif pouvant être porté destiné à la reconnaissance et à la commande de mouvement, et procédé de commande de reconnaissance de mouvement l'utilisant

Country Status (2)

Country Link
KR (1) KR101963694B1 (fr)
WO (1) WO2018135692A1 (fr)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111046731A (zh) * 2019-11-11 2020-04-21 中国科学院计算技术研究所 Transfer learning method and recognition method for gesture recognition based on surface electromyography signals
CN111700718A (zh) * 2020-07-13 2020-09-25 北京海益同展信息科技有限公司 Grip posture recognition method and device, prosthesis, and readable storage medium
WO2020206179A1 (fr) * 2019-04-05 2020-10-08 Baylor College Of Medicine Method and system for detecting and analyzing thoracic outlet syndrome (TOS)
CN113970968A (zh) * 2021-12-22 2022-01-25 深圳市心流科技有限公司 Motion pre-judgment method for an intelligent bionic hand
CN114265498A (zh) * 2021-12-16 2022-04-01 中国电子科技集团公司第二十八研究所 Method combining multimodal gesture recognition with a visual feedback mechanism
CN114972442A (zh) * 2022-05-31 2022-08-30 中国地质大学(武汉) Gesture interaction system and method for a smart wristband
CN115922658A (zh) * 2022-09-13 2023-04-07 郑州大学 Multi-terrain post-disaster detection robot system based on vision-muscle fusion teleoperation
CN119017385A (zh) * 2024-08-26 2024-11-26 武汉恒新动力科技有限公司 Remote-controller-style robot somatosensory control command recognition test device
US12346500B1 (en) 2023-04-17 2025-07-01 Snap Inc. EMG speech signal detection

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102258168B1 (ko) * 2019-11-14 2021-05-28 주식회사 마이뉴런 System and method for monitoring and analyzing the motion of paralyzed patients
KR102327168B1 (ko) 2020-03-02 2021-11-16 국방과학연구소 Human body modeling system and method capable of predicting human body behavior
KR102537429B1 (ko) * 2020-11-27 2023-05-26 (주)로임시스템 Apparatus for dual processing of biosignals
CN113534960B (zh) * 2021-07-29 2024-05-28 中国科学技术大学 Upper-arm prosthesis control method and system based on IMU and surface electromyography signals
EP4369153A4 (fr) 2021-12-09 2025-01-01 Samsung Electronics Co., Ltd. Gesture recognition method using a wearable device, and device therefor
KR20230087297A (ko) * 2021-12-09 2023-06-16 삼성전자주식회사 Gesture recognition method using a wearable device, and apparatus therefor
CN114683292B (zh) * 2022-06-01 2022-08-30 深圳市心流科技有限公司 Sampling frequency control method for an electromyography device, intelligent bionic hand, and storage medium
KR20250014889A (ko) * 2023-07-21 2025-02-03 삼성전자주식회사 Electronic device and method for controlling the electronic device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090326406A1 (en) * 2008-06-26 2009-12-31 Microsoft Corporation Wearable electromyography-based controllers for human-computer interface
US20140368424A1 (en) * 2013-06-17 2014-12-18 Samsung Electronics Co., Ltd. Presentation device and method for operating the device
US20150057770A1 (en) * 2013-08-23 2015-02-26 Thalmic Labs Inc. Systems, articles, and methods for human-electronics interfaces
KR20150112741A (ko) * 2014-03-27 2015-10-07 전자부품연구원 Wearable device and information input method using the same
KR20150123254A (ko) * 2013-02-22 2015-11-03 탈믹 랩스 인크 Method and device for combining muscle activity sensor signals and inertial sensor signals for gesture-based control

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090326406A1 (en) * 2008-06-26 2009-12-31 Microsoft Corporation Wearable electromyography-based controllers for human-computer interface
KR20150123254A (ko) * 2013-02-22 2015-11-03 탈믹 랩스 인크 Method and device for combining muscle activity sensor signals and inertial sensor signals for gesture-based control
US20140368424A1 (en) * 2013-06-17 2014-12-18 Samsung Electronics Co., Ltd. Presentation device and method for operating the device
US20150057770A1 (en) * 2013-08-23 2015-02-26 Thalmic Labs Inc. Systems, articles, and methods for human-electronics interfaces
KR20150112741A (ko) * 2014-03-27 2015-10-07 전자부품연구원 Wearable device and information input method using the same

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020206179A1 (fr) * 2019-04-05 2020-10-08 Baylor College Of Medicine Method and system for detecting and analyzing thoracic outlet syndrome (TOS)
CN111046731A (zh) * 2019-11-11 2020-04-21 中国科学院计算技术研究所 Transfer learning method and recognition method for gesture recognition based on surface electromyography signals
CN111700718A (zh) * 2020-07-13 2020-09-25 北京海益同展信息科技有限公司 Grip posture recognition method and device, prosthesis, and readable storage medium
CN114265498A (zh) * 2021-12-16 2022-04-01 中国电子科技集团公司第二十八研究所 Method combining multimodal gesture recognition with a visual feedback mechanism
CN114265498B (zh) * 2021-12-16 2023-10-27 中国电子科技集团公司第二十八研究所 Method combining multimodal gesture recognition with a visual feedback mechanism
CN113970968A (zh) * 2021-12-22 2022-01-25 深圳市心流科技有限公司 Motion pre-judgment method for an intelligent bionic hand
CN113970968B (zh) * 2021-12-22 2022-05-17 深圳市心流科技有限公司 Motion pre-judgment method for an intelligent bionic hand
CN114972442A (zh) * 2022-05-31 2022-08-30 中国地质大学(武汉) Gesture interaction system and method for a smart wristband
CN115922658A (zh) * 2022-09-13 2023-04-07 郑州大学 Multi-terrain post-disaster detection robot system based on vision-muscle fusion teleoperation
US12346500B1 (en) 2023-04-17 2025-07-01 Snap Inc. EMG speech signal detection
CN119017385A (zh) * 2024-08-26 2024-11-26 武汉恒新动力科技有限公司 Remote-controller-style robot somatosensory control command recognition test device

Also Published As

Publication number Publication date
KR101963694B1 (ko) 2019-03-29
KR20180086547A (ko) 2018-08-01

Similar Documents

Publication Publication Date Title
WO2018135692A1 (fr) Wearable device for motion recognition and control, and motion recognition control method using the same
Usakli et al. Design of a novel efficient human–computer interface: An electrooculogram based virtual keyboard
EP1374765B1 (fr) Terminal mobile pour la mesure d'un signal biologique
CN105022488B (zh) 基于ssvep脑电电位的无线bci输入系统
WO2018093131A1 (fr) Dispositif pour mesurer l'apnée du sommeil et procédé associé
US7963931B2 (en) Methods and devices of multi-functional operating system for care-taking machine
KR101218200B1 (ko) 인체 장착형 센서셋 및 그 동작 방법
EP1304072A3 (fr) Méthode et dispositif pour la comparaison sérielle des électrocardiogrammes
WO2017192010A1 (fr) Appareil et procédé d'extraction de caractéristique cardiovasculaire
EP3288447A1 (fr) Appareil et procédé d'extraction de caractéristique cardiovasculaire
CN107193374B (zh) 一种主动故意手势动作的检测装置及检测方法
WO2011093557A1 (fr) Dispositif et procédé pour l'extraction de caractéristique de signaux biométriques
WO2017099340A1 (fr) Dispositif électronique, son procédé de traitement de signal, système de mesure de signal biologique, et support d'enregistrement lisible par ordinateur de manière non transitoire
CN113951897A (zh) 一种多模态静息脑电数据干扰消除和标记方法及装置
CN112083797A (zh) 一种基于眼电的智能手持终端控制系统及控制方法
WO2023096417A1 (fr) Dispositif électronique et procédé de commande associé
KR20240109400A (ko) 근전도 및 가속도계를 이용한 심전도 동잡음 감소 방법 및 장치
WO2015178549A1 (fr) Procédé et appareil pour la fourniture d'un service de sécurité en utilisant un seuil de stimulation ou moins
Brahmaiah et al. Data acquisition system of electrooculogram
JP7607378B2 (ja) 情報処理方法
CN110033772B (zh) 基于ppg信号的非声学语音信息检测装置
WO2022260228A1 (fr) Procédé et dispositif de détection automatique d'une section de signal de bruit
WO2019156289A1 (fr) Dispositif électronique et son procédé de commande
EP4181769B1 (fr) Appareil de mesure de bio-potentiel
WO2011013881A1 Portable digital esophageal stethoscope system and method for examining the condition of a patient using the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17892127

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17892127

Country of ref document: EP

Kind code of ref document: A1