
WO2018141409A1 - Initiating a control operation in response to a head gesture - Google Patents

Initiating a control operation in response to a head gesture

Info

Publication number
WO2018141409A1
WO2018141409A1 PCT/EP2017/052515
Authority
WO
WIPO (PCT)
Prior art keywords
eeg
computing device
head
sensor data
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2017/052515
Other languages
English (en)
Inventor
Matthew John LAWRENSON
Jan Jasper VAN DEN BERG
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Telefonaktiebolaget LM Ericsson AB
Original Assignee
Telefonaktiebolaget LM Ericsson AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Telefonaktiebolaget LM Ericsson AB filed Critical Telefonaktiebolaget LM Ericsson AB
Priority to PCT/EP2017/052515 priority Critical patent/WO2018141409A1/fr
Publication of WO2018141409A1 publication Critical patent/WO2018141409A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038Indexing scheme relating to G06F3/038
    • G06F2203/0381Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer

Definitions

  • the invention relates to a computing device for initiating a control operation in response to a head gesture performed by a user of the computing device, a method of initiating a control operation in response to a head gesture performed by a user of a computing device, a corresponding computer program, and a corresponding computer program product.
  • touch control is the primary interaction method for most users and is used to control the operation of, and interact with, various types of computing devices.
  • mobile computing devices such as smartphones and tablets, which users control through buttons or a touchscreen displaying a graphical user-interface.
  • to avoid input of unintentional commands, the possibility to interact with the touchscreen is disabled when the device is not intentionally being used.
  • touchscreen-based devices are oftentimes able to detect that they are stored in a pocket, or held to a person's ear, and the touchscreen is disabled accordingly.
  • Known touch-control solutions typically require a user to touch a touchscreen several times, or to press several buttons, in order to activate a function. In addition, they oftentimes require the user to be looking at the screen. This prevents or impedes control of the computing devices when the device is stored in a pocket, or in situations where the user's vision or hands are occupied, such as when the user is riding a bike or driving a car.
  • Voice control solutions offer an alternative way of interacting with computing devices, and are based on voice recognition software to interpret commands spoken by a user. These solutions suffer from the disadvantage that a user first needs to activate the voice control functionality of the device. For example, Apple's Siri is activated by pressing the home button for two seconds. Alternative voice activation techniques may require the user to utter a key phrase, such as "OK Google" or "Alexa", in order to activate the voice control function of a computing device.
  • Voice control may also constitute a security risk, as voice control commands may be mimicked with malicious intent, e.g., by giving voice commands to a device through hidden audio embedded in a YouTube video (see, e.g., "Hidden Voice Commands", by N. Carlini, P. Mishra, T. Vaidya, Y. Zhang, M. Sherr, C. Shields, D. Wagner, and W. Zhou, Proceedings of the 25th USENIX Security Symposium, pages 513-530, USENIX Association, 2016).
  • many people do not like to talk to devices in public, partly due to privacy concerns arising from speaking commands aloud.
  • EEG: Electroencephalogram, a technique for measuring brain waves
  • BCIs: Brain-Computer Interfaces
  • a computing device for initiating a control operation.
  • the control operation is initiated in response to a head gesture performed by a user of the computing device.
  • the computing device comprises processing means operative to acquire EEG sensor data from EEG sensors contacting a skin of the user, and to detect a characteristic EEG data pattern in the acquired EEG data.
  • the characteristic EEG data pattern is associated with a control mode.
  • the processing means is further operative to acquire motion sensor data from at least one motion sensor attached to the head, and to detect a characteristic movement of the head. The characteristic movement is commensurate with the performed head gesture.
  • the processing means is further operative to initiate a control operation which is associated with the performed head gesture.
  • a method of initiating a control operation is provided.
  • the control operation is initiated in response to a head gesture performed by a user of a computing device.
  • the method comprises acquiring EEG sensor data from EEG sensors contacting a skin of the user, and detecting a characteristic EEG data pattern in the acquired EEG data.
  • the characteristic EEG data pattern is associated with a control mode.
  • the method further comprises acquiring motion sensor data from at least one motion sensor attached to the head, and detecting a characteristic movement of the head. The characteristic movement is commensurate with the performed head gesture.
  • the method further comprises initiating a control operation which is associated with the performed head gesture.
  • a computer program comprises computer-executable instructions for causing a device to perform the method according to an embodiment of the second aspect of the invention, when the computer-executable instructions are executed on a processing unit comprised in the device.
  • a computer program product comprises a computer-readable storage medium which has the computer program according to the third aspect of the invention embodied therein.
  • the invention makes use of an understanding that an improved solution for enabling a user of a computing device to control an operation of the computing device, or a controlled device which is separate from the computing device, may be achieved by initiating a control operation in dependence on a combination of a mental gesture imagined by the user and a head gesture performed by the user.
  • the detected characteristic EEG pattern is commensurate with the mental gesture imagined by the user, and the characteristic movement of the head is commensurate with the head gesture performed by the user.
  • the user may control an operation or function of the computing device, or a controlled device, by imagining a specific mental gesture to enter a control mode which enables the user to initiate a control operation using head gestures, e.g., a change in orientation of the head and/or a rotation of the head.
  • the mental gesture imagined by the user may, e.g., be a spinning cube, a face, or similar.
  • the mental gesture which is associated with the control mode is selected so as to be easily detectable.
  • several control modes may be defined, corresponding to different control operations such as changing a volume setting and changing a brightness setting, respectively.
  • each control mode is associated with a different mental gesture and a corresponding characteristic EEG data pattern. For instance, the user may activate volume control by thinking about a cube, whereas brightness control is activated by thinking about a face.
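The two-stage gating described above (a detected mental gesture selects a control mode, and only then can a head gesture trigger an operation) can be summarized in a short sketch. The following Python is purely illustrative, not the implementation disclosed in the application; the sensor-reading and classifier callables are hypothetical placeholders:

```python
from typing import Callable, Dict, Optional

def run_control_step(
    read_eeg_window: Callable[[], object],
    read_motion_window: Callable[[], object],
    classify_mental_gesture: Callable[[object], Optional[str]],
    classify_head_gesture: Callable[[object], Optional[str]],
    control_modes: Dict[str, Dict[str, Callable[[], None]]],
) -> None:
    """One pass of the two-stage gating: EEG pattern first, head gesture second."""
    mode = classify_mental_gesture(read_eeg_window())  # e.g., 'cube' or 'face'; None if no match
    if mode not in control_modes:
        return  # characteristic EEG data pattern not detected; stay out of control mode
    gesture = classify_head_gesture(read_motion_window())  # e.g., 'tilt_forward'
    operation = control_modes[mode].get(gesture)
    if operation is not None:
        operation()  # initiate the control operation associated with the head gesture
```

Here control_modes could, for instance, map 'cube' to volume-control gestures and 'face' to brightness-control gestures, mirroring the example above.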
  • embodiments of the invention relieve the user from retrieving the computing device from a pocket and/or touching the computing device.
  • a more reliable control of an operation of the computing device, or a controlled device, is achieved in comparison with solutions relying on EEG data alone.
  • embodiments of the invention provide an improved security over solutions relying on voice control, as it is more difficult for an adversary to initiate a control operation.
  • embodiments of the invention are advantageous in situations where privacy and etiquette are a concern, and render the technique insensitive to acoustically noisy environments.
  • control operation may, e.g., relate to a communication session, such as a voice call, a video call, a chat session, or the like.
  • control operation may relate to execution of a software application or app, to rendering of media, or to a user-interface of the computing device or the controlled device.
  • the computing device, or the controlled device may, e.g., be a mobile phone, a smartphone, a tablet, a personal computer, a laptop, a smartwatch, a wearable, a digital camera, or a television.
  • control operation may effect changing a volume of media play-out, changing a brightness of a display, scrolling a page, moving a cursor, changing a setting associated with the device or a software application, answering an incoming call, and so forth.
  • the computing device or the controlled device may also be a vehicle, such as a car or an Unmanned Aerial Vehicle (UAV), aka drone, and the control operation may relate to controlling the vehicle, e.g., steering the vehicle.
  • the characteristic EEG data pattern is derived during a calibration phase.
  • one or more specific mental gestures may be identified which can be reliably detected in the acquired EEG sensor data.
  • the characteristic EEG patterns which are triggered by certain mental gestures vary between individuals and are preferably learned for each user.
  • a comparison of acquired EEG sensor data to reference EEG sensor data which was acquired during the calibration phase is subsequently used for classifying the acquired EEG sensor data, in particular to detect whether the user has imagined a specific mental gesture which is associated with the control mode.
  • the user is notified in response to detecting the characteristic EEG pattern.
  • the computing device may emit an audible signal, display a visual notification, or provide the user with haptic feedback. Thereby the user is notified, or alerted, that the computing device has successfully detected the mental gesture which is associated with the control mode.
  • the computing device further comprises EEG sensors arranged for contacting a skin of the user, and at least one motion sensor for measuring a movement of the head. That is, the EEG sensors and the motion sensors are provided with the computing device.
  • the computing device may, e.g., be a BCI headset, an in-ear EEG device, or an around-ear EEG device.
  • the computing device further comprises a communications module, and the EEG sensor data and the motion sensor data are acquired by receiving the EEG sensor data and the motion sensor data, via the communications module, from the EEG sensors and the at least one motion sensor, respectively.
  • the computing device may receive the EEG sensor data and the motion sensor data from a BCI headset, an in-ear EEG device, or an around-ear EEG device.
  • Fig. 1 shows computing devices for initiating a control operation in response to a head gesture performed by a user of the computing device, in accordance with embodiments of the invention.
  • Fig. 2 shows a computing device for initiating a control operation in response to a head gesture performed by a user of the computing device, in accordance with another embodiment of the invention.
  • Fig. 3 shows a computing device for initiating a control operation in response to a head gesture performed by a user of the computing device, in accordance with a further embodiment of the invention.
  • Fig. 4 illustrates answering an incoming call, as an example of a control operation initiated by an embodiment of the invention.
  • Fig. 5 illustrates scrolling a page, as another example of a control operation initiated by an embodiment of the invention.
  • Fig. 6 shows an embodiment of the processing means comprised in the computing device for initiating a control operation in response to a head gesture performed by a user of the computing device.
  • Fig. 7 shows another embodiment of the processing means comprised in the computing device for initiating a control operation in response to a head gesture performed by a user of the computing device.
  • Fig. 8 shows a method of initiating a control operation in response to a head gesture performed by a user of a computing device, in accordance with embodiments of the invention.
  • an embodiment 100 of the computing device for initiating a control operation in response to a head gesture performed by a user 110 of the computing device is illustrated as a tablet, a smartphone, or a phablet (a device which is intermediate in size between that of a smartphone and that of a tablet).
  • Computing device 100 is illustrated to comprise a processing means 103, a communications module 104, and a display 106, e.g., a touchscreen. It will be appreciated that computing device 100 may further comprise additional components, e.g., a microphone, a loudspeaker 105, or the like.
  • Communications module 104 is operative to effect wireless communications. For example, communications module 104 may be operative to effect wireless communications with a Radio Access Network (RAN) or with another compatible device, based on a cellular telecommunications technique such as the Global System for Mobile communications (GSM), Universal Mobile Telecommunications System (UMTS), Long Term Evolution (LTE), or any 5G standard, e.g., Next Generation (NG) or New Radio (NR).
  • Computing device 100 is operative to acquire EEG sensor data from a set of EEG sensors 101 contacting a skin of user 110.
  • EEG is a technique which can be used for detecting a subject's brain activity by placing sensors, i.e., electrodes, on the subject's scalp or other parts of the subject's head, e.g., within the ear channel and around the ear ("The neurophysiological bases of EEG and EEG measurement: a review for the rest of us", by A. F. Jackson and D. J. Bolger, Psychophysiology, vol. 51, pages 1061-1071, Wiley, 2014).
  • Electrodes are used for measuring small electric potentials which are generated by action potentials of firing neurons, which are electrochemical excitations caused by the creation of an ion current in the cell's axon to activate connected cells through the synapses.
  • While the most common method to capture EEG signals is to place the electrodes directly on the scalp of the subject, similar to what is illustrated in Fig. 1, it has recently been demonstrated that EEG signals from within the subject's ear channel can be detected and reliably classified ("Classifying Mental Gestures with In-Ear EEG", by N. Merrill, M. T. Curran, J.-K. Yang, and J. Chuang, IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pages 130-135, IEEE, 2016).
  • the EEG sensor data may be acquired from a BCI headset 150 which user 110 is wearing, comprising EEG sensors 101, i.e., electrodes, which are arranged for contacting a scalp and/or forehead of user 110 and capturing nerve signals from user 110.
  • the EEG sensor data may, e.g., be acquired by receiving the EEG sensor data from BCI headset 150 via communications module 104 and a compatible communications module 104 comprised in BCI headset 150.
  • Computing device 100 is further operative to detect a characteristic EEG data pattern in the acquired EEG data.
  • the characteristic EEG data pattern is associated with a control mode which allows user 110 to control computing device 100, or a separate controlled device, using head gestures.
  • the detected characteristic EEG pattern is commensurate with a specific mental gesture imagined by user 110, such as a spinning cube, a face, or the like. This specific mental gesture is preferably determined during a learning or calibration phase, as the ability to reliably detect mental gestures imagined by an individual varies between subjects.
  • EEG sensor data is acquired from EEG sensors 101 while user 110 is imagining a mental gesture which is selected to activate the control mode of computing device 100, i.e., which allows user 110 to control computing device 100, or a separate controlled device, by means of head gestures.
  • the thereby acquired EEG sensor data is stored as reference EEG sensor data.
  • the characteristic EEG data pattern is detected by comparing EEG sensor data acquired from EEG sensors 101 to the stored reference EEG sensor data, and determining whether a similarity condition between the two sets of EEG sensor data is fulfilled. In practice, this may be achieved by calculating a correlation between the two data sets and comparing the calculated correlation to a threshold value.
  • correlation is a statistical relationship which reflects the extent to which two random variables, such as the acquired EEG sensor data and stored reference EEG sensor data, are related to each other.
  • the correlation between two random variables is commonly referred to as cross-correlation and can be quantified by means of a correlation function, which can be expressed as an integral over the two random variables over time.
  • correlation functions are normalized such that a perfect correlation between the two random variables, i.e., when the two random variables are identical, results in a maximum value which oftentimes is chosen to be equal to one ("1").
  • the correlation of two completely independent random variables yields a correlation value of zero ("0").
  • An example is the well-known Pearson product-moment correlation coefficient.
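As a concrete illustration of this similarity test, the sketch below correlates an acquired EEG window with the stored reference and thresholds the Pearson coefficient; the 0.8 threshold is an illustrative assumption, not a value from the application:

```python
import numpy as np

def matches_reference(acquired: np.ndarray, reference: np.ndarray,
                      threshold: float = 0.8) -> bool:
    """Detect the characteristic EEG data pattern by correlating the acquired
    EEG window (1-D, same length as the reference) with the stored reference
    EEG sensor data and comparing the result to a threshold value."""
    r = np.corrcoef(acquired, reference)[0, 1]  # Pearson product-moment coefficient
    return r >= threshold
```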
  • a more reliable detection of the characteristic EEG pattern in the acquired EEG sensor data may be achieved by utilizing a machine-learning algorithm which is trained during a calibration or learning phase, and using the trained machine-learning algorithm for classifying EEG sensor data which is acquired during normal operation of computing device 100 ("Machine-Learning-Based Coadaptive Calibration for Brain-Computer Interfaces", by C. Vidaurre, C. Sannelli, K.-R. Müller, and B. Blankertz, Neural Computation, vol. 23, pages 791-816, MIT Press Journals, 2011).
  • the characteristic EEG pattern may be detected by means of a Support Vector Classifier (SVC), a technique which has been demonstrated to work reliably even with in-ear EEG ("Classifying Mental Gestures with In-Ear EEG", by N. Merrill, M. T. Curran, J.-K. Yang, and J. Chuang, IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pages 130-135, IEEE, 2016).
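A minimal sketch of such a classifier using scikit-learn's SVC follows; the feature dimensionality, labels, and random placeholder data are assumptions, and in practice the feature rows would be extracted from calibration-phase EEG recordings:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder calibration data: one row of EEG features per window
# (e.g., per-channel band powers), with one mental-gesture label per row.
rng = np.random.default_rng(0)
X_calib = rng.standard_normal((60, 16))
y_calib = np.array(["cube", "face", "rest"] * 20)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_calib, y_calib)              # trained during the calibration phase

x_live = rng.standard_normal((1, 16))  # features of a live EEG window
print(clf.predict(x_live))             # e.g., ['rest']
```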
  • Computing device 100 is further operative to acquire motion sensor data from at least one motion sensor 102 attached to the head, and detect a characteristic movement of the head which is commensurate with the performed head gesture.
  • the motion sensor data may, e.g., be acquired by receiving the motion sensor data from BCI headset 150, via communications module 104 and a compatible communications module 104 comprised in BCI headset 150.
  • Motion sensor 102 may, e.g., comprise one or more inertial sensors, accelerometers, gyroscopes, or any combination thereof.
  • accelerometers of the type which are provided with today's mobile phones, smartphones, and tablets, can be used to detect the orientation of the device.
  • the gyroscope adds an additional dimension to the information supplied by the accelerometer by tracking a rotation or twist of the device. More specifically, an accelerometer measures a linear acceleration, whereas a gyroscope measures an angular rotational velocity.
  • a change in orientation and/or rotation of a device can be reliably determined.
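As an illustration, a single accelerometer sample already suffices to estimate the two tilt angles relative to gravity, whereas detecting a rotation about the vertical axis requires the gyroscope. The axis convention in this sketch is an assumption:

```python
import math

def pitch_roll_from_accel(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Estimate head tilt (in radians) from one 3-axis accelerometer sample,
    using gravity as the reference; assumes x points forward, y left, z up."""
    pitch = math.atan2(-ax, math.hypot(ay, az))  # tilting forwards/backwards (112)
    roll = math.atan2(ay, az)                    # tilting left/right (113)
    return pitch, roll
```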
  • the accuracy of the acquired motion sensor data may be improved by employing differential measurements between at least two motion sensors 102 which are attached at separate locations on the head of user 110, e.g., one motion sensor 102 on the left side of the head and one motion sensor 102 on the right side of the head, e.g., close to the ears.
  • motion sensor 102 comprised in BCI headset 150 which is worn by user 110 can be used to determine whether user 110 is performing a specific head gesture which, e.g., involves tilting the head backwards or forwards 112, tilting the head left or right 113, rotating the head clockwise or counter-clockwise 114, or any combination thereof. The different movements may either be performed concurrently or in sequence; e.g., a head gesture may involve tilting the head forwards followed by a clockwise rotation of the head.
  • the detection of a characteristic movement of the head which is commensurate with the performed head gesture is achieved by comparing the acquired motion sensor data to reference motion sensor data, and concluding that the characteristic movement of the head is detected if the acquired motion sensor data and the reference motion sensor data fulfil a similarity condition.
  • the reference motion sensor data may, e.g., be obtained and stored by computing device 100 during a learning or calibration phase, by acquiring motion sensor data from motion sensor 102 while user 110 performs a head gesture which he or she wishes to use for initiating an associated control operation.
  • different sets of reference motion sensor data corresponding to multiple head gestures may be obtained and stored by computing device 100 so as to enable user 110 to initiate different control operations, as is described further below.
  • the comparison of the acquired motion sensor data and the reference motion sensor data may, e.g., be performed by calculating a correlation between the two data sets, as is described hereinbefore, and comparing the calculated correlation to a threshold value.
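The comparison can be sketched as follows: the acquired motion trace is correlated with each reference trace recorded during calibration, and the best match is reported if it clears a similarity threshold (the 0.7 value is illustrative):

```python
import numpy as np

def detect_head_gesture(acquired: np.ndarray,
                        references: dict[str, np.ndarray],
                        threshold: float = 0.7) -> str | None:
    """Match an acquired motion trace (1-D, fixed length) against per-gesture
    reference traces; return the best-matching gesture label, or None if no
    correlation exceeds the threshold."""
    best_label, best_r = None, threshold
    for label, reference in references.items():
        r = np.corrcoef(acquired, reference)[0, 1]
        if r > best_r:
            best_label, best_r = label, r
    return best_label
```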
  • Computing device 100 is further operative to initiate a control operation which is associated with the performed head gesture.
  • the control operation may control an operation of computing device 100, or an operation of a controlled device which is separate from computing device 100, as is described in further detail below.
  • the control operation may, e.g., comprise controlling rendering or play-out of media, such as music or video, as is illustrated in Fig. 1 by a media control center 120 which is displayed by computing device 100.
  • a certain head gesture may initiate changing a media track, such as a song (e.g., "Born to run"), which is currently rendered by computing device 100 and/or played-out using loudspeaker 105 or earphones 105 operatively connected to computing device 100.
  • the initiated control operation has a similar effect as utilizing a user-interface element 121 or 122 for skipping to the next or the preceding track, respectively.
  • the performed head gesture may initiate increasing or decreasing a volume of audio which is currently played-out by computing device 100, similar to utilizing a user-interface element 123 or 124 for increasing or decreasing the volume, respectively.
  • Each head gesture, being commensurate with a detectable characteristic movement of the head, may be associated with a different control operation. For instance, tilting the head forward may increase the volume, whereas tilting the head backward decreases the volume. Likewise, tilting the head to the right may skip to the next track, whereas tilting the head to the left skips to a preceding track.
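Such an association is naturally expressed as a lookup table, which could back the control_modes mapping in the earlier sketch; the gesture names and print placeholders are hypothetical:

```python
def volume_up():   print("volume up")
def volume_down(): print("volume down")
def next_track():  print("skip to next track")
def prev_track():  print("back to preceding track")

# One control mode's gesture-to-operation association (illustrative only).
MEDIA_CONTROL_GESTURES = {
    "tilt_forward":  volume_up,
    "tilt_backward": volume_down,
    "tilt_right":    next_track,
    "tilt_left":     prev_track,
}

MEDIA_CONTROL_GESTURES["tilt_right"]()  # prints: skip to next track
```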
  • control operation may comprise initiating a communication session, such as a voice call, a video call, a chat, or any other type of communication session received by computing device 100 or a separate controlled device.
  • control operation may comprise answering a request for establishing a communication session, or terminating a communication session.
  • computing device 100 may be operative to answer an incoming call in response to a first head gesture, such as nodding the head in a way a person would do to indicate confirmation or approval (a silent "Yes"), instead of touching a user-interface element 421.
  • computing device 100 may be operative to reject or deny an incoming call in response to a second head gesture, such as shaking the head in a way a person would do to indicate disagreement (a silent "No"), instead of touching a corresponding user-interface element.
  • the control operation may comprise controlling a user-interface of computing device 100, or a separate controlled device.
  • computing device 100 may be operative to scroll a displayed page 520 upwards in response to a first head gesture, such as tilting the head backward, instead of touching a user-interface element 521, and to scroll page 520 downwards in response to a second head gesture, such as tilting the head forward, instead of touching a user-interface element 522.
  • computing device 100 may be operative to move a cursor of a user-interface, or a focus selecting a user-interface element, in response to a head gesture.
  • the control operation may alternatively relate to any other operation or function of computing device 100, or a separate controlled device, and may, e.g., comprise starting execution of an application, i.e., a software application or app, stopping execution of an application, changing a setting of the computing device, a controlled device, or an application, and so forth.
  • one or more head gestures in combination with a mental gesture may be used for adjusting the brightness of display 106, or waking computing device 100 or a separate controlled device from sleep-mode.
  • Computing device 100 may optionally be operative to derive a measure which is related to the performed head gesture from the acquired motion sensor data, such as the distance, or angle, travelled when tilting or rotating the head.
  • the initiated control operation may additionally be dependent on the derived measure. For instance, if the control mode allows user 110 to control the volume of audio played-out by computing device 100, the change in volume which is effected by initiating the control operation may be dependent on the derived measure, e.g., the angle by which the head is tilted forwards or backwards.
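A sketch of such measure-dependent control is given below, where the volume step is scaled by the tilt angle derived from the motion sensor data; the 45-degree full-scale angle and the maximum step of 10 are illustrative assumptions:

```python
import math

def volume_delta_from_pitch(pitch_rad: float,
                            max_delta: int = 10,
                            full_scale: float = math.radians(45)) -> int:
    """Scale the volume change by how far the head is tilted: a slight nod
    yields a small step, a full 45-degree tilt yields the maximum step."""
    fraction = max(-1.0, min(1.0, pitch_rad / full_scale))
    return round(fraction * max_delta)

print(volume_delta_from_pitch(math.radians(20)))   # -> 4   (tilt forward: louder)
print(volume_delta_from_pitch(math.radians(-45)))  # -> -10 (tilt backward: quieter)
```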
  • Computing device 100 may optionally be operative to notify, or alert, user 110 in response to detecting the characteristic EEG pattern. For instance, computing device 100 may emit an audible signal, display a visual notification, or provide haptic feedback, e.g., a vibration. Thereby, user 110 is notified that computing device 100 has successfully detected the mental gesture which is associated with the control mode, and has entered the control mode.
  • computing device 100 may be operative to acquire the motion sensor data in response to detecting the characteristic EEG pattern. This is advantageous in that motion sensor data is only acquired and analyzed when the intention of user 110 to enter the control mode has been detected. As an alternative, motion sensor data may be acquired concurrently with the EEG sensor data.
  • control operation is initiated in response to detecting the characteristic EEG data pattern in the acquired EEG data in combination with detecting the characteristic movement of the head.
  • different control modes correspond to different control operations, such as changing volume, changing brightness, changing a currently played track, and so forth.
  • Each control mode is associated with a specific mental gesture and a corresponding characteristic EEG data pattern. For instance, thinking about a cube may activate volume control, whereas thinking about a face activates brightness control.
  • a comparison between each one of a set of reference EEG sensor data and the acquired EEG sensor data is performed. Depending on which reference EEG sensor data best matches the acquired EEG sensor data, i.e., is most similar, the corresponding control mode is activated or entered.
  • the reference EEG sensor data which matches the acquired EEG sensor data best may, e.g., be determined by calculating a correlation between the acquired EEG sensor data and each reference EEG sensor data, and activating the control mode which corresponds to the reference EEG sensor data which has the highest correlation with the acquired EEG sensor data.
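In sketch form, mode selection is an argmax over the per-reference correlations; in practice the winning correlation would typically also have to clear a threshold, as in the single-mode case above:

```python
import numpy as np

def select_control_mode(acquired: np.ndarray,
                        references: dict[str, np.ndarray]) -> str:
    """Activate the control mode whose reference EEG sensor data has the
    highest correlation with the acquired EEG sensor data."""
    labels = list(references)
    correlations = [np.corrcoef(acquired, references[k])[0, 1] for k in labels]
    return labels[int(np.argmax(correlations))]

# e.g., select_control_mode(window, {"volume": ref_cube, "brightness": ref_face})
```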
  • the control mode which the user intends to activate may be identified by machine-learning techniques.
  • If computing device 100 is operative to notify user 110 in response to detecting the characteristic EEG pattern, computing device 100 may further be operative to provide an indication of the activated control mode to user 110, e.g., whether volume control or brightness control is activated.
  • the EEG sensor data may be acquired from an in-ear EEG device 200, as is illustrated in Fig. 2.
  • In-ear EEG device 200 is designed for insertion into the ear channel of ear 111, similar to an earbud, and comprises EEG sensors or electrodes 101 which are arranged for contacting the skin of user 110 inside the ear channel.
  • EEG sensor data captured by in-ear EEG devices and the reliable classification of the EEG sensor data using SVCs has recently been demonstrated ("Classifying Mental Gestures with In-Ear EEG", by N. Merrill, M. T. Curran, J.-K. Yang, and J. Chuang, IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pages 130-135, IEEE, 2016).
  • In-ear EEG devices are commercially available, e.g., "Aware" from United Sciences (http://efitaware.com/, retrieved on 1 December 2016).
  • In-ear EEG device 200 further comprises at least one motion sensor 102, similar to what is described with reference to motion sensor 102 comprised in computing device 100.
  • the EEG sensor data and the motion sensor data may be acquired from in-ear EEG device 200 via communications module 104 and a compatible communications module 104 comprised in in-ear EEG device 200.
  • the EEG sensor data may be acquired from an around-ear EEG device 300, as is illustrated in Fig. 3.
  • Around-ear EEG device 300 comprises EEG sensors or electrodes 101 which are arranged for contacting the skin around ear 111 of user 110.
  • the use of EEG sensor data from around-ear EEG devices has, e.g., been reported in "Target Speaker Detection with Concealed EEG Around the Ear", by B. Mirkovic, M. G. Bleichner, M. De Vos, and S. Debener, Frontiers in Neuroscience, vol. 10, 2016.
  • Around-ear EEG device 300 further comprises at least one motion sensor 102, similar to what is described with reference to motion sensor 102 comprised in computing device 100.
  • the EEG sensor data and the motion sensor data may be acquired from around-ear EEG device 300 via communications module 104 and a compatible communications module 104 comprised in around-ear EEG device 300.
  • the raw EEG signals captured by the EEG sensors may be processed to derive EEG sensor data which is suitable for detecting the characteristic EEG data pattern.
  • This may, e.g., be achieved using a decoder which implements a transfer function for transforming the raw EEG signals captured by several EEG electrode channels into one time-dependent function.
  • the signal representing the one time-dependent function may, e.g., be used as input for calculating a correlation between the acquired EEG sensor data and the reference EEG sensor data.
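A minimal sketch of such a decoder is a linear spatial filter: the multi-channel raw EEG is collapsed into one time-dependent signal by a weighted sum over channels, with the weights chosen during calibration. The uniform weights below are the simplest illustrative choice:

```python
import numpy as np

def decode_channels(raw_eeg: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Transfer function collapsing raw EEG of shape (channels, samples)
    into a single time-dependent signal of shape (samples,)."""
    return weights @ raw_eeg

raw = np.random.default_rng(0).standard_normal((8, 512))  # 8 electrodes, 512 samples
weights = np.full(8, 1 / 8)             # simplest transfer function: channel average
signal = decode_channels(raw, weights)  # input to the correlation step described above
```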
  • alternative embodiments of the computing device for initiating a control operation in response to a head gesture performed by user 110 may be based on any one of BCI headset 150, in-ear EEG device 200, and around-ear EEG device 300, respectively.
  • computing device 150 is similar to computing device 100 but comprises, in addition to processing means 103 and communications module 104, EEG sensors 101 arranged for contacting a skin of user 110 and at least one motion sensor 102 for measuring a movement of the head. Similar to what is described with reference to computing device 100 hereinbefore, computing device 150 is operative to acquire EEG sensor data from EEG sensors 101, detect a characteristic EEG data pattern in the acquired EEG data, acquire motion sensor data from motion sensor 102, detect a characteristic movement of the head, and initiate a control operation which is associated with the performed head gesture. The control operation may either control an operation of computing device 150, or the operation of a controlled device which is separate from computing device 150, such as changing a volume setting.
  • a further alternative embodiment 200 of the computing device for initiating a control operation in response to a head gesture performed by user 110 may be based on in-ear EEG device 200.
  • Computing device 200 is similar to computing device 100 but comprises, in addition to processing means 103 and communications module 104, EEG sensors 101 arranged for contacting a skin of user 110 inside the ear channel of ear 111, and at least one motion sensor 102 for measuring a movement of the head. Similar to what is described with reference to computing device 100 hereinbefore, computing device 200 is operative to acquire EEG sensor data from EEG sensors 101, detect a characteristic EEG data pattern in the acquired EEG data, acquire motion sensor data from motion sensor 102, detect a characteristic movement of the head, and initiate a control operation which is associated with the performed head gesture.
  • The control operation may either control an operation of computing device 200, or the operation of a controlled device which is separate from computing device 200, such as changing a volume setting.
  • yet a further embodiment 300 of the computing device for initiating a control operation in response to a head gesture performed by user 110 may be based on around-ear EEG device 300.
  • Computing device 300 is similar to computing device 100 but comprises, in addition to processing means 103 and communications module 104, EEG sensors 101 arranged for contacting a skin of user 110 around ear 111 of user 110, and at least one motion sensor 102 for measuring a movement of the head. Similar to what is described with reference to computing device 100 hereinbefore, computing device 300 is operative to acquire EEG sensor data from EEG sensors 101, detect a characteristic EEG data pattern in the acquired EEG data, acquire motion sensor data from motion sensor 102, detect a characteristic movement of the head, and initiate a control operation which is associated with the performed head gesture. The control operation may either control an operation of computing device 300, or the operation of a controlled device which is separate from computing device 300, such as changing a volume setting.
  • Embodiments of the invention which control operation of a controlled device which is separate from the computing device utilize communications module 104 for transmitting a control signal pertaining to the control operation to the controlled device.
  • the control signal may be transmitted by using any suitable protocol, e.g., the HyperText Transfer Protocol (HTTP), the Constrained Application Protocol (CoAP), or the like, and triggers the controlled device to effect the control operation.
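A sketch of transmitting such a control signal over HTTP is shown below; the endpoint path and JSON schema are invented for illustration and would depend on the controlled device (a CoAP stack would be used analogously for constrained devices):

```python
import requests  # third-party HTTP client

def send_control_signal(host: str, operation: str, value: int = 0) -> bool:
    """Transmit a control signal pertaining to the control operation to the
    controlled device, which then effects the operation."""
    response = requests.post(f"http://{host}/control",
                             json={"operation": operation, "value": value},
                             timeout=2.0)
    return response.ok

# e.g., after a forward head tilt in volume-control mode:
# send_control_signal("192.168.0.17", "volume_up", 4)
```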
  • the controlled device may be any type of computing device comprising a communications module which is operative to receive a control signal from an embodiment of the computing device for initiating a control operation in response to a head gesture performed by user 110, including, e.g., mobile phones, smartphones, tablets, personal computers, laptops, smartwatches, wearables, digital cameras, household appliances, televisions, and vehicles, such as a car or a UAV.
  • the controlled device may, e.g., be smartphone 100 described with reference to Fig. 1.
  • an embodiment of the computing device for initiating a control operation in response to a head gesture performed by user 110, such as BCI headset 150, in-ear EEG device 200, or around-ear EEG device 300, may be used for controlling an operation of smartphone 100 through a combination of a mental gesture and a head gesture, as is described hereinbefore.
  • processing means 103 comprised in an embodiment 100, 150, 200, or 300, of the computing device for initiating a control operation in response to a head gesture performed by user 110 (hereinafter referred to as 100-300), respectively, are described with reference to Figs. 6 and 7.
  • a first embodiment 600 of processing means 103 is shown in Fig. 6.
  • Processing means 600 comprises a processing unit 602, such as a general purpose processor, and a computer-readable storage medium 603, such as a Random Access Memory (RAM), a Flash memory, or the like.
  • processing means 600 comprises one or more interfaces 601 ("I/O" in Fig. 6) for controlling and/or receiving information from other components comprised in computing device 100-300, such as EEG sensors 101, motion sensor 102, communications module 104, loudspeaker/earphones 105, and display 106.
  • interface(s) 601 may be operative to acquire EEG sensor data and motion sensor data, either from built-in EEG sensors 101 and motion sensor 102, or from external EEG sensors 101 and motion sensors 102, via communications module 104.
  • the acquired EEG sensor data and motion sensor data may either be received as analog signals, which are digitized in processing means 600 for subsequent processing, or in a digital format.
  • Memory 603 contains computer-executable instructions 604, i.e., a computer program or software, for computing device 100-300 to become operative to perform in accordance with embodiments of the invention as described herein, when computer-executable instructions 604 are executed on processing unit 602.
  • processing means 700 comprises one or more interfaces 701 ("I/O" in Fig. 7) for controlling and/or receiving information from other components comprised in computing device 100-300, such as EEG sensors 101, motion sensor 102, communications module 104, loudspeaker/earphones 105, and display 106.
  • interface(s) 701 may be operative to acquire EEG sensor data and motion sensor data, either from built-in EEG sensors 101 and motion sensor 102, or from external EEG sensors 101 and motion sensors 102, via communications module 104.
  • the acquired EEG sensor data and motion sensor data may either be received as analog signals, which are digitized in processing means 700 for subsequent processing, or in a digital format.
  • Processing means 700 further comprises an EEG module 702, a motion module 703, a control module 704, and, optionally, a notification module 705, which are configured to cause computing device 100-300 to perform in accordance with embodiments of the invention as described herein.
  • EEG module 702 is configured to cause computing device 100-300 to acquire EEG sensor data from EEG sensors 101 contacting a skin of user 110 and detect a characteristic EEG data pattern in the acquired EEG data.
  • Motion module 703 is configured to acquire motion sensor data from motion sensor 102 attached to the head of user 110 and detect a characteristic movement of the head.
  • Control module 704 is configured to initiate a control operation which is associated with the performed head gesture. The control operation may, e.g., control an operation of computing device 100-300, or that of a controlled device which is separate from computing device 100-300. In the latter case, control module 704 is configured to initiate the control operation by transmitting, via communications module 104, a control signal pertaining to the control operation to the controlled device.
  • EEG module 702 and motion module 703 may be configured to acquire the EEG sensor data and the motion sensor data by receiving, via communications module 104, the EEG sensor data and the motion sensor data from EEG sensors 101 and motion sensor 102, respectively.
  • EEG module 702 may further be configured to derive the characteristic EEG data pattern during a learning or calibration phase.
  • motion module 703 may be configured to acquire the motion sensor data in response to detecting the characteristic EEG pattern.
  • Optional notification module 705 may be configured to notify user 110 in response to detecting the characteristic EEG pattern.
  • Interfaces 601 and 701, and modules 702-705, as well as any additional modules comprised in processing means 700, may be implemented by analogue electronic circuitry, digital electronic circuitry, or processing means executing a suitable computer program, i.e., software.
  • Method 800 is performed by a computing device such as a mobile phone, a smartphone, a tablet, a personal computer, a laptop, a smartwatch, a wearable, a digital camera, a television, or a vehicle.
  • Method 800 comprises acquiring 801 EEG sensor data from EEG sensors contacting a skin of the user and detecting 802 a characteristic EEG data pattern in the acquired EEG data.
  • Method 800 further comprises acquiring 804 motion sensor data from at least one motion sensor attached to the head and detecting 805 a characteristic movement of the head.
  • Method 800 further comprises initiating 806 a control operation which is associated with the performed head gesture.
  • the control operation may, e.g., control an operation of the computing device.
  • the control operation may control an operation of a controlled device which is separate from the computing device.
  • the initiating 806 of the control operation comprises transmitting, via a communications module comprised in the computing device, a control signal pertaining to the control operation to the controlled device.
  • the control operation may, e.g., comprise any one of initiating a communication session, answering a request for establishing a communication session, terminating a communication session, starting execution of an application, stopping execution of an application, changing a setting, controlling rendering of media, or controlling a user-interface of the computing device or the controlled device.
  • the EEG sensor data and the motion sensor data may, e.g., be acquired 801/804 from EEG sensors and at least one motion sensor comprised in the computing device.
  • the EEG sensor data and the motion sensor data may be acquired by receiving, via a communications module comprised in the computing device, the EEG sensor data and the motion sensor data from the EEG sensors and the at least one motion sensor, respectively.
  • the motion sensor data is only acquired 804 if the characteristic EEG pattern is detected, as is illustrated in Fig. 8.
  • the motion sensor data may alternatively be acquired concurrently with the EEG sensor data.
  • method 800 may further comprise notifying 803 the user that the characteristic EEG pattern has been detected.
  • method 800 may further comprise deriving a measure which is related to the performed head gesture from the acquired motion sensor data, the initiated control operation being dependent on the derived measure.
  • method 800 may comprise additional, or modified, steps in accordance with what is described throughout this disclosure.
  • An embodiment of method 800 may be implemented as software, such as computer program 604, to be executed by a processing unit comprised in the computing device, whereby the computing device becomes operative to perform in accordance with embodiments of the invention described herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Dermatology (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to the invention, a computing device (100, 150) initiates a control operation in response to a head gesture (112-114) performed by a user (110) of the computing device. The computing device is operative to acquire electroencephalogram (EEG) sensor data from EEG sensors (101) contacting a skin of the user, detect a characteristic EEG data pattern in the acquired EEG data, the pattern being associated with a control mode, acquire motion sensor data from at least one motion sensor (102) attached to the head, detect a characteristic movement of the head, the movement being commensurate with the performed head gesture, and initiate a control operation which is associated with the performed head gesture. The detected characteristic EEG pattern is commensurate with a specific mental gesture imagined by the user. The control operation may, e.g., relate to controlling (121-124) media play-out.
PCT/EP2017/052515 2017-02-06 2017-02-06 Initiating a control operation in response to a head gesture Ceased WO2018141409A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2017/052515 WO2018141409A1 (fr) 2017-02-06 2017-02-06 Initiating a control operation in response to a head gesture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2017/052515 WO2018141409A1 (fr) 2017-02-06 2017-02-06 Initiating a control operation in response to a head gesture

Publications (1)

Publication Number Publication Date
WO2018141409A1 true WO2018141409A1 (fr) 2018-08-09

Family

ID=57965953

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/052515 Ceased WO2018141409A1 (fr) 2017-02-06 2017-02-06 Initiating a control operation in response to a head gesture

Country Status (1)

Country Link
WO (1) WO2018141409A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2428869A1 (fr) * 2010-09-13 2012-03-14 Sony Ericsson Mobile Communications AB Control of a mobile communication device based on head movement
US9449446B1 (en) * 2012-05-27 2016-09-20 Make Ideas, LLC System employing a plurality of brain/body-generated inputs to control the multi-action operation of a controllable device
US20150261298A1 (en) * 2014-03-15 2015-09-17 Microsoft Corporation Trainable sensor-based gesture recognition
US20160077547A1 (en) * 2014-09-11 2016-03-17 Interaxon Inc. System and method for enhanced training using a virtual reality environment and bio-signal data
US20170017080A1 (en) * 2015-07-17 2017-01-19 Azubuike Victor Onwuta Think and Zoom

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
A. F. Jackson and D. J. Bolger, "The neurophysiological bases of EEG and EEG measurement: a review for the rest of us", Psychophysiology, vol. 51, pages 1061-1071, Wiley, 2014
B. Mirkovic, M. G. Bleichner, M. De Vos, and S. Debener, "Target Speaker Detection with Concealed EEG Around the Ear", Frontiers in Neuroscience, vol. 10, 2016
C. Vidaurre, C. Sannelli, K.-R. Müller, and B. Blankertz, "Machine-Learning-Based Coadaptive Calibration for Brain-Computer Interfaces", Neural Computation, vol. 23, pages 791-816, MIT Press Journals, 2011
N. Merrill, M. T. Curran, J.-K. Yang, and J. Chuang, "Classifying mental gestures with in-ear EEG", 2016 IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pages 130-135, IEEE, 14 June 2016, XP032925830, DOI: 10.1109/BSN.2016.7516246 *
N. Carlini, P. Mishra, T. Vaidya, Y. Zhang, M. Sherr, C. Shields, D. Wagner, and W. Zhou, "Hidden Voice Commands", Proceedings of the 25th USENIX Security Symposium, pages 513-530, USENIX Association, 2016
N. Merrill, M. T. Curran, J.-K. Yang, and J. Chuang, "Classifying Mental Gestures with In-Ear EEG", IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pages 130-135, IEEE, 2016
S. Young Rojahn, "Samsung Demos a Tablet Controlled by Your Brain", MIT Technology Review, Biomedicine, 19 April 2013

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111831111A (zh) * 2019-04-16 2020-10-27 Silicon Valley Intervention Inc. Electroencephalograph with artificial intelligence as a control device
CN110502103A (zh) * 2019-05-29 2019-11-26 Academy of Military Medical Sciences of the PLA Academy of Military Sciences Brain-controlled unmanned aerial vehicle system based on a brain-computer interface, and control method therefor
CN110989832A (zh) * 2019-11-21 2020-04-10 Vivo Mobile Communication Co., Ltd. Control method and electronic device
CN111290579A (zh) * 2020-02-10 2020-06-16 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Virtual content control method and apparatus, electronic device, and computer-readable medium
CN112363659A (zh) * 2020-11-09 2021-02-12 Ping An Puhui Enterprise Management Co., Ltd. App interface operation method and apparatus, electronic device, and storage medium
CN114167984A (zh) * 2021-01-28 2022-03-11 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Device control method and apparatus, storage medium, and electronic device
CN114167984B (zh) * 2021-01-28 2024-03-12 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Device control method and apparatus, storage medium, and electronic device
CN113009931A (zh) * 2021-03-08 2021-06-22 Beijing University of Posts and Telecommunications Cooperative control device and method for mixed manned/unmanned aerial vehicle formations
CN113009931B (zh) * 2021-03-08 2022-11-08 Beijing University of Posts and Telecommunications Cooperative control device and method for mixed manned/unmanned aerial vehicle formations

Similar Documents

Publication Publication Date Title
WO2018141409A1 (fr) Initiating a control operation in response to a head gesture
US10366778B2 (en) Method and device for processing content based on bio-signals
KR102080747B1 (ko) Mobile terminal and control method thereof
EP2652578B1 (fr) Correlation of biosignals with operating modes of an apparatus
US9848796B2 (en) Method and apparatus for controlling media play device
CN111742361B (zh) Method for a terminal to update the wake-up voice of a voice assistant, and terminal
US9374647B2 (en) Method and apparatus using head movement for user interface
KR20180102871A (ko) Mobile terminal and vehicle control method of the mobile terminal
JP6121618B2 (ja) Firing control method, device, system, program, and recording medium
US12477265B2 (en) Portable audio device
US11902091B2 (en) Adapting a device to a user based on user emotional state
US9543918B1 (en) Configuring notification intensity level using device sensors
CN107613131A (zh) Application do-not-disturb method and mobile terminal
CN108683968A (zh) Display control method and related product
US11294449B2 (en) Multipoint sensor system for efficient power consumption
US12483843B2 (en) Context-based situational awareness for hearing instruments
US20210149483A1 (en) Selective image capture based on multi-modal sensor input
US9495017B2 (en) Computing systems for peripheral control
CN109144263A (zh) Social assistance method and apparatus, storage medium, and wearable device
CN109343710A (zh) Message reply method and apparatus
US11164602B1 (en) Detecting loss of attention during playing of media content in a personal electronic device
WO2023216930A1 (fr) Wearable-device-based vibration feedback method and system, wearable device, and electronic device
KR101689713B1 (ko) Portable terminal and operation method thereof
KR101696720B1 (ko) Portable terminal and operation method thereof
EP4312436A1 (fr) Operating modes for sharing earphones

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17703418

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17703418

Country of ref document: EP

Kind code of ref document: A1