
WO2018141409A1 - Initiating a control operation in response to a head gesture - Google Patents

Initiating a control operation in response to a head gesture

Info

Publication number
WO2018141409A1
Authority
WO
WIPO (PCT)
Prior art keywords
eeg
computing device
head
sensor data
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2017/052515
Other languages
French (fr)
Inventor
Matthew John LAWRENSON
Jan Jasper VAN DEN BERG
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Telefonaktiebolaget LM Ericsson AB
Original Assignee
Telefonaktiebolaget LM Ericsson AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Telefonaktiebolaget LM Ericsson AB filed Critical Telefonaktiebolaget LM Ericsson AB
Priority to PCT/EP2017/052515
Publication of WO2018141409A1
Anticipated expiration
Status: Ceased


Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038 Indexing scheme relating to G06F3/038
    • G06F2203/0381 Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer

Definitions

  • the invention relates to a computing device for initiating a control operation in response to a head gesture performed by a user of the computing device, a method of initiating a control operation in response to a head gesture performed by a user of a computing device, a corresponding computer program, and a corresponding computer program product.
  • touch control is the primary interaction method for most users and is used to control the operation of, and interact with, various types of computing devices.
  • mobile computing devices such as smartphones and tablets, which users control through buttons or a touchscreen displaying a graphical user-interface.
  • the possibility to interact with the touchscreen is typically disabled when the device is not intentionally being used, in order to avoid input of unintentional commands.
  • touchscreen-based devices are oftentimes able to detect that they are stored in a pocket, or held to a person's ear, and the touchscreen is disabled accordingly.
  • Known touch-control solutions typically require a user to touch a touchscreen several times, or to press several buttons, in order to activate a function. In addition, they oftentimes require the user to be looking at the screen. This prevents or impedes control of the computing devices when the devices are stored in a pocket, or in situations where the user's vision or hands are occupied, such as when the user is riding a bike or driving a car.
  • Voice control solutions offer an alternative way of interacting with computing devices, and are based on voice recognition software to interpret commands spoken by a user. These solutions suffer from the disadvantage that a user first needs to activate the voice control functionality of the device. For example, Apple's Siri is activated by pressing the home button for two seconds. Alternative voice activation techniques may require the user to utter a key phrase, such as "OK Google" or "Alexa", in order to activate the voice control function of a computing device.
  • Voice control may also constitute a security risk, as voice control commands may be mimicked with malicious intent, e.g., by giving voice commands to a device through hidden audio embedded in a YouTube video (see, e.g., "Hidden Voice Commands", by N. Carlini, P. Mishra, T. Vaidya, Y. Zhang, M. Sherr, C. Shields, D. Wagner, and W. Zhou, Proceedings of the 25th USENIX Security Symposium, pages 513-530, USENIX Association, 2016).
  • many people do not like to talk to devices in public, partly due to privacy concerns arising from speaking commands aloud.
  • a computing device for initiating a control operation.
  • the control operation is initiated in response to a head gesture performed by a user of the computing device.
  • the computing device comprises processing means operative to acquire EEG sensor data from EEG sensors contacting a skin of the user, and to detect a characteristic EEG data pattern in the acquired EEG data.
  • the characteristic EEG data pattern is associated with a control mode.
  • the processing means is further operative to acquire motion sensor data from at least one motion sensor attached to the head, and to detect a characteristic movement of the head. The characteristic movement is commensurate with the performed head gesture.
  • the processing means is further operative to initiate a control operation which is associated with the performed head gesture.
  • a method of initiating a control operation is provided.
  • the control operation is initiated in response to a head gesture performed by a user of a computing device.
  • the method comprises acquiring EEG sensor data from EEG sensors contacting a skin of the user, and detecting a characteristic EEG data pattern in the acquired EEG data.
  • the characteristic EEG data pattern is associated with a control mode.
  • the method further comprises acquiring motion sensor data from at least one motion sensor attached to the head, and detecting a characteristic movement of the head. The characteristic movement is commensurate with the performed head gesture.
  • the method further comprises initiating a control operation which is associated with the performed head gesture.
  • a computer program comprises computer-executable instructions for causing a device to perform the method according to an embodiment of the second aspect of the invention, when the computer-executable instructions are executed on a processing unit comprised in the device.
  • a computer program product comprises a computer-readable storage medium which has the computer program according to the third aspect of the invention embodied therein.
  • the invention makes use of an understanding that an improved solution for enabling a user of a computing device to control an operation of the computing device, or a controlled device which is separate from the computing device, may be achieved by initiating a control operation in dependence on a combination of a mental gesture imagined by the user and a head gesture performed by the user.
  • the detected characteristic EEG pattern is commensurate with the mental gesture imagined by the user, and the characteristic movement of the head is commensurate with the head gesture performed by the user.
  • the user may control an operation or function of the computing device, or a controlled device, by imagining a specific mental gesture to enter a control mode which enables the user to initiate a control operation using head gestures, e.g., a change in orientation of the head and/or a rotation of the head.
  • the mental gesture imagined by the user may, e.g., be a spinning cube, a face, or similar.
  • the mental gesture which is associated with the control mode is selected so as to be easily detectable.
  • several control modes may be defined, corresponding to different control operations such as changing a volume setting and changing a brightness setting, respectively.
  • each control mode is associated with a different mental gesture and a corresponding characteristic EEG data pattern. For instance, the user may activate volume control by thinking about a cube, whereas brightness control is activated by thinking about a face.
  • embodiments of the invention relieve the user from retrieving the computing device from a pocket and/or touching the computing device.
  • a more reliable control of an operation of the computing device, or a controlled device, is achieved in comparison with solutions relying on EEG data alone.
  • embodiments of the invention provide improved security over solutions relying on voice control, as it is more difficult for an adversary to initiate a control operation.
  • embodiments of the invention are advantageous in situations where privacy and etiquette are a concern, and, as no audible input is required, the technique is insensitive to acoustically noisy environments.
  • control operation may, e.g., relate to a communication session, such as a voice call, a video call, a chat session, or the like.
  • control operation may relate to execution of a software application or app, to rendering of media, or to a user-interface of the computing device or the controlled device.
  • the computing device, or the controlled device may, e.g., be a mobile phone, a smartphone, a tablet, a personal computer, a laptop, a smartwatch, a wearable, a digital camera, or a television.
  • control operation may effect changing a volume of media play-out, changing a brightness of a display, scrolling a page, moving a cursor, changing a setting associated with the device or a software application, answering an incoming call, and so forth.
  • the computing device or the controlled device may also be a vehicle, such as a car or an Unmanned Aerial Vehicle (UAV), aka drone, and the control operation may relate to controlling the vehicle, e.g., steering the vehicle.
  • the characteristic EEG data pattern is derived during a calibration phase.
  • one or more specific mental gestures may be identified which can be reliably detected in the acquired EEG sensor data.
  • the characteristic EEG patterns which are triggered by certain mental gestures vary between individuals and are preferably learned for each user.
  • a comparison of acquired EEG sensor data to reference EEG sensor data which was acquired during the calibration phase is subsequently used for classifying the acquired EEG sensor data, in particular to detect whether the user has imagined a specific mental gesture which is associated with the control mode.
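  • As an illustration of how such reference data might be recorded, the following Python sketch averages several calibration trials into one stored reference; the callback name, trial count, and averaging strategy are assumptions for the example, not taken from the patent.

```python
import numpy as np

def record_reference_eeg(acquire_eeg_window, n_trials=10):
    """Record reference EEG sensor data while the user repeatedly
    imagines the mental gesture associated with the control mode.
    `acquire_eeg_window` is a hypothetical callback returning one
    equal-length trial of EEG samples as a 1-D array."""
    trials = [acquire_eeg_window() for _ in range(n_trials)]
    # Averaging over trials suppresses uncorrelated noise; the result
    # is stored as the reference EEG sensor data for later comparison.
    return np.mean(np.stack(trials), axis=0)
```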
  • the user is notified in response to detecting the characteristic EEG pattern.
  • the computing device may emit an audible signal, display a visual notification, or provide the user with a haptic feedback. Thereby the user is notified, or alerted, that the computing device has successfully detected the characteristic EEG pattern and recognized the user's intention to enter the control mode.
  • the computing device further comprises EEG sensors arranged for contacting a skin of the user, and at least one motion sensor for measuring a movement of the head. That is, the EEG sensors and the motion sensors are provided with the computing device.
  • the computing device may, e.g., be a BCI headset, an in-ear EEG device, or an around-ear EEG device.
  • the computing device further comprises a communications module, and the EEG sensor data and the motion sensor data are acquired by receiving the EEG sensor data and the motion sensor data, via the communications module, from the EEG sensors and the at least one motion sensor, respectively.
  • the computing device may receive the EEG sensor data and the motion sensor data from a BCI headset, an in-ear EEG device, or an around-ear EEG device.
  • Fig. 1 shows computing devices for initiating a control operation in response to a head gesture performed by a user of the computing device, in accordance with embodiments of the invention.
  • Fig. 2 shows a computing device for initiating a control operation in response to a head gesture performed by a user of the computing device, in accordance with another embodiment of the invention.
  • Fig. 3 shows a computing device for initiating a control operation in response to a head gesture performed by a user of the computing device, in accordance with a further embodiment of the invention.
  • Fig. 4 illustrates answering an incoming call, as an example of a control operation initiated by an embodiment of the invention.
  • Fig. 5 illustrates scrolling a page, as another example of a control operation initiated by an embodiment of the invention.
  • Fig. 6 shows an embodiment of the processing means comprised in the computing device for initiating a control operation in response to a head gesture performed by a user of the computing device.
  • Fig. 7 shows another embodiment of the processing means comprised in the computing device for initiating a control operation in response to a head gesture performed by a user of the computing device.
  • Fig. 8 shows a method of initiating a control operation in response to a head gesture performed by a user of a computing device, in accordance with embodiments of the invention.
  • an embodiment 100 of the computing device for initiating a control operation in response to a head gesture performed by a user 110 of the computing device is illustrated as a tablet, a smartphone, or a phablet (a device which is intermediate in size between that of a smartphone and that of a tablet).
  • Computing device 100 is illustrated to comprise a processing means 103, a communications module 104, and a display 106, e.g., a touchscreen. It will be appreciated that computing device 100 may further comprise additional components, e.g., a microphone, a loudspeaker 105, or the like.
  • Communications module 104 is operative to effect wireless communications. Communications module 104 may further be operative to effect wireless communications with a Radio Access Network (RAN) or with another compatible device, based on a cellular telecommunications technique such as the Global System for Mobile communications (GSM), Universal Mobile Telecommunications System (UMTS), Long Term Evolution (LTE), or any 5G standard, e.g., Next Generation (NG) or New Radio (NR).
  • Computing device 100 is operative to acquire EEG sensor data from a set of EEG sensors 101 contacting a skin of user 110.
  • EEG is a technique which can be used for detecting a subject's brain activity by placing sensors, i.e., electrodes, on the subject's scalp or other parts of the subject's head, e.g., within the ear channel and around the ear ("The neurophysiological bases of EEG and EEG measurement: a review for the rest of us", by A. F. Jackson and D. J. Bolger, Psychophysiology, vol. 51, pages 1061-1071, Wiley, 2014).
  • Electrodes are used for measuring small electric potentials which are generated by action potentials of firing neurons, which are electrochemical excitations caused by the creation of an ion current in the cell's axon to activate connected cells through the synapses.
  • while the most common method to capture EEG signals is by placing the electrodes directly on the scalp of the subject, similar to what is illustrated in Fig. 1, it has recently been demonstrated that EEG signals from within the subject's ear channel can be detected and reliably classified ("Classifying Mental Gestures with In-Ear EEG", by N. Merrill, M. T. Curran, J.-K. Yang, and J. Chuang, IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pages 130-135, IEEE, 2016).
  • the EEG sensor data may be acquired from a BCI headset 150 which user 110 is wearing, comprising EEG sensors 101, i.e., electrodes, which are arranged for contacting a scalp and/or forehead of user 110 and capturing nerve signals from user 110.
  • the EEG sensor data may, e.g., be acquired by receiving the EEG sensor data from BCI headset 150 via communications module 104 and a compatible communications module 104 comprised in BCI headset 150.
  • Computing device 100 is further operative to detect a characteristic EEG data pattern in the acquired EEG data.
  • the characteristic EEG data pattern is associated with a control mode which allows user 110 to control computing device 100, or a separate controlled device, using head gestures.
  • the detected characteristic EEG pattern is commensurate with a specific mental gesture imagined by user 110, such as a spinning cube, a face, or the like. This specific mental gesture is preferably determined during a learning or calibration phase, as the ability to reliably detect mental gestures imagined by an individual varies between subjects.
  • EEG sensor data is acquired from EEG sensors 101 while user 110 is imagining a mental gesture which is selected to activate the control mode of computing device 100, i.e., which allows user 110 to control computing device 100, or a separate controlled device, by means of head gestures.
  • the thereby acquired EEG sensor data is stored as reference EEG sensor data.
  • the characteristic EEG data pattern is detected by comparing EEG sensor data acquired from EEG sensors 101 to the stored reference EEG sensor data, and determining whether a similarity condition between the two sets of EEG sensor data is fulfilled. In practice, this may be achieved by calculating a correlation between the two data sets and comparing the calculated correlation to a threshold value.
  • correlation is a statistical relationship which reflects the extent to which two random variables, such as the acquired EEG sensor data and stored reference EEG sensor data, are related with each other.
  • the correlation between two random variables is commonly referred to as cross-correlation and can be quantified by means of a correlation function, which can be expressed as an integral over the two random variables over time.
  • correlation functions are normalized such that a perfect correlation between the two random variables, i.e., the two random variables are identical, results in a maximum value which oftentimes is chosen to be equal to one ("1").
  • the correlation of two completely independent random variables yields a correlation value of zero ("0").
  • An example is the well-known Pearson product-moment correlation coefficient.
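  • A minimal sketch of this detection step, assuming windows of equal length and an illustrative threshold value (the patent does not specify one):

```python
import numpy as np

def characteristic_pattern_detected(acquired, reference, threshold=0.7):
    """Detect the characteristic EEG data pattern by computing the
    Pearson product-moment correlation coefficient between the
    acquired window and the stored reference, and comparing it to a
    threshold value."""
    r = np.corrcoef(acquired, reference)[0, 1]  # Pearson r in [-1, 1]
    return r >= threshold
```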
  • a more reliable detection of the characteristic EEG pattern in the acquired EEG sensor data may be achieved by utilizing a machine-learning algorithm which is trained during a calibration or learning phase, and using the trained machine-learning algorithm for classifying EEG sensor data which is acquired during normal operation of computing device 100 ("Machine-Learning-Based Coadaptive Calibration for Brain-Computer Interfaces", by C. Vidaurre, C. Sannelli, K.-R. Müller, and B. Blankertz, Neural Computation, vol. 23, pages 791-816, MIT Press Journals, 2011).
  • the characteristic EEG pattern may be detected by means of a Support Vector Classifier (SVC), a technique which has been demonstrated to work reliably even with in-ear EEG ("Classifying Mental Gestures with In-Ear EEG", by N. Merrill, M. T. Curran, J.-K. Yang, and J. Chuang, IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pages 130-135, IEEE, 2016).
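  • Merrill et al. used an SVC for this purpose; below is a sketch of one possible realization with scikit-learn. The feature extraction, file names, and label convention are assumptions for the example, not taken from the patent.

```python
import numpy as np
from sklearn.svm import SVC

# Feature vectors (e.g., per-channel band powers) and labels collected
# during the calibration phase; 1 marks the control-mode mental gesture.
X = np.load("eeg_features.npy")   # hypothetical calibration data
y = np.load("eeg_labels.npy")

clf = SVC(kernel="rbf")           # Support Vector Classifier
clf.fit(X, y)

def is_control_gesture(feature_vector):
    # Classify one EEG feature vector acquired during normal operation.
    return clf.predict(feature_vector.reshape(1, -1))[0] == 1
```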
  • Computing device 100 is further operative to acquire motion sensor data from at least one motion sensor 102 attached to the head, and detect a characteristic movement of the head which is commensurate with the performed head gesture.
  • the motion sensor data may, e.g., be acquired by receiving the motion sensor data from BCI headset 150, via communications module 104 and a compatible communications module 104 comprised in BCI headset 150.
  • Motion sensor 102 may, e.g., comprise one or more inertial sensors, accelerometers, gyroscopes, or any combination thereof.
  • accelerometers of the type which are provided with today's mobile phones, smartphones, and tablets, can be used to detect the orientation of the device.
  • the gyroscope adds an additional dimension to the information supplied by the accelerometer by tracking a rotation or twist of the device. More specifically, an accelerometer measures a linear acceleration, whereas a gyroscope measures an angular rotational velocity.
  • by combining accelerometer and gyroscope data, a change in orientation and/or rotation of a device can be reliably determined.
  • the accuracy of the acquired motion sensor data may be improved by employing differential measurements between at least two motion sensors 102 which are attached at separate locations on the head of user 110, e.g., one motion sensor 102 on the left side of the head and one motion sensor 102 on the right side of the head, e.g., close to the ears, as sketched below.
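  • One way such a differential measurement could be formed is sketched below: whole-body motion (walking, riding in a vehicle) appears nearly identically in both sensors and largely cancels, while head rotations appear with opposite sign and are emphasized. Sensor alignment and calibration are ignored in this sketch.

```python
import numpy as np

def differential_head_motion(left_samples, right_samples):
    """Subtract the readings of two motion sensors mounted on opposite
    sides of the head to suppress common-mode motion and emphasize
    head rotation."""
    return np.asarray(left_samples) - np.asarray(right_samples)
```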
  • motion sensor 102 comprised in BCI headset 150 which is worn by user 110 can be used to determine whether user 110 is performing a specific head gesture which, e.g., involves tilting the head backwards or forwards 112, tilting the head left or right 113, rotating the head clockwise or counter-clockwise 114, or any combination thereof. If performed in combination, the different movements may either be performed simultaneously or sequentially. For instance, a head gesture may involve tilting the head forwards followed by a clockwise rotation of the head.
  • the detection of a characteristic movement of the head which is commensurate with the performed head gesture is achieved by comparing the acquired motion sensor data to reference motion sensor data, and concluding that the characteristic movement of the head is detected if the acquired motion sensor data and the reference motion sensor data fulfil a similarity condition.
  • the reference motion sensor data may, e.g., be obtained and stored by computing device 100 during a learning or calibration phase, by acquiring motion sensor data from motion sensor 102 while user 110 performs a head gesture which he or she wishes to use for initiating an associated control operation.
  • different sets of reference motion sensor data corresponding to multiple head gestures may be obtained and stored by computing device 100 so as to enable user 110 to initiate different control operations, as is described further below.
  • the comparison of the acquired motion sensor data and the reference motion sensor data may, e.g., be performed by calculating a correlation between the two data sets, as is described hereinbefore, and comparing the calculated correlation to a threshold value, as sketched below.
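  • A sketch of this matching step, extended to several stored head gestures; the gesture names and the threshold value are illustrative assumptions:

```python
import numpy as np

def match_head_gesture(acquired, references, threshold=0.6):
    """Return the name of the stored reference gesture whose motion
    sensor data correlates best with the acquired data, or None if no
    reference fulfils the similarity condition. `references` maps
    gesture names (e.g., "tilt_forward") to 1-D sample arrays of the
    same length as `acquired`."""
    best_name, best_r = None, threshold
    for name, ref in references.items():
        r = np.corrcoef(acquired, ref)[0, 1]
        if r > best_r:
            best_name, best_r = name, r
    return best_name
```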
  • Computing device 100 is further operative to initiate a control operation which is associated with the performed head gesture.
  • the control operation may control an operation of computing device 100, or an operation of a controlled device which is separate from computing device 100, as is described in further detail below.
  • the control operation may, e.g., comprise controlling rendering or play-out of media, such as music or video, as is illustrated in Fig. 1 by a media control center 120 which is displayed by computing device 100.
  • a certain head gesture may initiate changing a media track, such as a song (e.g., "Born to run"), which is currently rendered by computing device 100 and/or played-out using loudspeaker 105 or earphones 105 operatively connected to computing device 100.
  • the initiated control operation has a similar effect as utilizing a user-interface element 121 or 122 for skipping to the next or the preceding track, respectively.
  • the performed head gesture may initiate increasing or decreasing a volume of audio which is currently played-out by computing device 100, similar to utilizing a user-interface element 123 or 124 for increasing or decreasing the volume, respectively.
  • each head gesture being commensurate with a detectable characteristic movement of the head, may be associated with different control operations. For instance, tilting the head forward may increase the volume, whereas tilting the head backward decreases the volume. Likewise, tilting the head to the right may skip to the next track, whereas tilting the head to the left skips to a preceding track.
  • control operation may comprise initiating a communication session, such as a voice call, a video call, a chat, or any other type of communication session received by computing device 100 or a separate controlled device.
  • control operation may comprise answering a request for establishing a communication session, or terminating a communication session.
  • computing device 100 may be operative to answer an incoming call in response to a first head gesture, such as nodding the head in a way a person would do to indicate confirmation or approval (a silent "Yes"), instead of touching a user-interface element 421.
  • computing device 100 may be operative to reject or deny an incoming call in response to a second head gesture, such as shaking the head in a way a person would do to indicate disagreement (a silent "No"), instead of touching a user-interface element.
  • the control operation may comprise controlling a user-interface of computing device 100, or a separate controlled device.
  • computing device 100 may be operative to scroll a displayed page 520 upwards in response to a first head gesture, such as tilting the head backward, instead of touching a user-interface element 521, and to scroll page 520 downwards in response to a second head gesture, such as tilting the head forward, instead of touching a user-interface element 522.
  • computing device 100 may be operative to move a cursor of a user-interface, or a focus selecting a user-interface element, in response to a head gesture.
  • the control operation may alternatively relate to any other operation or function of computing device 100, or a separate controlled device, and may, e.g., comprise starting execution of an application, i.e., a software application or app, stopping execution of an application, changing a setting of the computing device, a controlled device, or an application, and so forth.
  • one or more head gestures in combination with a mental gesture may be used for adjusting the brightness of display 106, or waking computing device 100 or a separate controlled device from sleep-mode.
  • Computing device 100 may optionally be operative to derive a measure which is related to the performed head gesture from the acquired motion sensor data, such as the distance, or angle, travelled when tilting or rotating the head.
  • the initiated control operation may additionally be dependent on the derived measure, as sketched below. For instance, if the control mode allows user 110 to control the volume of audio played-out by computing device 100, the change in volume which is effected by initiating the control operation may be dependent on the derived measure, e.g., the angle by which the head is tilted forwards or backwards.
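  • A sketch of how such a measure could be derived and applied, assuming gyroscope samples in degrees per second and an illustrative gain; neither is specified by the patent:

```python
def volume_delta_from_tilt(gyro_samples, dt, gain=0.5):
    """Integrate the angular velocity reported by the gyroscope over
    the gesture to estimate the tilt angle of the head, then scale
    the volume change by that angle (signed, so tilting the other
    way lowers the volume)."""
    angle = sum(gyro_samples) * dt   # rectangular integration, degrees
    return gain * angle              # volume steps to apply
```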
  • Computing device 100 may optionally be operative to notify, or alert, user 110 in response to detecting the characteristic EEG pattern. For instance, computing device 100 may emit an audible signal, display a visual notification, or provide a haptic feedback, e.g., a vibration. Thereby, user 110 is notified that computing device 100 has successfully detected the mental gesture which is associated with the control mode, and has entered the control mode.
  • computing device 100 may be operative to acquire the motion sensor data in response to detecting the characteristic EEG pattern. This is advantageous in that motion sensor data is only acquired and analyzed when the intention of user 110 to enter the control mode has been detected. As an alternative, motion sensor data may be acquired concurrently with the EEG sensor data.
  • control operation is initiated in response to detecting the characteristic EEG data pattern in the acquired EEG data in combination with detecting the characteristic movement of the head.
  • several control modes may be defined, each corresponding to a different control operation, such as changing volume, changing brightness, changing a currently played track, and so forth.
  • Each control mode is associated with a specific mental gesture and a corresponding characteristic EEG data pattern. For instance, thinking about a cube may activate volume control, whereas thinking about a face activates brightness control.
  • a comparison between each one of a set of reference EEG sensor data and the acquired EEG sensor data is performed. Depending on which reference EEG sensor data best matches the acquired EEG sensor data, i.e., is most similar, the corresponding control mode is activated or entered.
  • the reference EEG sensor data which matches the acquired EEG sensor data best may, e.g., be determined by calculating a correlation between the acquired EEG sensor data and each reference EEG sensor data, and activating the control mode which corresponds to the reference EEG sensor data which has the highest correlation with the acquired EEG sensor data.
  • the control mode which the user intends to activate may alternatively be identified by machine-learning techniques. A sketch of dispatching the resulting control operation follows below.
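  • The resulting mapping from (control mode, head gesture) to control operation could take the form of a simple dispatch table, as in the sketch below; the mode names, gesture names, and device methods are made up for the example.

```python
# Hypothetical device API: dev.change_volume(step), dev.change_brightness(step).
CONTROL_TABLE = {
    ("volume", "tilt_forward"):      lambda dev: dev.change_volume(+1),
    ("volume", "tilt_backward"):     lambda dev: dev.change_volume(-1),
    ("brightness", "tilt_forward"):  lambda dev: dev.change_brightness(+1),
    ("brightness", "tilt_backward"): lambda dev: dev.change_brightness(-1),
}

def initiate_control_operation(device, mode, gesture):
    action = CONTROL_TABLE.get((mode, gesture))
    if action is not None:
        action(device)   # initiate the associated control operation
```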
  • If computing device 100 is operative to notify user 110 in response to detecting the characteristic EEG pattern, computing device 100 may further be operative to provide an indication of the activated control mode to user 110, e.g., whether volume control or brightness control is activated.
  • the EEG sensor data may be acquired from an in-ear EEG device 200, as is illustrated in Fig. 2.
  • In-ear EEG device 200 is designed for insertion into the ear channel of ear 111, similar to an earbud, and comprises EEG sensors or electrodes 101 arranged for contacting the skin inside the ear channel.
  • EEG sensor data captured by in-ear EEG devices and the reliable classification of the EEG sensor data using SVCs has recently been demonstrated ("Classifying Mental Gestures with In-Ear EEG", by N. Merrill, M. T. Curran, J.-K. Yang, and J. Chuang, IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pages 130-135, IEEE, 2016).
  • In-ear EEG devices are commercially available, e.g., "Aware" from United Sciences (http://efitaware.com/, retrieved on 1 December 2016).
  • In-ear EEG device 200 further comprises at least one motion sensor 102, similar to what is described with reference to motion sensor 102 comprised in computing device 100.
  • the EEG sensor data and the motion sensor data may be acquired from in-ear EEG device 200 via communications module 104 and a compatible communications module 104 comprised in in-ear EEG device 200.
  • the EEG sensor data may be acquired from an around-ear EEG device 300, as is illustrated in Fig. 3.
  • Around-ear EEG device 300 comprises EEG sensors or electrodes 101 which are arranged for contacting the skin around ear 111 of user 110.
  • the use of EEG sensor data from around-ear EEG devices has, e.g., been reported in "Target Speaker Detection with Concealed EEG Around the Ear", by B. Mirkovic et al.
  • Around-ear EEG device 300 further comprises at least one motion sensor 102, similar to what is described with reference to motion sensor 102 comprised in computing device 100.
  • the EEG sensor data and the motion sensor data may be acquired from around-ear EEG device 300 via communications module 104 and a compatible communications module 104 comprised in around-ear EEG device 300.
  • the raw EEG signals captured by the EEG sensors may need to be combined into EEG sensor data which is suitable for detecting the characteristic EEG data pattern.
  • This may, e.g., be achieved using a decoder which implements a transfer function for transforming the raw EEG signals captured by several EEG electrode channels into one time- dependent function.
  • the signal representing the one time-dependent function may, e.g., be used as input for calculating a correlation between the acquired EEG sensor data and the reference EEG sensor data.
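  • One simple form such a decoder could take is a linear spatial filter, sketched below; the weights are assumed, for the example, to be obtained during calibration.

```python
import numpy as np

def decode_channels(raw_eeg, weights):
    """Transform raw EEG signals captured by several electrode
    channels into one time-dependent signal by applying one weight
    per channel. `raw_eeg` has shape (channels, samples)."""
    return np.asarray(weights) @ np.asarray(raw_eeg)  # shape (samples,)
```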
  • alternative embodiments of the computing device for initiating a control operation in response to a head gesture performed by user 110 may be based on any one of BCI headset 150, in-ear EEG device 200, and around-ear EEG device 300, respectively.
  • computing device 150 is similar to computing device 100 but comprises, in addition to processing means 103 and communications module 104, EEG sensors 101 arranged for contacting a skin of user 110 and at least one motion sensor 102 for measuring a movement of the head. Similar to what is described with reference to computing device 100 hereinbefore, computing device 150 is operative to acquire EEG sensor data from EEG sensors 101, detect a characteristic EEG data pattern in the acquired EEG data, acquire motion sensor data from motion sensor 102, detect a characteristic movement of the head, and initiate a control operation which is associated with the performed head gesture. The control operation may either control an operation of computing device 150, or the operation of a controlled device which is separate from computing device 150, such as changing a volume setting.
  • a further alternative embodiment 200 of the computing device for initiating a control operation in response to a head gesture performed by user 1 10 may be based on in-ear EEG device 200.
  • Computing device 200 is similar to computing device 100 but comprises, in addition to processing means 103 and communications module 104, EEG sensors 101 arranged for contacting a skin of user 110 inside the ear channel of ear 111, and at least one motion sensor 102 for measuring a movement of the head. Similar to what is described with reference to computing device 100 hereinbefore, computing device 200 is operative to acquire EEG sensor data from EEG sensors 101, detect a characteristic EEG data pattern in the acquired EEG data, acquire motion sensor data from motion sensor 102, detect a characteristic movement of the head, and initiate a control operation which is associated with the performed head gesture. The control operation may either control an operation of computing device 200, or the operation of a controlled device which is separate from computing device 200, such as changing a volume setting.
  • yet a further embodiment 300 of the computing device for initiating a control operation in response to a head gesture performed by user 110 may be based on around-ear EEG device 300.
  • Computing device 300 is similar to computing device 100 but comprises, in addition to processing means 103 and communications module 104, EEG sensors 101 arranged for contacting a skin of user 110 around ear 111 of user 110, and at least one motion sensor 102 for measuring a movement of the head. Similar to what is described with reference to computing device 100 hereinbefore, computing device 300 is operative to acquire EEG sensor data from EEG sensors 101, detect a characteristic EEG data pattern in the acquired EEG data, acquire motion sensor data from motion sensor 102, detect a characteristic movement of the head, and initiate a control operation which is associated with the performed head gesture. The control operation may either control an operation of computing device 300, or the operation of a controlled device which is separate from computing device 300, such as changing a volume setting.
  • Embodiments of the invention which control an operation of a controlled device which is separate from the computing device utilize communications module 104 for transmitting a control signal pertaining to the control operation to the controlled device.
  • the control signal may be transmitted by using any suitable protocol, e.g., the HyperText Transfer Protocol (HTTP), the Constrained Application Protocol (CoAP), or the like, and triggers the controlled device to effect the control operation.
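  • A sketch of such a transmission over HTTP; the endpoint path and payload format are hypothetical, as the patent only requires that the control signal triggers the controlled device to effect the control operation.

```python
import requests  # CoAP would instead need a dedicated library such as aiocoap

def send_control_signal(controlled_device_url, operation):
    """Transmit a control signal pertaining to the control operation
    to the separate controlled device."""
    response = requests.post(controlled_device_url + "/control",
                             json={"operation": operation}, timeout=2.0)
    response.raise_for_status()

# Example call (hypothetical address):
# send_control_signal("http://192.168.0.17:8080", "volume_up")
```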
  • the controlled device may be any type of computing device comprising a communications module which is operative to receive a control signal from an embodiment of the computing device for initiating a control operation in response to a head gesture performed by user 110, including, e.g., mobile phones, smartphones, tablets, personal computers, laptops, smartwatches, wearables, digital cameras, household appliances, televisions, and vehicles, such as a car or a UAV.
  • the controlled device may, e.g., be smartphone 100 described with reference to Fig. 1.
  • an embodiment of the computing device for initiating a control operation in response to a head gesture performed by user 110, such as BCI headset 150, in-ear EEG device 200, or around-ear EEG device 300, may be used for controlling an operation of smartphone 100 through a combination of a mental gesture and a head gesture, as is described hereinbefore.
  • in the following, embodiments of processing means 103 comprised in an embodiment 100, 150, 200, or 300, of the computing device for initiating a control operation in response to a head gesture performed by user 110 (hereinafter referred to as 100-300), respectively, are described with reference to Figs. 6 and 7.
  • a first embodiment 600 of processing means 103 is shown in Fig. 6.
  • Processing means 600 comprises a processing unit 602, such as a general purpose processor, and a computer-readable storage medium 603, such as a Random Access Memory (RAM), a Flash memory, or the like.
  • processing means 600 comprises one or more interfaces 601 ("I/O" in Fig. 6) for controlling and/or receiving information from other components comprised in computing device 100-300, such as EEG sensors 101, motion sensor 102, communications module 104, loudspeaker/earphones 105, and display 106.
  • interface(s) 601 may be operative to acquire EEG sensor data and motion sensor data, either from built-in EEG sensors 101 and motion sensor 102, or from external EEG sensors 101 and motion sensors 102, via communications module 104.
  • the acquired EEG sensor data and motion sensor data may either be received as analog signals, which are digitized in processing means 600 for subsequent processing, or in a digital format.
  • Memory 603 contains computer-executable instructions 604, i.e., a computer program or software, for computing device 100-300 to become operative to perform in accordance with embodiments of the invention as described herein, when computer-executable instructions 604 are executed on processing unit 602.
  • processing means 700 comprises one or more interfaces 701 ("I/O" in Fig. 7) for controlling and/or receiving information from other components comprised in computing device 100-300, such as EEG sensors 101, motion sensor 102, communications module 104, loudspeaker/earphones 105, and display 106.
  • interface(s) 701 may be operative to acquire EEG sensor data and motion sensor data, either from built-in EEG sensors 101 and motion sensor 102, or from external EEG sensors 101 and motion sensors 102, via communications module 104.
  • the acquired EEG sensor data and motion sensor data may either be received as analog signals, which are digitized in processing means 700 for subsequent processing, or in a digital format.
  • Processing means 700 further comprises an EEG module 702, a motion module 703, a control module 704, and, optionally, a notification module 705, which are configured to cause computing device 100-300 to perform in accordance with embodiments of the invention as described herein.
  • EEG module 702 is configured to cause computing device 100-300 to acquire EEG sensor data from EEG sensors 101 contacting a skin of user 110 and detect a characteristic EEG data pattern in the acquired EEG data.
  • Motion module 703 is configured to acquire motion sensor data from motion sensor 102 attached to the head of user 110 and detect a characteristic movement of the head.
  • Control module 704 is configured to initiate a control operation which is associated with the performed head gesture. The control operation may, e.g., control an operation of computing device 100-300, or that of a controlled device which is separate from computing device 100-300. In the latter case, control module 704 is configured to initiate the control operation by transmitting, via communications module 104, a control signal pertaining to the control operation to the controlled device.
  • EEG module 702 and motion module 703 may be configured to acquire the EEG sensor data and the motion sensor data by receiving, via communications module 104, the EEG sensor data and the motion sensor data from EEG sensors 101 and motion sensor 102, respectively.
  • EEG module 702 may further be configured to derive the characteristic EEG data pattern during a learning or calibration phase.
  • motion module 703 may be configured to acquire the motion sensor data in response to detecting the characteristic EEG pattern.
  • Optional notification module 705 may be configured to notify user 110 in response to detecting the characteristic EEG pattern.
  • Interfaces 601 and 701, and modules 702-705, as well as any additional modules comprised in processing means 700, may be implemented by analogue electronic circuitry, by digital electronic circuitry, or by a processing means executing a suitable computer program, i.e., software.
  • Method 800 is performed by a computing device such as a mobile phone, a smartphone, a tablet, a personal computer, a laptop, a smartwatch, a wearable, a digital camera, a television, or a vehicle.
  • Method 800 comprises acquiring 801 EEG sensor data from EEG sensors contacting a skin of the user and detecting 802 a characteristic EEG data pattern in the acquired EEG data.
  • Method 800 further comprises acquiring 804 motion sensor data from at least one motion sensor attached to the head and detecting 805 a characteristic movement of the head.
  • Method 800 further comprises initiating 806 a control operation which is associated with the performed head gesture.
  • the control operation may, e.g., control an operation of the computing device.
  • the control operation may control an operation of a controlled device which is separate from the computing device.
  • in this case, initiating 806 the control operation comprises transmitting, via a communications module comprised in the computing device, a control signal pertaining to the control operation to the controlled device.
  • the control operation may, e.g., comprise any one of initiating a communication session, answering a request for establishing a communication session, terminating a communication session, starting execution of an application, stopping execution of an application, changing a setting, controlling rendering of media, or controlling a user-interface of the computing device or the controlled device.
  • the EEG sensor data and the motion sensor data may, e.g., be acquired 801/804 from EEG sensors and at least one motion sensor comprised in the computing device.
  • the EEG sensor data and the motion sensor data may be acquired by receiving, via a communications module comprised in the computing device, the EEG sensor data and the motion sensor data from the EEG sensors and the at least one motion sensor, respectively.
  • the motion sensor data is only acquired 804 if the characteristic EEG pattern is detected, as is illustrated in Fig. 8.
  • the motion sensor data may alternatively be acquired concurrently with the EEG sensor data.
  • method 800 may further comprise notifying 803 the user that the characteristic EEG pattern has been detected.
  • method 800 may further comprise deriving the characteristic EEG data pattern during a learning or calibration phase.
  • method 800 may comprise additional, or modified, steps in accordance with what is described throughout this disclosure.
  • An embodiment of method 800 may be implemented as software, such as computer program 604, to be executed by a processing unit comprised in the computing device, whereby the computing device becomes operative to perform in accordance with embodiments of the invention described herein.
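Tying the steps together, a minimal end-to-end sketch of method 800 is given below, under the embodiment where the motion sensor data is only acquired once the characteristic EEG pattern has been detected; every argument is a hypothetical callback standing in for a component sketched above.

```python
def method_800(acquire_eeg, classify_mode, notify,
               acquire_motion, match_gesture, dispatch):
    eeg = acquire_eeg()              # step 801: acquire EEG sensor data
    mode = classify_mode(eeg)        # step 802: detect the characteristic EEG data pattern
    if mode is None:
        return                       # no control mode entered
    notify(mode)                     # step 803 (optional): notify the user
    motion = acquire_motion()        # step 804: acquire motion sensor data
    gesture = match_gesture(motion)  # step 805: detect the characteristic head movement
    if gesture is not None:
        dispatch(mode, gesture)      # step 806: initiate the associated control operation
```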

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Dermatology (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A computing device (100, 150) for initiating a control operation in response to a head gesture (112-114) performed by a user (110) of the computing device is provided. The computing device is operative to acquire Electroencephalogram (EEG) sensor data from EEG sensors (101) contacting a skin of the user, detect a characteristic EEG data pattern in the acquired EEG data, which characteristic EEG data pattern is associated with a control mode, acquire motion sensor data from at least one motion sensor (102) attached to the head, detect a characteristic movement of the head, which characteristic movement is commensurate with the performed head gesture, and initiate a control operation which is associated with the performed head gesture. The detected characteristic EEG pattern is commensurate with a specific mental gesture imagined by the user. The control operation may, e.g., relate to controlling (121-124) play-out of media.

Description

INITIATING A CONTROL OPERATION IN RESPONSE TO A HEAD
GESTURE
Technical field
The invention relates to a computing device for initiating a control operation in response to a head gesture performed by a user of the computing device, a method of initiating a control operation in response to a head gesture performed by a user of a computing device, a corresponding computer program, and a corresponding computer program product.
Background
Known solutions for controlling the operation of computing devices are oftentimes based on touch or voice control.
Today, touch control is the primary interaction method for most users and is used to control the operation of, and interact with, various types of computing devices. In particular, this is true for mobile computing devices such as smartphones and tablets, which users control through buttons or a touchscreen displaying a graphical user-interface. Typically, the possibility to interact with the touchscreen is disabled when the device is not intentionally being used, in order to avoid input of unintentional commands. For example, touchscreen-based devices are oftentimes able to detect that they are stored in a pocket, or held to a person's ear, and the touchscreen is disabled accordingly.
Known touch-control solutions typically require a user to touch a touchscreen several times, or to press several buttons, in order to activate a function. In addition, they oftentimes require the user to be looking at the screen. This prevents or impedes control of the computing devices when the devices are stored in a pocket, or in situations where the user's vision or hands are occupied, such as when the user is riding a bike or driving a car. Voice control solutions offer an alternative way of interacting with computing devices, and are based on voice recognition software to interpret commands spoken by a user. These solutions suffer from the disadvantage that a user first needs to activate the voice control functionality of the device. For example, Apple's Siri is activated by pressing the home button for two seconds. Alternative voice activation techniques may require the user to utter a key phrase, such as "OK Google" or "Alexa", in order to activate the voice control function of a computing device.
Moreover, whilst the user may not be required to touch the computing device or to retrieve it from a pocket, voice control solutions may suffer in noisy environments which make the user's voice commands difficult to recognize and render a voice-controlled device prone to accidental activation of a function. Voice control may also constitute a security risk, as voice control commands may be mimicked with malicious intent, e.g., by giving voice commands to a device through hidden audio embedded in a YouTube video (see, e.g., "Hidden Voice Commands", by N. Carlini, P. Mishra, T. Vaidya, Y. Zhang, M. Sherr, C. Shields, D. Wagner, and W. Zhou, Proceedings of the 25th USENIX Security Symposium, pages 513-530, USENIX Association, 2016). In addition, many people do not like to talk to devices in public, partly due to privacy concerns arising from speaking commands aloud.
It is also known to utilize Electroencephalogram (EEG) sensor data, commonly referred to as brain waves, to control mobile devices using electrode caps or headsets, known as Brain-Computer Interfaces (BCIs), to record signals from the user's brain and use them to control a computing device (see, e.g., "Samsung Demos a Tablet Controlled by Your Brain", by S. Young Rojahn, MIT Technology Review, Biomedicine, 19 April 2013, retrieved on 17 January 2017). However, BCIs often lack signal resolution and are vulnerable to noise from muscular activity or poor contact of the sensors with the skin. As a consequence, these solutions offer only crude control over functions.
Summary
It is an object of the invention to provide an improved alternative to the above techniques and prior art.
More specifically, it is an object of the invention to provide improved solutions for controlling a computing device, in particular controlling an operation or a function of a computing device.
These and other objects of the invention are achieved by means of different aspects of the invention, as defined by the independent claims. Embodiments of the invention are characterized by the dependent claims.
According to a first aspect of the invention, a computing device for initiating a control operation is provided. The control operation is initiated in response to a head gesture performed by a user of the computing device. The computing device comprises processing means operative to acquire EEG sensor data from EEG sensors contacting a skin of the user, and to detect a characteristic EEG data pattern in the acquired EEG data. The characteristic EEG data pattern is associated with a control mode. The processing means is further operative to acquire motion sensor data from at least one motion sensor attached to the head, and to detect a characteristic movement of the head. The characteristic movement is commensurate with the performed head gesture. The processing means is further operative to initiate a control operation which is associated with the performed head gesture.
According to a second aspect of the invention, a method of initiating a control operation is provided. The control operation is initiated in response to a head gesture performed by a user of a computing device. The method comprises acquiring EEG sensor data from EEG sensors contacting a skin of the user, and detecting a characteristic EEG data pattern in the acquired EEG data. The characteristic EEG data pattern is associated with a control mode. The method further comprises acquiring motion sensor data from at least one motion sensor attached to the head, and detecting a characteristic movement of the head. The characteristic movement is commensurate with the performed head gesture. The method further comprises initiating a control operation which is associated with the performed head gesture.
According to a third aspect of the invention, a computer program is provided. The computer program comprises computer-executable instructions for causing a device to perform the method according to an embodiment of the second aspect of the invention, when the computer-executable instructions are executed on a processing unit comprised in the device.
According to a fourth aspect of the invention, a computer program product is provided. The computer program product comprises a computer-readable storage medium which has the computer program according to the third aspect of the invention embodied therein.
The invention makes use of an understanding that an improved solution for enabling a user of a computing device to control an operation of the computing device, or a controlled device which is separate from the computing device, may be achieved by initiating a control operation in dependence on a combination of a mental gesture imagined by the user and a head gesture performed by the user. The detected characteristic EEG pattern is commensurate with the mental gesture imagined by the user, and the characteristic movement of the head is commensurate with the head gesture performed by the user.
In practice, the user may control an operation or function of the computing device, or a controlled device, by imagining a specific mental gesture to enter a control mode which enables the user to initiate a control operation using head gestures, e.g., a change in orientation of the head and/or a rotation of the head. The mental gesture imagined by the user may, e.g., be a spinning cube, a face, or similar. Preferably, the mental gesture which is associated with the control mode is selected so as to be easily detectable. Optionally, several control modes may be defined, corresponding to different control operations such as changing a volume setting and changing a brightness setting, respectively. In this case, each control mode is associated with a different mental gesture and a corresponding characteristic EEG data pattern. For instance, the user may activate volume control by thinking about a cube, whereas brightness control is activated by thinking about a face.
Advantageously, embodiments of the invention relieve the user from retrieving the computing device from a pocket and/or touching the computing device. Thereby, a more reliable control of an operation of the computing device, or a controlled device, is achieved in comparison with solutions relying on EEG data alone. In addition, embodiments of the invention provide improved security over solutions relying on voice control, as it is more difficult for an adversary to initiate a control operation. Also, as no audible input is required, embodiments of the invention are advantageous in situations where privacy and etiquette are a concern, and the technique is insensitive to acoustically noisy environments.
In the present context, the control operation may, e.g., relate to a communication session, such as a voice call, a video call, a chat session, or the like. Alternatively, the control operation may relate to execution of a software application or app, to rendering of media, or to a user-interface of the computing device or the controlled device. The computing device, or the controlled device, may, e.g., be a mobile phone, a smartphone, a tablet, a personal computer, a laptop, a smartwatch, a wearable, a digital camera, or a television. For instance, the control operation may effect changing a volume of media play-out, changing a brightness of a display, scrolling a page, moving a cursor, changing a setting associated with the device or a software application, answering an incoming call, and so forth. The computing device or the controlled device may also be a vehicle, such as a car or an Unmanned Aerial Vehicle (UAV), also known as a drone, and the control operation may relate to controlling the vehicle, e.g., steering the vehicle.
According to an embodiment of the invention, the characteristic EEG data pattern is derived during a calibration phase. Advantageously, by letting the user imagine a variety of mental gestures and evaluating the resulting EEG data, one or more specific mental gestures may be identified which can be reliably detected in the acquired EEG sensor data. The characteristic EEG patterns which are triggered by certain mental gestures vary between individuals and are preferably learned for each user. A comparison of acquired EEG sensor data to reference EEG sensor data which was acquired during the calibration phase is subsequently used for classifying the acquired EEG sensor data, in particular to detect whether the user has imagined a specific mental gesture which is associated with the control mode.
According to an embodiment of the invention, the user is notified in response to detecting the characteristic EEG pattern. For instance, the computing device may emit an audible signal, display a visual notification, or provide the user with haptic feedback. Thereby the user is notified, or alerted, that the computing device has successfully detected the characteristic EEG pattern and thereby recognized the user's intention to enter the control mode, allowing the user to initiate the desired control operation by performing a head gesture.
According to an embodiment of the invention, the computing device further comprises EEG sensors arranged for contacting a skin of the user, and at least one motion sensor for measuring a movement of the head. That is, the EEG sensors and the motion sensors are provided with the computing device. The computing device may, e.g., be a BCI headset, an in-ear EEG device, or an around-ear EEG device.
According to another embodiment of the invention, the computing device further comprises a communications module, and the EEG sensor data and the motion sensor data are acquired by receiving the EEG sensor data and the motion sensor data, via the communications module, from the EEG sensors and the at least one motion sensor, respectively. For instance, the computing device may receive the EEG sensor data and the motion sensor data from a BCI headset, an in-ear EEG device, or an around-ear EEG device.
Even though advantages of the invention have in some cases been described with reference to embodiments of the first aspect of the invention, corresponding reasoning applies to embodiments of other aspects of the invention.
Further objectives of, features of, and advantages with, the invention will become apparent when studying the following detailed disclosure, the drawings, and the appended claims. Those skilled in the art realize that different features of the invention can be combined to create embodiments other than those described in the following.
Brief description of the drawings
The above, as well as additional objects, features and advantages of the invention, will be better understood through the following illustrative and non-limiting detailed description of embodiments of the invention, with reference to the appended drawings, in which:
Fig. 1 shows computing devices for initiating a control operation in response to a head gesture performed by a user of the computing device, in accordance with embodiments of the invention.
Fig. 2 shows a computing device for initiating a control operation in response to a head gesture performed by a user of the computing device, in accordance with another embodiment of the invention.
Fig. 3 shows a computing device for initiating a control operation in response to a head gesture performed by a user of the computing device, in accordance with a further embodiment of the invention.
Fig. 4 illustrates answering an incoming call, as an example of a control operation initiated by an embodiment of the invention.
Fig. 5 illustrates scrolling a page, as another example of a control operation initiated by an embodiment of the invention.
Fig. 6 shows an embodiment of the processing means comprised in the computing device for initiating a control operation in response to a head gesture performed by a user of the computing device.
Fig. 7 shows another embodiment of the processing means comprised in the computing device for initiating a control operation in response to a head gesture performed by a user of the computing device.
Fig. 8 shows a method of initiating a control operation in response to a head gesture performed by a user of a computing device, in accordance with embodiments of the invention.
All the figures are schematic, not necessarily to scale, and generally only show parts which are necessary in order to elucidate the invention, wherein other parts may be omitted or merely suggested.
Detailed description

The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
In Fig. 1, an embodiment 100 of the computing device for initiating a control operation in response to a head gesture performed by a user 110 of the computing device is illustrated as a tablet, a smartphone, or a phablet (a device which is intermediate in size between that of a smartphone and that of a tablet). Computing device 100 is illustrated to comprise a processing means 103, a communications module 104, and a display 106, e.g., a touchscreen. It will be appreciated that computing device 100 may further comprise additional components, e.g., a microphone, a loudspeaker 105, or the like.
Communications module 104 is operative to effect wireless communications through a Wireless Local Area Network (WLAN)/Wi-Fi network, Bluetooth, ZigBee, or any other short-range communications technology. Alternatively, or additionally, communications module 104 may further be operative to effect wireless communications with a Radio Access Network (RAN) or with another compatible device, based on a cellular telecommunications technique such as the Global System for Mobile communications (GSM), Universal Mobile Telecommunications System (UMTS), Long Term Evolution (LTE), or any 5G standard, e.g., Next Generation (NG) and New Radio (NR).
Computing device 100 is operative to acquire EEG sensor data from a set of EEG sensors 101 contacting a skin of user 110. EEG is a technique which can be used for detecting a subject's brain activity by placing sensors, i.e., electrodes, on the subject's scalp or other parts of the subject's head, e.g., within the ear channel and around the ear ("The neurophysiological bases of EEG and EEG measurement: a review for the rest of us", by A. F. Jackson and D. J. Bolger, Psychophysiology, vol. 51, pages 1061-1071, Wiley, 2014). These electrodes are used for measuring small electric potentials which are generated by action potentials of firing neurons, which are electrochemical excitations caused by the creation of an ion current in the cell's axon to activate connected cells through the synapses. Whereas the most common method to capture EEG signals is by placing the electrodes directly on the scalp of the subject, similar to what is illustrated in Fig. 1, it has recently been demonstrated that EEG signals from within the subject's ear channel can be detected and reliably classified ("Classifying Mental Gestures with In-Ear EEG", by N. Merrill, M. T. Curran, J.-K. Yang, and J. Chuang, IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pages 130-135, IEEE, 2016).
For instance, as is illustrated in Fig. 1, the EEG sensor data may be acquired from a BCI headset 150 which user 110 is wearing, comprising EEG sensors 101, i.e., electrodes, which are arranged for contacting a scalp and/or forehead of user 110 and capturing nerve signals from user 110. The EEG sensor data may, e.g., be acquired by receiving the EEG sensor data from BCI headset 150 via communications module 104 and a compatible communications module 104 comprised in BCI headset 150.
Computing device 100 is further operative to detect a characteristic EEG data pattern in the acquired EEG data. The characteristic EEG data pattern is associated with a control mode which allows user 110 to control computing device 100, or a separate controlled device, using head gestures. The detected characteristic EEG pattern is commensurate with a specific mental gesture imagined by user 110, such as a spinning cube, a face, or the like. This specific mental gesture is preferably determined during a learning or calibration phase, as the ability to reliably detect mental gestures imagined by an individual varies between subjects. More specifically, during the calibration or learning phase EEG sensor data is acquired from EEG sensors 101 while user 110 is imagining a mental gesture which is selected to activate the control mode of computing device 100, i.e., which allows user 110 to control computing device 100, or a separate controlled device, by means of head gestures. The thereby acquired EEG sensor data is stored as reference EEG sensor data. Subsequently, during normal operation of computing device 100, i.e., after the learning or calibration phase has been completed, the characteristic EEG data pattern is detected by comparing EEG sensor data acquired from EEG sensors 101 to the stored reference EEG sensor data, and determining whether a similarity condition between the two sets of EEG sensor data is fulfilled. In practice, this may be achieved by calculating a correlation between the two data sets and comparing the calculated correlation to a threshold value.
As is known in the art, correlation is a statistical relationship which reflects the extent to which two random variables, such as the acquired EEG sensor data and stored reference EEG sensor data, are related with each other. The correlation between two random variables is commonly referred to as cross-correlation and can be quantified by means of a correlation function, which can be expressed as an integral over the two random variables over time. Typically, correlation functions are normalized such that a perfect correlation between the two random variables, i.e., the two random variables are identical, results in a maximum value which oftentimes is chosen to be equal to one ("1"). Correspondingly, the correlation of two completely independent random variables yields a correlation value of zero ("0"). An example is the well-known Pearson product-moment correlation coefficient.
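By way of illustration, the following minimal Python sketch shows how such a similarity condition could be evaluated; the function name, the assumption of equal-length, pre-aligned data windows, and the threshold value of 0.8 are illustrative assumptions and not part of the claimed subject matter:

```python
import numpy as np

def detect_characteristic_pattern(acquired, reference, threshold=0.8):
    """Return True if the acquired EEG sensor data fulfils the similarity
    condition with respect to the stored reference EEG sensor data."""
    # Normalize both windows to zero mean and unit variance, so the
    # correlation is insensitive to signal offset and amplitude.
    a = (acquired - acquired.mean()) / acquired.std()
    r = (reference - reference.mean()) / reference.std()
    # Normalized (Pearson) correlation: 1 = identical, 0 = independent.
    correlation = float(np.dot(a, r) / len(a))
    return correlation >= threshold
```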
A more reliable detection of the characteristic EEG pattern in the acquired EEG sensor data may be achieved by utilizing a machine-learning algorithm which is trained during a calibration or learning phase, and using the trained machine-learning algorithm for classifying EEG sensor data which is acquired during normal operation of computing device 100 ("Machine-Learning-Based Coadaptive Calibration for Brain-Computer Interfaces", by C. Vidaurre, C. Sannelli, K.-R. Müller, and B. Blankertz, Neural Computation, vol. 23, pages 791-816, MIT Press Journals, 2011). Alternatively, the characteristic EEG pattern may be detected by means of a Support Vector Classifier (SVC), a technique which has been demonstrated to work reliably even with in-ear EEG ("Classifying Mental Gestures with In-Ear EEG", by N. Merrill, M. T. Curran, J.-K. Yang, and J. Chuang, IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pages 130-135, IEEE, 2016).
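One possible realization of such a classifier-based detection is sketched below with scikit-learn; it assumes that suitable feature vectors (e.g., band-power features) have already been extracted from the EEG windows, and the function names and label convention are assumptions made for the sketch:

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def train_gesture_classifier(features, labels):
    """Calibration phase: each row of features is a feature vector for one
    EEG window; label 1 marks windows in which the user imagined the
    mental gesture associated with the control mode, label 0 all others."""
    classifier = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    classifier.fit(features, labels)
    return classifier

def control_mode_requested(classifier, window_features):
    """Normal operation: classify a newly acquired EEG window."""
    return classifier.predict([window_features])[0] == 1
```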
Computing device 100 is further operative to acquire motion sensor data from at least one motion sensor 102 attached to the head, and detect a characteristic movement of the head which is commensurate with the performed head gesture. The motion sensor data may, e.g., be acquired by receiving the motion sensor data from BCI headset 150, via communications module 104 and a compatible communications module 104 comprised in BCI headset 150. Motion sensor 102 may, e.g., comprise one or more inertial sensors, accelerometers, gyroscopes, or any combination thereof. As is known in the art, accelerometers of the type which are provided with today's mobile phones, smartphones, and tablets can be used to detect the orientation of the device. The gyroscope adds an additional dimension to the information supplied by the accelerometer by tracking a rotation or twist of the device. More specifically, an accelerometer measures a linear acceleration of movement, while a gyroscope measures an angular rotational velocity. By combining information obtained from an accelerometer and a gyroscope, a change in orientation and/or rotation of a device can be reliably determined. The accuracy of the acquired motion sensor data may be improved by employing differential measurements between at least two motion sensors 102 which are attached at separate locations on the head of user 110, e.g., one motion sensor 102 on the left side of the head and one motion sensor 102 on the right side of the head, e.g., close to the ears.
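As an illustration of such sensor fusion, a single update step of a complementary filter estimating head pitch could look as follows; the blending factor alpha and the axis conventions are assumptions made for this sketch:

```python
import math

def complementary_filter_step(pitch, accel, gyro_pitch_rate, dt, alpha=0.98):
    """Fuse accelerometer and gyroscope readings into one pitch estimate.
    accel: (ax, ay, az) in m/s^2; gyro_pitch_rate: rad/s; dt: seconds."""
    ax, ay, az = accel
    # The accelerometer yields an absolute but noisy pitch from gravity.
    accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    # The gyroscope yields a smooth but drifting pitch by integrating rate.
    gyro_pitch = pitch + gyro_pitch_rate * dt
    # Trust the gyroscope short-term and the accelerometer long-term.
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
```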
Hence, motion sensor 102 comprised in BCI headset 150 which is worn by user 110 can be used to determine whether user 110 is performing a specific head gesture which, e.g., involves tilting the head backwards or forwards 112, tilting the head left or right 113, rotating the head clockwise or counter-clockwise 114, or any combination thereof. If performed in combination, the different movements may either be performed simultaneously or in a specific sequence. As an example, a head gesture may involve tilting the head forwards followed by a clockwise rotation of the head.
The detection of a characteristic movement of the head which is commensurate with the performed head gesture is achieved by comparing the acquired motion sensor data to reference motion sensor data, and concluding that the characteristic movement of the head is detected if the acquired motion sensor data and the reference motion sensor data fulfil a similarity condition. The reference motion sensor data may, e.g., be obtained and stored by computing device 100 during a learning or calibration phase, by acquiring motion sensor data from motion sensor 102 while user 110 performs a head gesture which he or she wishes to use for initiating an associated control operation. Preferably, different sets of reference motion sensor data corresponding to multiple head gestures may be obtained and stored by computing device 100 so as to enable user 110 to initiate different control operations, as is described further below. The comparison of the acquired motion sensor data and the reference motion sensor data may, e.g., be performed by calculating a correlation between the two data sets, as is described hereinbefore, and comparing the calculated correlation to a threshold value.
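The comparison against several stored reference gestures could be sketched as follows, reusing the normalized-correlation measure from above; the gesture names, the equal-length traces, and the threshold of 0.7 are illustrative assumptions:

```python
import numpy as np

def match_head_gesture(trace, reference_gestures, threshold=0.7):
    """Return the name of the best-matching reference head gesture, or
    None if no reference trace fulfils the similarity condition."""
    best_name, best_score = None, threshold
    for name, reference in reference_gestures.items():
        a = (trace - trace.mean()) / trace.std()
        r = (reference - reference.mean()) / reference.std()
        score = float(np.dot(a, r) / len(a))
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```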
Computing device 100 is further operative to initiate a control operation which is associated with the performed head gesture. The control operation may control an operation of computing device 100, or an operation of a controlled device which is separate from computing device 100, as is described in further detail below. To give a few examples, the control operation may, e.g., comprise controlling rendering or play-out of media, such as music or video, as is illustrated in Fig. 1 by a media control center 120 which is displayed by computing device 100. For instance, a certain head gesture may initiate changing a media track, such as a song (e.g., "Born to run"), which is currently rendered by computing device 100 and/or played-out using loudspeaker 105 or earphones 105 operatively connected to computing device 100. The initiated control operation has a similar effect as utilizing a user-interface element 121 or 122 for skipping to the next or the preceding track, respectively. Alternatively, the performed head gesture may initiate increasing or decreasing a volume of audio which is currently played-out by computing device 100, similar to utilizing a user-interface element 123 or 124 for increasing or decreasing the volume, respectively.
Preferably, different head gestures, each head gesture being commensurate with a detectable characteristic movement of the head, may be associated with different control operations. For instance, tilting the head forward may increase the volume, whereas tilting the head backward decreases the volume. Likewise, tilting the head to the right may skip to the next track, whereas tilting the head to the left skips to a preceding track.
As a further example, the control operation may comprise initiating a communication session, such as a voice call, a video call, a chat, or any other type of communication session received by computing device 100 or a separate controlled device. Alternatively, the control operation may comprise answering a request for establishing a communication session, or terminating a communication session. For instance, as is illustrated in Fig. 4, computing device 100 may be operative to answer an incoming call in response to a first head gesture, such as nodding the head in a way a person would do to indicate confirmation or approval (a silent "Yes"), instead of touching a user-interface element 421. Correspondingly, computing device 100 may be operative to reject or deny an incoming call in response to a second head gesture, such as shaking the head in a way a person would do to indicate disagreement (a silent "No"), instead of touching a user-interface element 422.
As yet a further example, and with reference to Fig. 5, the control operation may comprise controlling a user-interface of computing device 100, or a separate controlled device. For instance, computing device 100 may be operative to scroll a displayed page 520 upwards in response to a first head gesture, such as tilting the head backward, instead of touching a user-interface element 521, and to scroll page 520 downwards in response to a second head gesture, such as tilting the head forward, instead of touching a user-interface element 522. As an alternative to scrolling a displayed page, computing device 100 may be operative to move a cursor of a user-interface, or a focus selecting a user-interface element, in response to a head gesture.
The control operation may alternatively relate to any other operation or function of computing device 100, or a separate controlled device, and may, e.g., comprise starting execution of an application, i.e., a software application or app, stopping execution of an application, changing a setting of the computing device, a controlled device, or an application, and so forth. For instance, one or more head gestures in combination with a mental gesture may be used for adjusting the brightness of display 106, or waking computing device 100 or a separate controlled device from sleep-mode.
Computing device 100 may optionally be operative to derive a measure which is related to the performed head gesture from the acquired motion sensor data, such as the distance, or angle, travelled when tilting or rotating the head. The initiated control operation may additionally be dependent on the derived measure. For instance, if the control mode allows user 110 to control the volume of audio played-out by computing device 100, the change in volume which is effected by initiating the control operation may be dependent on the derived measure, e.g., the angle by which the head is tilted forwards or backwards.
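Such a measure-dependent control operation could, purely as an illustration, map the derived tilt angle to a volume change; the gain and clamping values below are assumptions made for the sketch:

```python
def volume_delta_from_tilt(tilt_angle_deg, gain=0.5, max_delta=20.0):
    """Map a measured head-tilt angle (degrees) to a volume change in
    percent: a forward tilt (positive) raises the volume, a backward
    tilt (negative) lowers it, clamped to +/- max_delta."""
    delta = gain * tilt_angle_deg
    return max(-max_delta, min(max_delta, delta))
```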
Computing device 100 may optionally be operative to notify, or alert, user 110 in response to detecting the characteristic EEG pattern. For instance, computing device 100 may emit an audible signal, display a visual notification, or provide haptic feedback, e.g., a vibration. Thereby, user 110 is notified that computing device 100 has successfully detected the mental gesture which is associated with the control mode, and has accordingly recognized the intention of user 110 to perform a head gesture in order to initiate a control operation.
Optionally, computing device 100 may be operative to acquire the motion sensor data in response to detecting the characteristic EEG pattern. This is advantageous in that motion sensor data is only acquired and analyzed when the intention of user 110 to enter the control mode has been detected. As an alternative, motion sensor data may be acquired concurrently with acquiring the EEG sensor data and detecting the characteristic EEG pattern. In this case, the control operation is initiated in response to detecting the characteristic EEG data pattern in the acquired EEG data in combination with detecting the characteristic movement of the head.
With respect to detecting a characteristic EEG data pattern in the acquired EEG data, it will be appreciated that several distinct control modes may be defined which correspond to different control operations, such as changing volume, changing brightness, changing a currently played track, and so forth. Each control mode is associated with a specific mental gesture and a corresponding characteristic EEG data pattern. For instance, thinking about a cube may activate volume control, whereas thinking about a face activates brightness control. In order to detect a characteristic EEG pattern in the acquired EEG data, a comparison between each one of a set of reference EEG sensor data and the acquired EEG sensor data is performed. Depending on which reference EEG sensor data best matches the acquired EEG sensor data, i.e., is most similar, the corresponding control mode is activated or entered. The best-matching reference EEG sensor data may, e.g., be determined by calculating a correlation between the acquired EEG sensor data and each reference EEG sensor data, and activating the control mode which corresponds to the reference EEG sensor data which has the highest correlation with the acquired EEG sensor data. Alternatively, the control mode which the user intends to activate may be identified by machine-learning techniques ("Machine-Learning-Based Coadaptive Calibration for Brain-Computer Interfaces", by C. Vidaurre, C. Sannelli, K.-R. Müller, and B. Blankertz, Neural Computation, vol. 23, pages 791-816, MIT Press Journals, 2011) or by means of SVCs ("Classifying Mental Gestures with In-Ear EEG", by N. Merrill, M. T. Curran, J.-K. Yang, and J. Chuang, IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pages 130-135, IEEE, 2016). If computing device 100 is operative to notify user 110 in response to detecting the characteristic EEG pattern, computing device 100 may further be operative to provide an indication of the activated control mode to user 110, e.g., whether volume control or brightness control is activated.
As an alternative to acquiring the EEG sensor data from BCI headset 150, i.e., from the scalp and/or forehead of user 110, the EEG sensor data may be acquired from an in-ear EEG device 200, as is illustrated in Fig. 2. In-ear EEG device 200 is designed for insertion into the ear channel of ear 111, similar to an earbud, and comprises EEG sensors or electrodes 101 which are arranged for contacting the skin within the ear channel. The use of EEG sensor data captured by in-ear EEG devices, and the reliable classification of the EEG sensor data using SVCs, have recently been demonstrated ("Classifying Mental Gestures with In-Ear EEG", by N. Merrill, M. T. Curran, J.-K. Yang, and J. Chuang, IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pages 130-135, IEEE, 2016). In-ear EEG devices are commercially available, e.g., "Aware" from United Sciences (http://efitaware.com/, retrieved on 1 December 2016). In-ear EEG device 200 further comprises at least one motion sensor 102, similar to what is described with reference to motion sensor 102 comprised in computing device 100. The EEG sensor data and the motion sensor data may be acquired from in-ear EEG device 200 via communications module 104 and a compatible communications module 104 comprised in in-ear EEG device 200.
As a further alternative, the EEG sensor data may be acquired from an around-ear EEG device 300, as is illustrated in Fig. 3. Around-ear EEG device 300 comprises EEG sensors or electrodes 101 which are arranged for contacting the skin around ear 111 of user 110. The use of EEG sensor data from around-ear EEG devices has, e.g., been reported in "Target Speaker Detection with Concealed EEG Around the Ear", by B. Mirkovic, M. G. Bleichner, M. De Vos, and S. Debener, Frontiers in Neuroscience, vol. 10, article 349, 2016. Around-ear EEG device 300 further comprises at least one motion sensor 102, similar to what is described with reference to motion sensor 102 comprised in computing device 100. The EEG sensor data and the motion sensor data may be acquired from around-ear EEG device 300 via communications module 104 and a compatible communications module 104 comprised in around-ear EEG device 300.
It will be appreciated that, depending on the type of EEG sensors or electrodes 101 comprised in BCI headset 150, in-ear EEG device 200, or around-ear EEG device 300, and the number of electrodes used and their placement on the skin of user 110, additional decoding, data fusion, or data processing may be required for deriving EEG sensor data which is suitable for detecting the characteristic EEG data pattern. This may, e.g., be achieved using a decoder which implements a transfer function for transforming the raw EEG signals captured by several EEG electrode channels into one time-dependent function. The signal representing the one time-dependent function may, e.g., be used as input for calculating a correlation between the acquired EEG sensor data and the reference EEG sensor data.
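Such a decoder could be as simple as a weighted combination of the electrode channels; the weighted average below is only one possible transfer function, and the uniform default weights are an assumption made for the sketch:

```python
import numpy as np

def fuse_eeg_channels(raw_channels, weights=None):
    """Collapse raw EEG signals from several electrode channels into one
    time-dependent signal suitable for pattern detection."""
    raw = np.asarray(raw_channels)            # shape: (channels, samples)
    if weights is None:
        weights = np.ones(raw.shape[0]) / raw.shape[0]
    return weights @ raw                      # shape: (samples,)
```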
Further with reference to Figs. 1 to 3, alternative embodiments of the computing device for initiating a control operation in response to a head gesture performed by user 110 may be based on any one of BCI headset 150, in-ear EEG device 200, and around-ear EEG device 300, respectively.
More specifically, and with reference to Fig. 1, computing device 150 is similar to computing device 100 but comprises, in addition to processing means 103 and communications module 104, EEG sensors 101 arranged for contacting a skin of user 110 and at least one motion sensor 102 for measuring a movement of the head. Similar to what is described with reference to computing device 100 hereinbefore, computing device 150 is operative to acquire EEG sensor data from EEG sensors 101, detect a characteristic EEG data pattern in the acquired EEG data, acquire motion sensor data from motion sensor 102, detect a characteristic movement of the head, and initiate a control operation which is associated with the performed head gesture. The control operation may either control an operation of computing device 150, or the operation of a controlled device which is separate from computing device 150, such as changing a volume setting.
With reference to Fig. 2, a further alternative embodiment 200 of the computing device for initiating a control operation in response to a head gesture performed by user 110 may be based on in-ear EEG device 200. Computing device 200 is similar to computing device 100 but comprises, in addition to processing means 103 and communications module 104, EEG sensors 101 arranged for contacting a skin of user 110 inside the ear channel of ear 111, and at least one motion sensor 102 for measuring a movement of the head. Similar to what is described with reference to computing device 100 hereinbefore, computing device 200 is operative to acquire EEG sensor data from EEG sensors 101, detect a characteristic EEG data pattern in the acquired EEG data, acquire motion sensor data from motion sensor 102, detect a characteristic movement of the head, and initiate a control operation which is associated with the performed head gesture. The control operation may either control an operation of computing device 200, or the operation of a controlled device which is separate from computing device 200, such as changing a volume setting.
With reference to Fig. 3, yet a further embodiment 300 of the computing device for initiating a control operation in response to a head gesture performed by user 110 may be based on around-ear EEG device 300. Computing device 300 is similar to computing device 100 but comprises, in addition to processing means 103 and communications module 104, EEG sensors 101 arranged for contacting a skin of user 110 around ear 111 of user 110, and at least one motion sensor 102 for measuring a movement of the head. Similar to what is described with reference to computing device 100 hereinbefore, computing device 300 is operative to acquire EEG sensor data from EEG sensors 101, detect a characteristic EEG data pattern in the acquired EEG data, acquire motion sensor data from motion sensor 102, detect a characteristic movement of the head, and initiate a control operation which is associated with the performed head gesture. The control operation may either control an operation of computing device 300, or the operation of a controlled device which is separate from computing device 300, such as changing a volume setting.
Embodiments of the invention which control operation of a controlled device which is separate from the computing device utilize communications module 104 for transmitting a control signal pertaining to the control operation to the controlled device. The control signal may be transmitted by using any suitable protocol, e.g., the HyperText Transfer Protocol (HTTP), the Constrained Application Protocol (CoAP), or the like, and triggers the controlled device to effect the control operation. In general, the controlled device may be any type of computing device comprising a communications module which is operative to receive a control signal from an embodiment of the computing device for initiating a control operation in response to a head gesture performed by user 110, including, e.g., mobile phones, smartphones, tablets, personal computers, laptops, smartwatches, wearables, digital cameras, household appliances, televisions, and vehicles, such as a car or a UAV. For instance, the controlled device may, e.g., be smartphone 100 described with reference to Fig. 1, and an embodiment of the computing device for initiating a control operation in response to a head gesture performed by user 110, such as BCI headset 150, in-ear EEG device 200, or around-ear EEG device 300, may be used for controlling an operation of smartphone 100 through a combination of a mental gesture and a head gesture, as is described hereinbefore.
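A minimal sketch of transmitting such a control signal over HTTP is given below; the endpoint path, the JSON schema, and the example address are purely illustrative assumptions, and CoAP or another protocol could be used instead:

```python
import requests

def send_control_signal(device_url, operation, value=None):
    """Transmit a control signal pertaining to the control operation to
    the controlled device, which is assumed to expose an HTTP endpoint."""
    payload = {"operation": operation, "value": value}
    response = requests.post(f"{device_url}/control", json=payload, timeout=2)
    response.raise_for_status()

# For instance, raising the media volume on a separate controlled device:
# send_control_signal("http://192.168.0.42:8080", "volume_up", 5)
```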
It will be appreciated that, whereas embodiments of the invention have been described with reference to smartphone 100, BCI headset 150, in-ear EEG device 200, and around-ear EEG device 300, one may easily envisage alternative embodiments of the invention which are based on, e.g., a mobile phone, a smartphone, a tablet, a personal computer, a laptop, a smartwatch, a wearable, a digital camera, a television, and a vehicle, such as a car or a UAV.
In the following, embodiments of processing means 103 comprised in an embodiment 100, 150, 200, or 300 of the computing device for initiating a control operation in response to a head gesture performed by user 110 (hereinafter referred to as computing device 100-300) are described with reference to Figs. 6 and 7.
A first embodiment 600 of processing means 103 is shown in Fig. 6. Processing means 600 comprises a processing unit 602, such as a general purpose processor, and a computer-readable storage medium 603, such as a Random Access Memory (RAM), a Flash memory, or the like. In addition, processing means 600 comprises one or more interfaces 601 ("I/O" in Fig. 6) for controlling and/or receiving information from other components comprised in computing device 100-300, such as EEG sensors 101, motion sensor 102, communications module 104, loudspeaker/earphones 105, and display 106. In particular, interface(s) 601 may be operative to acquire EEG sensor data and motion sensor data, either from built-in EEG sensors 101 and motion sensor 102, or from external EEG sensors 101 and motion sensors 102, via communications module 104. The acquired EEG sensor data and motion sensor data may either be received as analog signals, which are digitized in processing means 600 for subsequent processing, or in a digital format.
Memory 603 contains computer-executable instructions 604, i.e., a computer program or software, for computing device 100-300 to become operative to perform in accordance with embodiments of the invention as described herein, when computer-executable instructions 604 are executed on processing unit 602.
An alternative embodiment 700 of processing means 103 is illustrated in Fig. 7. Similar to processing means 600, processing means 700 comprises one or more interfaces 701 ("I/O" in Fig. 7) for controlling and/or receiving information from other components comprised in computing device 100-300, such as EEG sensors 101, motion sensor 102, communications module 104, loudspeaker/earphones 105, and display 106. In particular, interface(s) 701 may be operative to acquire EEG sensor data and motion sensor data, either from built-in EEG sensors 101 and motion sensor 102, or from external EEG sensors 101 and motion sensors 102, via communications module 104. The acquired EEG sensor data and motion sensor data may either be received as analog signals, which are digitized in processing means 700 for subsequent processing, or in a digital format. Processing means 700 further comprises an EEG module 702, a motion module 703, a control module 704, and, optionally, a notification module 705, which are configured to cause computing device 100-300 to perform in accordance with embodiments of the invention as described herein.
In particular, EEG module 702 is configured to cause computing device 100-300 to acquire EEG sensor data from EEG sensors 101 contacting a skin of user 110 and detect a characteristic EEG data pattern in the acquired EEG data. Motion module 703 is configured to acquire motion sensor data from motion sensor 102 attached to the head of user 110 and detect a characteristic movement of the head. Control module 704 is configured to initiate a control operation which is associated with the performed head gesture. The control operation may, e.g., control an operation of computing device 100-300, or that of a controlled device which is separate from computing device 100-300. In the latter case, control module 704 is configured to initiate the control operation by transmitting, via communications module 104, a control signal pertaining to the control operation to the controlled device.
Optionally, EEG module 702 and motion module 703 may be configured to acquire the EEG sensor data and the motion sensor data by receiving, via communications module 104, the EEG sensor data and the motion sensor data from EEG sensors 101 and motion sensor 102, respectively.
Optionally, EEG module 702 may further be configured to derive the characteristic EEG data pattern during a learning or calibration phase.
Optionally, motion module 703 may be configured to acquire the motion sensor data in response to detecting the characteristic EEG pattern.
Optional notification module 705 may be configured to notify user 110 in response to detecting the characteristic EEG pattern.
Interfaces 601 and 701, and modules 702-705, as well as any additional modules comprised in processing means 700, may be implemented by any kind of electronic circuitry, e.g., any one, or a combination of, analogue electronic circuitry, digital electronic circuitry, and processing means executing a suitable computer program, i.e., software.
In the following, embodiments 800 of the method of initiating a control operation in response to a head gesture performed by a user of a computing device are described with reference to Fig. 8. Method 800 is performed by a computing device such as a mobile phone, a smartphone, a tablet, a personal computer, a laptop, a smartwatch, a wearable, a digital camera, a television, or a vehicle.
Method 800 comprises acquiring 801 EEG sensor data from EEG sensors contacting a skin of the user and detecting 802 a characteristic EEG data pattern in the acquired EEG data. Method 800 further comprises acquiring 804 motion sensor data from at least one motion sensor attached to the head and detecting 805 a characteristic movement of the head.
Method 800 further comprises initiating 806 a control operation which is associated with the performed head gesture. The control operation may, e.g., control an operation of the computing device. Alternatively, the control operation may control an operation of a controlled device which is separate from the computing device. In the latter case, initiating 806 the control operation comprises transmitting, via a communications module comprised in the computing device, a control signal pertaining to the control operation to the controlled device. The control operation may, e.g., comprise any one of initiating a communication session, answering a request for establishing a communication session, terminating a communication session, starting execution of an application, stopping execution of an application, changing a setting, controlling rendering of media, or controlling a user-interface of the computing device or the controlled device.
The EEG sensor data and the motion sensor data may, e.g., be acquired 801/804 from EEG sensors and at least one motion sensor comprised in the computing device. Alternatively, the EEG sensor data and the motion sensor data may be acquired by receiving, via a communications module comprised in the computing device, the EEG sensor data and the motion sensor data from the EEG sensors and the at least one motion sensor, respectively.
Preferably, the motion sensor data is only acquired 804 if the characteristic EEG pattern is detected, as is illustrated in Fig. 8. However, the motion sensor data may alternatively be acquired concurrently with acquiring 801 the EEG sensor data and detecting 802 the characteristic EEG pattern.
Optionally, method 800 may further comprise notifying 803 the user that the characteristic EEG pattern has been detected.
Optionally, method 800 may further comprise deriving the characteristic EEG data pattern during a calibration phase.
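Pulling the steps of method 800 together, an end-to-end control loop could be sketched as follows; the sensor-source objects, the notify_user() helper, and the actions mapping are hypothetical, and the classifier and gesture-matching helpers are reused from the sketches above:

```python
def control_loop(eeg_source, motion_source, classifier, gestures, actions):
    """One pass of the method: actions maps each head-gesture name to a
    callable which initiates the associated control operation."""
    eeg_window = eeg_source.read()                      # acquiring (801)
    if control_mode_requested(classifier, eeg_window):  # detecting (802)
        notify_user()                                   # notifying (803)
        trace = motion_source.read()                    # acquiring (804)
        gesture = match_head_gesture(trace, gestures)   # detecting (805)
        if gesture is not None:
            actions[gesture]()                          # initiating (806)
```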
It will be appreciated that method 800 may comprise additional, or modified, steps in accordance with what is described throughout this disclosure. An embodiment of method 800 may be implemented as software, such as computer program 604, to be executed by a processing unit comprised in the computing device, whereby the computing device becomes operative to perform in accordance with embodiments of the invention described herein.
The person skilled in the art realizes that the invention by no means is limited to the embodiments described above. On the contrary, many modifications and variations are possible within the scope of the appended claims.

Claims

1. A computing device (100, 150; 200; 300) for initiating a control operation in response to a head gesture performed by a user (110) of the computing device, the computing device comprising processing means (103; 600; 700) operative to:
acquire Electroencephalogram, EEG, sensor data from EEG sensors (101) contacting a skin of the user,
detect a characteristic EEG data pattern in the acquired EEG data, which characteristic EEG data pattern is associated with a control mode,
acquire motion sensor data from at least one motion sensor (102) attached to the head,
detect a characteristic movement of the head, which characteristic movement is commensurate with the performed head gesture (112-114), and
initiate a control operation which is associated with the performed head gesture.
2. The computing device according to claim 1, wherein the control operation controls an operation of the computing device.
3. The computing device (150; 200; 300) according to claim 1, wherein the control operation controls an operation of a controlled device (100) which is separate from the computing device.
4. The computing device according to claim 3, further comprising a communications module (104), the processing means being operative to initiate the control operation by transmitting, via the communications module, a control signal pertaining to the control operation to the controlled device.
5. The computing device according to any one of claims 1 to 4, the processing means being further operative to notify the user in response to detecting the characteristic EEG pattern.
6. The computing device according to any one of claims 1 to 5, wherein the control operation comprises any one of: initiating a communication session, answering a request for establishing a communication session, terminating a communication session, starting execution of an application, stopping execution of an application, changing a setting, controlling rendering of media, and controlling a user-interface.
7. The computing device according to any one of claims 1 to 6, wherein the detected characteristic EEG pattern is commensurate with a specific mental gesture imagined by the user.
8. The computing device according to any one of claims 1 to 7, the processing means being further operative to derive the characteristic EEG data pattern during a calibration phase.
9. The computing device according to any one of claims 1 to 8, wherein the characteristic movement of the head comprises at least one of: a change in orientation (112, 113) of the head and a rotation (114) of the head.
10. The computing device according to any one of claims 1 to 9, the processing means being operative to acquire motion sensor data in response to detecting the characteristic EEG pattern.
11. The computing device according to any one of claims 1 to 10, further comprising:
EEG sensors (101) arranged for contacting a skin of the user, and at least one motion sensor (102) for measuring a movement of the head.
12. The computing device according to claim 11, being any one of: a Brain-Computer Interface headset (150), an in-ear EEG device (200), and an around-ear EEG device (300).
13. The computing device according to any one of claims 1 to 10, further comprising a communications module (104), the processing means being further operative to acquire the EEG sensor data and the motion sensor data by receiving, via the communications module, the EEG sensor data and the motion sensor data from the EEG sensors (101) and the at least one motion sensor (102), respectively.
14. The computing device according to claim 13, being any one of: a mobile phone (100), a smartphone (100), a tablet (100), a personal computer, a laptop, a smartwatch, a wearable, a digital camera, a television, and a vehicle.
15. A method (800) of initiating a control operation in response to a head gesture performed by a user (110) of a computing device, the method comprising:
acquiring (801) Electroencephalogram, EEG, sensor data from EEG sensors (101) contacting a skin of the user,
detecting (802) a characteristic EEG data pattern in the acquired EEG data, which characteristic EEG data pattern is associated with a control mode,
acquiring (804) motion sensor data from at least one motion sensor (102) attached to the head,
detecting (805) a characteristic movement of the head, which characteristic movement is commensurate with the performed head gesture, and
initiating a control operation which is associated with the performed head gesture.
16. The method according to claim 15, wherein the control operation controls an operation of the computing device.
17. The method according to claim 15, wherein the control operation controls an operation of a controlled device which is separate from the computing device.
18. The method according to claim 17, wherein the initiating the control operation comprises transmitting, via a communications module comprised in the computing device, a control signal pertaining to the control operation to the controlled device.
19. The method according to any one of claims 15 to 18, further comprising notifying (803) the user that the characteristic EEG pattern has been detected.
20. The method according to any one of claims 15 to 19, wherein the control operation comprises any one of: initiating a communication session, answering a request for establishing a communication session, terminating a communication session, starting execution of an application, stopping execution of an application, changing a setting, controlling rendering of media, and controlling a user-interface.
21. The method according to any one of claims 15 to 20, wherein the detected characteristic EEG pattern is commensurate with a specific mental gesture imagined by the user.
22. The method according to any one of claims 15 to 21, further comprising deriving the characteristic EEG data pattern during a calibration phase.
23. The method according to any one of claims 15 to 22, wherein the characteristic movement of the head comprises at least one of: a change in orientation of the head and a rotation of the head.
24. The method according to any one of claims 15 to 23, wherein the motion sensor data is acquired in response to detecting the characteristic EEG pattern.
25. The method according to any one of claims 15 to 24, wherein the acquiring the EEG sensor data and the motion sensor data comprises receiving, via a communications module comprised in the computing device, the EEG sensor data and the motion sensor data from the EEG sensors (101) and the at least one motion sensor (102), respectively.
26. A computer program (604) comprising computer-executable instructions for causing a device to perform the method according to any one of claims 15 to 25, when the computer-executable instructions are executed on a processing unit (602) comprised in the device.
27. A computer program product comprising a computer-readable storage medium (603), the computer-readable storage medium having the computer program (604) according to claim 26 embodied therein.
Non-Patent Citations

A. F. Jackson and D. J. Bolger, "The neurophysiological bases of EEG and EEG measurement: a review for the rest of us", Psychophysiology, vol. 51, pages 1061-1071, Wiley, 2014.

B. Mirkovic, M. G. Bleichner, M. De Vos, and S. Debener, "Target Speaker Detection with Concealed EEG Around the Ear", Frontiers in Neuroscience, vol. 10, article 349, 2016.

C. Vidaurre, C. Sannelli, K.-R. Müller, and B. Blankertz, "Machine-Learning-Based Coadaptive Calibration for Brain-Computer Interfaces", Neural Computation, vol. 23, pages 791-816, MIT Press Journals, 2011.

N. Merrill, M. T. Curran, J.-K. Yang, and J. Chuang, "Classifying Mental Gestures with In-Ear EEG", 2016 IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), pages 130-135, IEEE, 2016.

N. Carlini, P. Mishra, T. Vaidya, Y. Zhang, M. Sherr, C. Shields, D. Wagner, and W. Zhou, "Hidden Voice Commands", Proceedings of the 25th USENIX Security Symposium, pages 513-530, USENIX Association, 2016.

S. Young Rojahn, "Samsung Demos a Tablet Controlled by Your Brain", MIT Technology Review, Biomedicine, 19 April 2013.
