
WO2025204110A1 - Eyeglass-type device, information processing method, and program - Google Patents

Eyeglass-type device, information processing method, and program

Info

Publication number
WO2025204110A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
eyeglass-type device
contact portion
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2025/003067
Other languages
English (en)
Japanese (ja)
Inventor
亜旗 米田
未佳 砂川
隆雅 吉田
弘毅 高橋
邦博 今村
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Intellectual Property Management Co Ltd
Original Assignee
Panasonic Intellectual Property Management Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co Ltd filed Critical Panasonic Intellectual Property Management Co Ltd
Publication of WO2025204110A1
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113: Objective types, for determining or recording eye movement
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103: Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/02: Viewing or reading apparatus
    • G02C: SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 3/00: Special supporting arrangements for lens assemblies or monocles
    • G02C 5/00: Constructions of non-optical parts
    • G02C 5/14: Side-members
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • H: ELECTRICITY
    • H10: SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10K: ORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K 59/00: Integrated devices, or assemblies of multiple devices, comprising at least one organic light-emitting element covered by group H10K50/00
    • H10K 59/10: OLED displays

Definitions

  • This disclosure relates to an eyeglass-type device, an information processing method, and a program.
  • Patent Document 1 discloses electronic eyeglasses as background art.
  • The frames of the electronic eyeglasses are equipped with sensors that perform sensing operations.
  • The left and right temples of the electronic eyeglasses are connected to each other by a connection means that crosses the back of the user's head.
  • The electronic eyeglasses disclosed in Patent Document 1 therefore have a connection means that crosses the back of the user's head, which gives the user the feeling that their head is being squeezed, causing considerable mental strain.
  • The present disclosure aims to provide an eyeglass-type device, an information processing method, and a program that can reduce misalignment of the eyeglass-type device without using a connection means that crosses the back of the user's head.
  • An eyeglass-type device according to the present disclosure comprises a front section, temple sections connected to the front section, end pieces connected to the temple sections, a first contact section that contacts the bridge of the user's nose or the area between the eyebrows to support the front section, a second contact section that has a connection section connected to the end pieces and contacts the area under the user's chin to support the end pieces, and at least one of a first sensor that is disposed in the first contact section and acquires a first biosignal of the user from the bridge of the user's nose or the area between the eyebrows, and a second sensor that is disposed in the second contact section and acquires a second biosignal of the user from the area under the user's chin.
  • FIG. 1 is a front view showing a simplified configuration example of AR glasses according to an embodiment.
  • FIG. 1 is a side view showing a simplified configuration example of AR glasses according to an embodiment.
  • FIG. 2 is a simplified diagram illustrating a first configuration example of a connection portion.
  • FIG. 10 is a simplified diagram illustrating a second configuration example of the connection portion.
  • FIG. 1 is a diagram showing a simplified functional configuration of AR glasses.
  • FIG. 2 is a simplified diagram showing the functional configuration of a processing unit.
  • FIG. 10 is a flowchart showing a process executed by a processing unit.
  • FIG. 10 is a diagram showing a simplified functional configuration of AR glasses according to a first modified example.
  • FIG. 10 is a diagram showing a simplified functional configuration of a processing unit according to a first modified example.
  • FIG. 10 is a diagram showing a simplified functional configuration of a processing unit according to a first modified example.
  • FIG. 10 is a diagram showing an image of another user's face captured by an external camera.
  • FIG. 10 is a side view showing an example of how the mask is worn.
  • FIG. 10 is a diagram showing an example of displaying the faces of other users on the display unit.
  • FIG. 10 is a diagram showing a simplified functional configuration of a processing unit according to a second modified example.
  • FIG. 10 is a diagram showing a simplified configuration example of a connection portion according to a third modified example.
  • FIG. 10 is a simplified diagram showing a first configuration example of AR glasses according to a fourth modified example.
  • FIG. 10 is a simplified diagram showing a second configuration example of AR glasses according to a fourth modified example.
  • FIG. 10 is a simplified diagram showing a third configuration example of AR glasses according to a fourth modified example.
  • FIG. 10 is a simplified diagram showing a fourth configuration example of AR glasses according to a fourth modified example.
  • FIG. 10 is a simplified diagram showing a fifth configuration example of AR glasses according to a fourth modified example.
  • General eyeglass-type devices are supported only by the temples and nose pads, and therefore slip out of position easily.
  • The inventors discovered that by supporting the front section through contact with the bridge of the nose or the area between the eyebrows, and by supporting the temple sections through contact with the area under the chin, misalignment of the eyeglass-type device can be reduced without using a connection means that crosses the back of the head; this insight led to the present disclosure.
  • With this configuration, misalignment of the eyeglass-type device can be reduced without using a connection means that crosses the back of the user's head.
  • In addition, the user's intention to operate an operation target can be estimated with high accuracy from the acquired biosignals. An illustrative sketch of such intent estimation follows.
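  • The disclosure does not specify an estimation algorithm. As a purely illustrative Python sketch, a sustained jaw clench picked up through the under-chin electrode could be mapped to an operation intent; the function name, threshold, and duration below are invented placeholders, not part of the disclosure.

```python
import numpy as np

def detect_operation_intent(signal: np.ndarray, fs: float,
                            threshold: float = 0.5,
                            min_duration_s: float = 0.2) -> bool:
    """Illustrative only: report an 'operate' intent when the rectified
    under-chin biosignal stays above `threshold` for at least
    `min_duration_s` seconds. `fs` is the sampling rate in Hz; both
    tuning values are assumptions."""
    active = np.abs(signal) > threshold          # rectify, then threshold
    min_samples = int(min_duration_s * fs)
    run = 0
    for is_active in active:                     # track runs of activity
        run = run + 1 if is_active else 0
        if run >= min_samples:                   # sustained: treat as intent
            return True
    return False
```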
  • Preferably, the connection portion biases the end piece in a direction that pushes up the temple portion, using the end piece as a fulcrum.
  • The first contact portion preferably has an inter-brow pad that contacts the area between the user's eyebrows from below.
  • With these configurations, the effect of reducing misalignment of the eyeglass-type device can be improved.
  • The transmission unit 57 transmits movement information (data D7), which indicates the movement of the user's mouth, tongue, or throat estimated by the movement estimation unit 54, to the other AR glasses 1A.
  • The other AR glasses 1A are, for example, AR glasses worn by another user who is in face-to-face conversation with the user wearing the AR glasses 1.
  • The receiving unit 58 receives other movement information (data D7A), which indicates the movement of the mouth, tongue, or throat of the other user wearing the other AR glasses 1A, from another transmission unit 57A provided in the other AR glasses 1A.
  • The receiving unit 58 inputs the data D7A to the image creation unit 59. A hedged sketch of one possible exchange of this movement information follows.
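  • The transport between the transmission unit 57 and the receiving unit 58 is left unspecified. A minimal sketch follows, assuming JSON serialization over UDP; the payload fields, class name, and port handling are invented for illustration and do not appear in the disclosure.

```python
import json
import socket
from dataclasses import dataclass, asdict

@dataclass
class MovementInfo:
    """Hypothetical payload for data D7 / D7A: estimated movement of the
    mouth, tongue, or throat. All field names are assumptions."""
    timestamp_ms: int
    mouth_open: float    # 0.0 (closed) .. 1.0 (fully open)
    tongue_pos: float    # normalized front/back tongue position
    swallowing: bool

def send_movement_info(info: MovementInfo, peer: tuple[str, int]) -> None:
    # Role of transmission unit 57: serialize and send to the other AR glasses 1A.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(json.dumps(asdict(info)).encode("utf-8"), peer)

def receive_movement_info(port: int) -> MovementInfo:
    # Role of receiving unit 58: block until one datagram (data D7A) arrives.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", port))
        payload, _addr = sock.recvfrom(4096)
        return MovementInfo(**json.loads(payload))
```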
  • The image creation unit 59 creates image data D11 of a facial expression image of the other user based on image data D10 acquired from the external camera 37 and the data D7A input from the receiving unit 58.
  • The image data D10 includes an image of the other user's face captured by the external camera 37.
  • When the image creation unit 59 detects, based on the image data D10, that the other user is wearing a mask 60, it creates an image of the facial expression around the other user's mouth based on the other movement information (data D7A) received by the receiving unit 58.
  • The image creation unit 59 inputs the image data D11 of the created facial expression image to the display control unit 52.
  • FIG. 10 shows an image of the other user's face captured by the external camera 37.
  • The other user is wearing a mask 60, and the other user's nose and mouth are hidden by the mask 60.
  • The display control unit 52 superimposes the image of the facial expression around the other user's mouth, created by the image creation unit 59 (image data D11), on the display surface of the display unit 31, aligned with the position of the other user's face. A minimal compositing sketch follows.
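  • How the superimposition is implemented is not described. As a sketch, assuming the camera frame and the generated mouth-region patch are NumPy arrays and that the paste position comes from a separate face/mask detector (outside this sketch), simple alpha blending could place the image data D11 over the mask region.

```python
import numpy as np

def overlay_mouth_image(frame: np.ndarray, mouth_img: np.ndarray,
                        top_left: tuple[int, int],
                        alpha: float = 1.0) -> np.ndarray:
    """Illustrative compositing for the display control unit 52: blend the
    generated mouth-region image (image data D11) onto the frame at the
    detected position of the other user's mask."""
    y, x = top_left
    h = min(mouth_img.shape[0], frame.shape[0] - y)   # clip to frame bounds
    w = min(mouth_img.shape[1], frame.shape[1] - x)
    if h <= 0 or w <= 0:
        return frame                                   # patch falls outside
    roi = frame[y:y + h, x:x + w].astype(np.float32)
    patch = mouth_img[:h, :w].astype(np.float32)
    frame[y:y + h, x:x + w] = (alpha * patch
                               + (1.0 - alpha) * roi).astype(frame.dtype)
    return frame
```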
  • The AR glasses 1 may further include an image projection unit, which may use the mask 60 worn by the other user as a screen and project the image of the other user's facial expression (image data D11) onto that screen.
  • The material of the mask 60 may be a retroreflective material.
  • The movement information (data D7) indicating the movement of the user's mouth, tongue, or throat transmitted by the transmission unit 57 can be used by the other AR glasses 1A or by any other information processing device.
  • The above-mentioned information processing device may also be a management device that manages the user's health.
  • The management device receives images (image data D10) captured by the external camera 37 from the AR glasses 1 and, based on the captured images, determines whether the user is currently eating and what they are chewing. Based on the data D7 received from the AR glasses 1, the management device measures the user's jaw movement while chewing, the number of times they swallow, or the timing of swallowing. The data D7 may also include measurements from a sensor that can measure the degree to which the user's jaw is opened and closed. Because users who chew less frequently are at higher health risk, the management device issues a warning to such users and encourages them to improve their chewing behavior.
  • Health insurance premiums may be increased or decreased in real time depending on the number of chews. Furthermore, users whose chewing behavior is abnormal, for example asymmetric, may have dental or oral diseases.
  • The management device issues a warning to such users and encourages them to visit a hospital. Furthermore, in a care facility or the like, if the management device detects aspiration by a user based on the data D7, it may output an alarm or send an emergency call to the staff in charge. A sketch of counting chews from the jaw signal follows.
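  • The disclosure does not state how the management device counts chews. One conventional approach, sketched here under assumptions, is peak detection on the jaw-opening signal carried in data D7; the spacing and prominence values are invented.

```python
import numpy as np
from scipy.signal import find_peaks

def count_chews(jaw_opening: np.ndarray, fs: float) -> int:
    """Illustrative chew counter: each peak of the jaw-opening signal is
    treated as one chew. A minimum peak spacing of 0.3 s caps the rate at
    roughly 3 chews per second, and a prominence filter rejects small
    sensor fluctuations; both values are placeholders."""
    peaks, _props = find_peaks(
        jaw_opening,
        distance=max(1, int(0.3 * fs)),  # at most one chew per 0.3 s
        prominence=0.1,                  # ignore minor wiggles
    )
    return len(peaks)
```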
  • The information processing device may also be an information terminal carried by a language learning instructor, such as an English conversation instructor.
  • The instructor may be a real instructor who is face to face with the user, a remote instructor who can communicate with the user, or a virtual instructor.
  • A user wearing the AR glasses 1 can hold a conversation with the instructor displayed on the display unit 31.
  • Data D7 indicating the user's jaw or tongue movements is sent from the AR glasses 1 to the information terminal carried by the instructor.
  • The instructor determines whether the user's jaw or tongue movements are correct based on the received data D7 and provides feedback on the correct pronunciation method to the AR glasses 1 worn by the user. This makes it possible to provide instruction based on jaw and tongue movements in addition to instruction based on how the pronunciation sounds, which is expected to promote language learning. One possible movement-comparison sketch follows.
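  • How movement correctness would be scored is not specified. One common option, assumed here purely for illustration, is to compare the learner's jaw or tongue trajectory from data D7 against a reference trajectory for the target phoneme using dynamic time warping; a lower distance would indicate a closer match.

```python
def dtw_distance(a: list[float], b: list[float]) -> float:
    """Illustrative scoring: dynamic time warping distance between the
    learner's trajectory and a reference trajectory. The use of DTW is an
    assumption, not something the disclosure prescribes."""
    n, m = len(a), len(b)
    INF = float("inf")
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # shrink reference
                                 d[i][j - 1],      # stretch reference
                                 d[i - 1][j - 1])  # step both
    return d[n][m]
```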
  • (Second Modification) FIG. 13 is a simplified diagram showing the functional configuration of the processing unit 32 according to the second modified example.
  • The processing unit 32 further includes a mis-wearing detection unit 71 in addition to the configuration described above.
  • The mis-wearing detection unit 71 detects that the second contact portion 22 has been worn incorrectly by the user, for example, when the detection value of the electrode 222 is abnormal.
  • An example of mis-wearing is when the second contact portion 22 is worn turned toward the back of the user's head.
  • When the mis-wearing detection unit 71 detects that the second contact portion 22 has been worn incorrectly, it inputs mis-wearing detection information D20 to the display control unit 52.
  • When the display control unit 52 receives the detection information D20 from the mis-wearing detection unit 71, it displays notification information, such as an image or a text message, on the display unit 31 to notify the user of the mis-wearing.
  • The manner in which the mis-wearing is notified is not limited to a display, and may also be the output of a warning sound, a voice message, or the like.
  • This modified example makes it possible to prevent malfunction of the AR glasses 1, or a decrease in estimation accuracy in gaze-based control, due to incorrect attachment of the second contact portion 22. A minimal abnormal-value check is sketched below.
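  • What counts as an abnormal value is left open. A minimal sketch, assuming that readings outside a plausible physiological range (for example, an open-circuit rail value when the pad faces the back of the head) indicate mis-wearing; the range below is an invented placeholder.

```python
def is_misworn(electrode_values: list[float],
               valid_range: tuple[float, float] = (-2.0, 2.0)) -> bool:
    """Illustrative check for the mis-wearing detection unit 71: any
    reading of electrode 222 outside `valid_range` is treated as abnormal,
    which would trigger the mis-wearing detection information D20."""
    lo, hi = valid_range
    return any(not (lo <= v <= hi) for v in electrode_values)
```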
  • In addition, a message such as "Please attach under the chin" may be written on the surface of the subchin pad 221.
  • The connection portion 224 may be configured to be stretchable or to have a controllable tensile strength.
  • For example, ease of wearing may be ensured by weakening the tensile strength before wearing, and the degree of adhesion of the subchin pad 221 may be increased by strengthening the tensile strength after sensing that the pad has been worn.
  • A shape memory alloy that reacts to the user's body temperature may be used to variably control the tensile strength, or an actuator such as a motor may be used. A toy control-flow sketch for the actuator case follows.
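  • A toy sketch of the actuator variant, with an invented two-state controller; the tension levels and the hardware callback are placeholders, and a shape-memory-alloy implementation would need no software control at all.

```python
from enum import Enum, auto
from typing import Callable

class TensionState(Enum):
    RELAXED = auto()    # weak tension: easy to put on
    TIGHTENED = auto()  # strong tension: subchin pad 221 held in close contact

class ConnectionTensionController:
    """Illustrative controller for the connection portion 224, assuming a
    motor-like actuator driven through `set_tension` (a stand-in for a
    hardware driver that is not part of this sketch)."""

    def __init__(self, set_tension: Callable[[float], None]) -> None:
        self._set_tension = set_tension    # accepts a level in 0.0 .. 1.0
        self.state = TensionState.RELAXED
        self._set_tension(0.2)             # weak before wearing

    def on_wear_detected(self) -> None:
        # Skin contact sensed on the subchin pad: strengthen the pull.
        self._set_tension(0.8)
        self.state = TensionState.TIGHTENED

    def on_removal_detected(self) -> None:
        self._set_tension(0.2)
        self.state = TensionState.RELAXED
```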
  • For example, when the detection values of the electrodes 212, 222 change significantly due to misalignment of the AR glasses 1, notification information may be output to prompt the user to reattach the AR glasses 1.
  • Detection information on the user's eye position obtained by the internal camera 36 may also be used for this purpose. A simple drift check is sketched below.
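  • A simple sketch of such a drift check, assuming a per-channel baseline captured right after the glasses are put on; the 50% relative-change trigger is an invented value.

```python
import numpy as np

def needs_reattachment(baseline: np.ndarray, recent: np.ndarray,
                       rel_change: float = 0.5) -> bool:
    """Illustrative misalignment check: compare the recent mean level of
    each electrode channel (electrodes 212, 222) against the baseline.
    Arrays are shaped (channels, samples); a relative change above
    `rel_change` on any channel would trigger the reattachment prompt."""
    base = np.abs(baseline.mean(axis=-1)) + 1e-9   # avoid division by zero
    now = np.abs(recent.mean(axis=-1))
    return bool(np.any(np.abs(now - base) / base > rel_change))
```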
  • (Third Modification) FIG. 14 is a simplified diagram showing an example configuration of the connection portion 224 according to the third modified example.
  • In the third modified example, the first contact portion 21 has a nasal root pad 213 and an electrode 214.
  • The nasal root pad 213 is a pad that contacts the skin surface of the user's nasal root from above, following the slope of the nasal root.
  • The nasal root pad 213 is connected to the bridge 112 or the rim 111 via a pad arm (not shown).
  • The electrode 214 is disposed on the nasal root pad 213. Multiple electrodes 214 may be disposed on one nasal root pad 213.
  • The electrode 214 is included in the first sensor 33.
  • The first sensor 33 acquires the first biological signal D5 from the user's nasal root via the electrode 214.
  • The connection portion 224 has a biasing member 224C that uses a spiral spring or the like.
  • The biasing member 224C biases the end piece 13 in a direction that rotates the lower end of the end piece 13 forward relative to the second contact portion 22 (the direction indicated by arrow Y4).
  • As a result, the connection portion 224 biases the end piece 13 in a direction that presses down on the temple portion 12 (the direction indicated by arrow Y5) with the end piece 13 as a fulcrum, bringing the nasal root pad 213 into close contact with the bridge of the user's nose.
  • In this way, the first contact portion 21 contacts the bridge of the user's nose to support the front portion 11, and the second contact portion 22 contacts the area under the user's chin to support the end piece 13. This reduces misalignment of the AR glasses 1 without using a connection means that crosses the back of the user's head.
  • Moreover, the connection portion 224 biases the end piece 13 in a direction that presses down the temple portion 12, with the end piece 13 as a fulcrum, so that the nasal root pad 213 adheres closely to the bridge of the user's nose, thereby improving the effect of reducing misalignment of the AR glasses 1.
  • The biasing member 224C shown in FIG. 14 can appropriately apply a biasing force in a direction that presses down the temple portion 12 with the end piece 13 as a fulcrum. A back-of-the-envelope force estimate follows.
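  • As a rough worked example, not taken from the disclosure: treating the end piece 13 as the fulcrum of a lever, a spring torque tau acting over the temple length L presses the pad with force F = tau / L. The numbers below are invented for illustration.

```python
def pad_contact_force(spring_torque_nm: float, lever_arm_m: float) -> float:
    """Lever statics: the force at the end of a lever of length L driven
    by torque tau about the fulcrum is F = tau / L."""
    return spring_torque_nm / lever_arm_m

# Example: a 0.02 N*m spiral spring acting over a 0.10 m temple lever
# yields about 0.2 N pressing the nasal root pad 213 against the nose.
force_n = pad_contact_force(0.02, 0.10)
```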

Landscapes

  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Ophthalmology & Optometry (AREA)
  • General Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Optics & Photonics (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Pathology (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Eyeglasses (AREA)

Abstract

This eyeglass-type device comprises: a front section; a temple section connected to the front section; an end piece section connected to the temple section; a first contact section that contacts the nasal root or glabella of a user and supports the front section; a second contact section that has a connection section connected to the end piece section, contacts the user's chin, and supports the end piece section; and at least one of a first sensor and a second sensor, the first sensor being disposed in the first contact section and acquiring a first biological signal of the user from the nasal root or glabella of the user, and the second sensor being disposed in the second contact section and acquiring a second biological signal of the user from the chin of the user.
PCT/JP2025/003067 2024-03-29 2025-01-30 Eyeglass-type device, information processing method, and program Pending WO2025204110A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2024-056548 2024-03-29

Publications (1)

Publication Number Publication Date
WO2025204110A1 (fr) 2025-10-02

Family

ID=97215436

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2025/003067 Pending WO2025204110A1 (fr) 2024-03-29 2025-01-30 Eyeglass-type device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2025204110A1 (fr)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997033190A1 (fr) * 1996-03-07 1997-09-12 Newline Surf Pty. Ltd. Sports sunglasses
JP2002156612A (ja) * 2000-11-17 2002-05-31 Poochie Pompreece:Kk Eyeglasses for pets
CN203786386U (zh) * 2014-04-17 2014-08-20 华北石油管理局总医院 Medical magnifying glass
WO2016194849A1 (fr) * 2015-06-01 2016-12-08 アルプス電気株式会社 Eyeglass-type electronic device
JP2017206067A (ja) * 2016-05-16 2017-11-24 株式会社東芝 Uniform cap with electrooculography detection electrodes, headwear with electrooculography detection electrodes, and alert method using electrooculography detection
US20170367423A1 (en) * 2016-06-23 2017-12-28 Six Flags Theme Parks, Inc. Headband for virtual reality goggles
JP2019512713A (ja) * 2016-01-05 2019-05-16 サフィーロ・ソシエタ・アツィオナリア・ファブリカ・イタリアナ・ラボラツィオーネ・オッチアリ・エス・ピー・エー Eyeglasses with a biosignal sensor
JP2020502589A (ja) * 2016-12-13 2020-01-23 サフィーロ・ソシエタ・アツィオナリア・ファブリカ・イタリアナ・ラボラツィオーネ・オッチアリ・エス・ピー・エー Eyeglasses equipped with a biosensor
JP2020036027A (ja) * 2015-12-25 2020-03-05 三井化学株式会社 Piezoelectric substrate, piezoelectric woven fabric, piezoelectric knitted fabric, piezoelectric device, force sensor, actuator, and biological information acquisition device
CN213338207U (zh) * 2020-11-23 2021-06-01 梁少菊 Local magnifying glass for gastroenterology


Similar Documents

Publication Publication Date Title
CA3095287C Augmented reality systems for emergency biomedical applications
EP3064130A1 Brain activity measurement and feedback system
US10172552B2 (en) Method for determining and analyzing movement patterns during dental treatment
Kwon et al. Emotion recognition using a glasses-type wearable device via multi-channel facial responses
Bulling et al. What's in the Eyes for Context-Awareness?
CN105578954A Physiological parameter measurement and feedback system
KR20190005219A Augmented reality systems and methods for user health analysis
JP2022548473A Systems and methods for patient monitoring
WO2019071166A1 Multidisciplinary clinical assessment in virtual or augmented reality
JP7320261B2 Information processing system, method, and program
Gao et al. Wearable technology for signal acquisition and interactive feedback in autism spectrum disorder intervention: A review
CN111933277A Method, apparatus, device, and storage medium for detecting 3D motion sickness
Gjoreski et al. OCOsense glasses–monitoring facial gestures and expressions for augmented human-computer interaction: OCOsense glasses for monitoring facial gestures and expressions
JP4730621B2 Input device
WO2025204110A1 Eyeglass-type device, information processing method, and program
US20220240802A1 (en) In-ear device for blood pressure monitoring
CN113995416A Apparatus and method for displaying a user interface in glasses
Gemicioglu et al. TongueTap: Multimodal Tongue Gesture Recognition with Head-Worn Devices
WO2023129390A1 Monitoring cardiac activity using an in-ear device
WO2022237954A1 Human-wearable eye-tracking module
Matthies et al. Wearable Sensing of Facial Expressions and Head Gestures
Peña-Cortés et al. Warning and rehabilitation system using brain computer interface (BCI) in cases of bruxism
KR102877504B1 BCI-based neuro-robotics system
CN222804286U Wearable monitoring mask for neurocritical care patients
US20240103285A1 (en) Integrated health sensors

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application
Ref document number: 25778578
Country of ref document: EP
Kind code of ref document: A1