
CN117958819A - Attention monitoring method, device, equipment and medium - Google Patents

Attention monitoring method, device, equipment and medium

Info

Publication number
CN117958819A
Authority
CN
China
Prior art keywords
wearer
head
attention
eye movement
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410216781.1A
Other languages
Chinese (zh)
Inventor
骆俊谕
林大鹏
刘瑞洋
梁波
张超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Techology Co Ltd
Original Assignee
Goertek Techology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Techology Co Ltd
Priority claimed from CN202410216781.1A
Publication of CN117958819A
Legal status: Pending


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/163 Evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A61B 5/168 Evaluating attention deficit, hyperactivity
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Sensors specially adapted to be attached to or worn on the body surface
    • A61B 5/6802 Sensor mounted on worn items
    • A61B 5/6803 Head-worn items, e.g. helmets, masks, headphones or goggles

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Developmental Disabilities (AREA)
  • Child & Adolescent Psychology (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The application discloses an attention monitoring method, device, equipment, and medium, and relates to the technical field of head-mounted devices. The method is applied to a head-mounted device and comprises: acquiring a head pose of a wearer of the head-mounted device; acquiring an eye movement characteristic value of the wearer; and determining the attention state of the wearer from the head pose and the eye movement characteristic value. Because the head pose and the eye movement characteristic value are clearly correlated with the wearer's attention state, the attention state can be determined from these two quantities. That is, the method provides a way to monitor the attention of the wearer.

Description

Attention monitoring method, device, equipment and medium
Technical Field
The present application relates to the field of head-mounted technology, and more particularly, to an attention monitoring method, an attention monitoring device, a head-mounted apparatus, and a computer-readable storage medium.
Background
People often experience lapses of attention while working or studying.
To improve working or learning efficiency, how to monitor a person's attention is one of the technical problems to be solved.
Disclosure of Invention
It is an object of the present application to provide a new solution for monitoring attention.
According to a first aspect of the present application, there is provided an attention monitoring method applied to a head-mounted device, comprising:
acquiring a head pose of a wearer of the head-mounted device;
acquiring an eye movement characteristic value of the wearer;
and determining an attention state of the wearer from the head pose and the eye movement characteristic value.
Optionally, the determining the attention state of the wearer according to the head pose and the eye movement characteristic value includes:
determining whether the wearer is in a head rest state from the head pose and the eye movement characteristic value;
determining that the attention state of the wearer is inattentive when the wearer is in a head rest state and the duration of the head rest state is longer than a preset duration;
and determining that the attention state of the wearer is focused when the wearer is in a non-head rest state, or when the wearer is in a head rest state and the duration of the head rest state is less than or equal to the preset duration.
Optionally, the determining whether the wearer is in a head rest state according to the head pose and the eye movement characteristic value includes:
determining that the wearer is in a head rest state when the variation of the head pose is less than or equal to a first preset threshold and the variation of the eye movement characteristic value is less than or equal to a second preset threshold;
and determining that the wearer is in a non-head rest state when the variation of the head pose is greater than the first preset threshold and/or the variation of the eye movement characteristic value is greater than the second preset threshold.
Optionally, before the determining of the attention state of the wearer from the head pose and the eye movement characteristic value, the method further comprises:
acquiring a usage scenario of the head-mounted device;
determining, according to the usage scenario, the head standard pose corresponding to a focused attention state;
and the determining of the attention state of the wearer from the head pose and the eye movement characteristic value comprises:
determining that the attention state of the wearer is inattentive when it is determined from the eye movement characteristic value that the wearer is in a head rest state and the head pose does not match the head standard pose;
and determining that the attention state of the wearer is focused when it is determined from the eye movement characteristic value that the wearer is in a non-head rest state and/or the head pose matches the head standard pose.
Optionally, the method further comprises:
outputting reminder information when the attention state of the wearer is inattentive.
Optionally, the eye movement characteristic value further includes a blink count, and the method further comprises:
stopping outputting the reminder information when the blink count reaches a preset count while the reminder information is being output.
According to a second aspect of the present application, there is provided an attention monitor apparatus applied to a head-mounted device, comprising:
a first acquisition module for acquiring a head pose of a wearer of the head-mounted device;
a second acquisition module for acquiring an eye movement characteristic value of the wearer;
and a determining module for determining an attention state of the wearer from the head pose and the eye movement characteristic value.
Optionally, the determining module is specifically configured to:
determining whether the wearer is in a head rest state from the head pose and the eye movement characteristic value;
determining that the attention state of the wearer is inattentive when the wearer is in a head rest state and the duration of the head rest state is longer than a preset duration;
and determining that the attention state of the wearer is focused when the wearer is in a non-head rest state, or when the wearer is in a head rest state and the duration of the head rest state is less than or equal to the preset duration.
According to a third aspect of the present application there is provided a headset comprising the attention monitor device of any of the second aspects;
or the head-mounted device comprises a memory for storing computer instructions and a processor for invoking the computer instructions from the memory to perform the method of attention monitoring as in any of the first aspects.
According to a fourth aspect of the present application, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of attention monitoring according to any of the first aspects.
The present application provides an attention monitoring method applied to a head-mounted device, comprising: acquiring a head pose of a wearer of the head-mounted device; acquiring an eye movement characteristic value of the wearer; and determining the attention state of the wearer from the head pose and the eye movement characteristic value. Because the head pose and the eye movement characteristic value are clearly correlated with the wearer's attention state, the attention state can be determined from these two quantities. That is, the present application provides a way to monitor the attention of a wearer.
Other features of the present application and its advantages will become apparent from the following detailed description of exemplary embodiments of the application, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description, serve to explain the principles of the application.
Fig. 1 is a block diagram of a hardware configuration of a head-mounted device implementing an attention monitoring method according to the present application;
FIG. 2 is a flowchart of the attention monitoring method according to the present application;
fig. 3 is a schematic structural diagram of an attention monitor device according to the present application.
Detailed Description
Various exemplary embodiments of the present application will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present application unless it is specifically stated otherwise.
The following description of at least one exemplary embodiment is merely exemplary in nature and is in no way intended to limit the application, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of exemplary embodiments may have different values.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
Fig. 1 is a block diagram of a hardware configuration of a head-mounted device implementing an attention monitoring method according to the present application. The headset 1000 may be, for example, AR/MR/XR glasses, an AR/MR/XR helmet, or the like.
The headset 1000 may include a processor 1100, a memory 1200, an interface device 1300, a communication device 1400, a display device 1500, an input device 1600, a speaker 1700, a microphone 1800, and a sensing device 1900, among others. The processor 1100 may be a central processing unit CPU, a microprocessor MCU, or the like. The memory 1200 includes, for example, ROM (read only memory), RAM (random access memory), nonvolatile memory such as a hard disk, and the like. The interface device 1300 includes, for example, a USB interface, a headphone interface, and the like. The communication device 1400 can perform wired or wireless communication, for example. The display device 1500 is, for example, a liquid crystal display, a touch display, or the like. The input device 1600 may include, for example, a touch screen or the like. The wearer may input/output voice information through the speaker 1700 and microphone 1800. The sensing device 1900 includes, for example, an image sensor, illustratively an infrared eyeball camera and/or other optical camera, for capturing an eye image of the wearer, and an attitude sensor, illustratively an IMU sensor, for capturing a head attitude of the wearer.
Although a plurality of devices are shown for the headset 1000 in fig. 1, the present application may relate to only some of the devices, for example, the headset 1000 relates to only the memory 1200, the processor 1100, and the sensing device 1900.
In an embodiment of the present application, the memory 1200 of the headset 1000 is used to store instructions for controlling the processor 1100 to perform the attention monitoring method provided by the embodiment of the present application.
In the above description, the skilled person may design instructions according to the disclosed solution. How the instructions control the processor to operate is well known in the art and will not be described in detail here.
Fig. 2 is a flowchart of a method for implementing attention monitoring according to the present application, and the method is applied to a head-mounted device. A typical application scenario of the method is monitoring the attention of students in class.
As shown in fig. 2, the method includes the following steps S210 to S230.
Step S210, a head posture of a wearer of the head-mounted device is acquired.
In one embodiment of the application, an IMU is provided on the head-mounted device, and the three-axis acceleration and three-axis angular velocity output by the IMU describe the head pose of the wearer.
In another embodiment of the application, a camera communicatively coupled to the headset may also be provided in the external environment. The camera captures an image of the wearer and transmits the image to the headset based on a communication connection with the headset. The headset identifies the head pose of the wearer based on the image.
The present embodiment does not limit how the head pose of the wearer of the head-mounted device is obtained.
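As an illustration of how IMU samples might represent the head pose described above, the following sketch is hypothetical; the field names, units, and layout are assumptions for illustration, not details from the application:

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    """One IMU sample: three-axis acceleration and three-axis angular velocity."""
    ax: float  # acceleration on the X axis
    ay: float  # acceleration on the Y axis
    az: float  # acceleration on the Z axis
    wx: float  # angular velocity about the X axis
    wy: float  # angular velocity about the Y axis
    wz: float  # angular velocity about the Z axis

    def delta(self, previous: "HeadPose") -> tuple:
        """Per-axis absolute change relative to the previous sample."""
        cur = (self.ax, self.ay, self.az, self.wx, self.wy, self.wz)
        prev = (previous.ax, previous.ay, previous.az,
                previous.wx, previous.wy, previous.wz)
        return tuple(abs(c - p) for c, p in zip(cur, prev))
```

A per-axis delta of this kind is what the later steps compare against the first preset threshold.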
Step S220, acquiring an eye movement characteristic value of the wearer.
In the present embodiment, the eye movement characteristic value may be, for example, a pupil position or a gaze point position.
In one embodiment of the application, the eye movement characteristic value of the wearer can be determined through an eye image of the wearer and an eye movement tracking algorithm.
Step S230, according to the head gesture and the eye movement characteristic value, the attention state of the wearer is determined.
In this embodiment, the attentiveness state of the wearer includes: focusing and inattention. The head pose and eye movement characteristic values have obvious correlation with the attention state of the wearer, so the attention state of the wearer can be judged based on the head pose and eye movement characteristic values.
In this embodiment, the above step S230 may be implemented in two ways.
In the first way, the above step S230 may be specifically implemented through the following steps S231 to S233.
Step S231, determining whether the wearer is in a head rest state according to the head posture and the eye movement characteristic value.
In one embodiment of the application, it may be determined whether the wearer is in a head rest state based on the amounts of change in the head pose and the eye movement characteristic value. If both amounts of change are small, the wearer can be determined to be in a head rest state; otherwise, the wearer is determined to be in a non-head rest state. Based on this, step S231 may be implemented through the following steps S2311 and S2312.
In step S2311, it is determined that the wearer is in a head rest state when the variation of the head posture is less than or equal to the first preset threshold and the variation of the eye movement characteristic value is less than or equal to the second preset threshold.
In step S2312, it is determined that the wearer is in a non-head rest state in the case where the variation amount of the head posture is greater than the first preset threshold and/or the variation amount of the eye movement characteristic value is greater than the second preset threshold.
In this embodiment, the first preset threshold is the maximum amount of change in the head pose of the wearer in the head resting state. The second preset threshold is the maximum variation of the eye movement characteristic value when the wearer is in the state of head rest.
Taking a head pose described by three-axis acceleration values and three-axis angular velocity values as an example, the first preset threshold includes: ΔX, ΔY, ΔZ, Δω_X, Δω_Y, and Δω_Z. Here, ΔX, ΔY, and ΔZ are the maximum acceleration changes of the head on the X, Y, and Z axes, respectively, when the wearer is in a head rest state, and Δω_X, Δω_Y, and Δω_Z are the corresponding maximum angular velocity changes about those axes. That is, the first preset threshold is a multi-dimensional array.
In this embodiment, the variation of the head pose may specifically be the change of the head pose at the current acquisition time compared with the head pose at the previous acquisition time. Similarly, the variation of the eye movement characteristic value may be the change of the eye movement characteristic value (for example, the gaze point position) at the current acquisition time compared with that at the previous acquisition time.
Based on the above, a variation of the head pose less than or equal to the first preset threshold suggests that the wearer's head is at rest; likewise, a variation of the eye movement characteristic value less than or equal to the second preset threshold suggests the same. In this embodiment, the wearer is determined to be in a head rest state only when both conditions hold. Correspondingly, when the variation of the head pose is greater than the first preset threshold and/or the variation of the eye movement characteristic value is greater than the second preset threshold, the wearer is determined to be in a non-head rest state. That is, judging across the two dimensions of head pose and eye movement makes the head rest determination more accurate.
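The two-condition test of steps S2311 and S2312 can be sketched as follows; the function and parameter names are illustrative assumptions:

```python
def is_head_rest(pose_deltas, eye_delta, pose_thresholds, eye_threshold):
    """Steps S2311/S2312: the wearer is in a head rest state only when every
    per-axis head pose change is within its threshold (the multi-dimensional
    first preset threshold) AND the eye movement change is within the second
    preset threshold; any exceedance means a non-head rest state."""
    pose_still = all(d <= t for d, t in zip(pose_deltas, pose_thresholds))
    eyes_still = eye_delta <= eye_threshold
    return pose_still and eyes_still
```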
Step S232, determining that the attention state of the wearer is inattentive when the wearer is in the head rest state and the head rest state continues for a duration longer than the preset duration.
In step S233, in the case that the wearer is in the non-head rest state, or in the case that the wearer is in the head rest state and the duration of the head rest state is less than or equal to the preset duration, the attentiveness state of the wearer is determined to be focused.
In this embodiment, the preset duration is the longest time a wearer whose attention is focused typically remains in a head rest state. On this basis, the duration of the head rest state is monitored while the wearer is in that state, and when the duration exceeds the preset duration, the attention state of the wearer is determined to be inattentive.
Correspondingly, in the case that the wearer is in a non-head rest state or in the case that the wearer is in a head rest state and the duration of the head rest state is less than or equal to a preset duration, the attentiveness state of the wearer is determined to be attentiveness.
Steps S231 to S233 above provide a way of judging whether the wearer is focused by checking whether the wearer remains stationary for a long time.
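Putting steps S231 to S233 together, a minimal state machine might look like the following; the class name, time units, and state labels are assumptions for illustration:

```python
class AttentionMonitor:
    """First strategy (steps S231 to S233): the wearer is reported as
    inattentive only after the head rest state has persisted longer
    than the preset duration."""

    def __init__(self, max_rest_s: float):
        self.max_rest_s = max_rest_s  # preset duration, in seconds
        self._rest_since = None       # timestamp when the head rest state began

    def update(self, head_rest: bool, now: float) -> str:
        if not head_rest:
            self._rest_since = None   # leaving head rest resets the timer
            return "focused"
        if self._rest_since is None:
            self._rest_since = now
        elapsed = now - self._rest_since
        return "inattentive" if elapsed > self.max_rest_s else "focused"
```

Each call feeds in the latest head-rest judgment (e.g. from `is_head_rest`-style logic) together with a timestamp, and the monitor flips to inattentive only once the rest state outlasts the preset duration.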
In a second manner, the attention monitoring method provided by the present application further includes the following step S234 and step S235 before the step S230.
In step S234, a usage scenario of the headset is acquired.
In the present embodiment, examples of the usage scenario include a listening scene and a writing scene.
Step S235, according to the use scene, the head standard gesture corresponding to the attention state of focusing attention is determined.
It will be appreciated that in different usage scenarios, the head pose of a focused wearer is relatively fixed; that is, each usage scenario has a corresponding head standard pose.
In one example, when the usage scenario is a listening scene, the head standard pose is a head-up pose with the eyes looking forward; when the usage scenario is a writing scene, the head standard pose is a head-down pose.
In one embodiment of the present application, the mapping relationship between different usage scenarios and the corresponding head standard poses may be stored in the head-mounted device in advance. On this basis, step S235 may be implemented by looking up, in this mapping relationship, the head standard pose corresponding to the usage scenario acquired in step S234.
The mapping relation between different usage scenes and the corresponding head standard gestures can be determined empirically or experimentally.
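Such a pre-stored mapping could be as simple as a lookup table. The scenario names and pitch ranges below are illustrative placeholders, not values from the application:

```python
# Hypothetical mapping from usage scenario to the head standard pose
# expected when attention is focused (step S235). Pitch is the head's
# up/down angle in degrees; the ranges are illustrative only.
SCENE_TO_STANDARD_POSE = {
    "listening": {"pitch_deg_range": (-10.0, 20.0)},   # head up, eyes forward
    "writing":   {"pitch_deg_range": (-60.0, -20.0)},  # head lowered
}

def standard_pose_for(scene: str) -> dict:
    """Look up the focused-attention head standard pose for a scenario."""
    return SCENE_TO_STANDARD_POSE[scene]
```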
On the basis of steps S234 and S235, step S230 is specifically implemented through the following steps S236 and S237.
In step S236, when it is determined from the eye movement characteristic value that the wearer is in a head rest state and the head pose does not match the head standard pose, the attention state of the wearer is determined to be inattentive.
In step S237, when it is determined from the eye movement characteristic value that the wearer is in a non-head rest state and/or the head pose matches the head standard pose, the attention state of the wearer is determined to be focused.
In this embodiment, when the variation of the eye movement characteristic value is less than or equal to the second preset threshold, the wearer is determined to be in a head rest state; conversely, when the variation of the eye movement characteristic value is greater than the second preset threshold, the wearer is determined to be in a non-head rest state. The second preset threshold is the maximum variation of the eye movement characteristic value when the wearer is in a head rest state.
In the case that the head pose does not match the head standard pose, it is indicated that the head pose of the wearer is not a focused head pose, i.e., the wearer is not focused.
In the present embodiment, when it is determined from the eye movement characteristic value that the wearer is in a head rest state and the head pose does not match the head standard pose, the attention state of the wearer is determined to be inattentive. Correspondingly, when it is determined from the eye movement characteristic value that the wearer is in a non-head rest state and/or the head pose matches the head standard pose, the attention state of the wearer is determined to be focused. That is, in the present embodiment, the eye movement characteristic value assists the head pose in accurately determining the attention state of the wearer.
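The second strategy (steps S236 and S237) can be sketched as a single decision function. Representing the head pose match as a pitch-range check is an assumption for illustration, not the application's stated matching rule:

```python
def attention_state(eye_delta, eye_threshold, head_pitch_deg, standard_range):
    """Steps S236/S237: inattentive only when the eye movement change
    indicates a head rest state AND the head pose falls outside the
    scenario's head standard pose range; focused otherwise."""
    head_rest = eye_delta <= eye_threshold
    lo, hi = standard_range
    pose_matches = lo <= head_pitch_deg <= hi
    return "inattentive" if head_rest and not pose_matches else "focused"
```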
In summary, the present application provides an attention monitoring method applied to a head-mounted device, comprising: acquiring a head pose of a wearer of the head-mounted device; acquiring an eye movement characteristic value of the wearer; and determining the attention state of the wearer from the head pose and the eye movement characteristic value. Because the head pose and the eye movement characteristic value are clearly correlated with the wearer's attention state, the attention state can be determined from these two quantities. That is, the present application provides a way to monitor the attention of a wearer.
In order to alert the wearer in the event that the wearer is not focused, in one embodiment of the present application, the attention monitoring method further includes the following step S240.
Step S240, outputting reminder information when the attention state of the wearer is inattentive.
In this embodiment, when the attention state of the wearer is inattentive, reminder information may be output to remind the wearer to refocus. In one example, the reminder may be at least one of a sound, a light, a vibration, or the like.
Further, to help the wearer return to a focused state, the eye movement characteristic value further includes a blink count on the basis of step S240, and the attention monitoring method further includes the following step S250.
Step S250, stopping outputting the reminder information when the blink count reaches a preset count while the reminder information is being output.
In this embodiment, the head-mounted device continuously outputs the reminder information while the attention state of the wearer is inattentive, and stops only after the preset number of blinks of the wearer is detected. In this way, the wearer is effectively encouraged to return to a focused state. The preset count may be, for example, 3.
In addition, requiring only a blink to dismiss the reminder spares the wearer more complicated operations, such as tapping the head-mounted device, and thus avoids degrading the wearing experience.
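The reminder logic of steps S240 and S250 can be sketched as follows; the class and method names are illustrative, and resetting the blink counter each time a reminder starts is an assumption:

```python
class Reminder:
    """Steps S240/S250: keep outputting the reminder while the wearer is
    inattentive; stop only after the preset number of blinks (e.g. 3)."""

    def __init__(self, required_blinks: int = 3):
        self.required_blinks = required_blinks
        self.active = False   # True while the reminder is being output
        self._blinks = 0

    def on_inattentive(self):
        """Start reminding; restart the blink count for a new reminder."""
        if not self.active:
            self.active = True
            self._blinks = 0

    def on_blink(self):
        """Count a blink; stop the reminder once the preset count is reached."""
        if self.active:
            self._blinks += 1
            if self._blinks >= self.required_blinks:
                self.active = False
```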
The present application also provides an attention monitor device 300, applied to a head-mounted apparatus, as shown in fig. 3, including:
A first acquisition module 310 for acquiring a head pose of a wearer of the headset;
A second acquisition module 320, configured to acquire an eye movement characteristic value of the wearer;
A determining module 330 is configured to determine an attention state of the wearer according to the head pose and the eye movement feature value.
Since the head pose and the eye movement characteristic value are clearly correlated with the attention state of the wearer, the attention state of the wearer can be determined from them. That is, the present application provides a device for monitoring the attention of a wearer.
In one embodiment of the present application, the determining module 330 is specifically configured to:
determining whether the wearer is in a head rest state according to the head posture and the eye movement characteristic value;
determining that the attention state of the wearer is inattentive when the wearer is in a head rest state and the duration of the head rest state is longer than a preset duration;
and determining that the attention state of the wearer is focused when the wearer is in a non-head rest state, or when the wearer is in a head rest state and the duration of the head rest state is less than or equal to the preset duration.
In one embodiment of the present application, the determining module 330 is specifically configured to:
determining that the wearer is in a head rest state when the variation of the head pose is less than or equal to a first preset threshold and the variation of the eye movement characteristic value is less than or equal to a second preset threshold;
and determining that the wearer is in a non-head rest state when the variation of the head pose is greater than the first preset threshold and/or the variation of the eye movement characteristic value is greater than the second preset threshold.
In one embodiment of the present application, the attention monitor device 300 provided in the embodiment of the present application further includes:
A third obtaining module, configured to obtain a usage scenario of the headset device;
In this embodiment, the determining module 330 is further configured to determine, according to the usage scenario, a head standard pose corresponding to an attention state of focusing attention;
and, the determining module 330 is specifically configured to:
determining that the attention state of the wearer is inattentive when it is determined from the eye movement characteristic value that the wearer is in a head rest state and the head pose does not match the head standard pose;
and determining that the attention state of the wearer is focused when it is determined from the eye movement characteristic value that the wearer is in a non-head rest state and/or the head pose matches the head standard pose.
In one embodiment of the present application, the attention monitor device 300 provided in the embodiment of the present application further includes:
a reminding module for outputting reminder information when the attention state of the wearer is inattentive.
In one embodiment of the present application, the attention monitoring device 300 further includes:
a stop-reminding module, configured to stop outputting the reminder information when the number of blinks reaches a preset number while the reminder information is being output.
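The reminder lifecycle, including blink-based dismissal, can be sketched as a small state holder. The class name and the default blink count of 3 are assumptions for illustration:

```python
class ReminderController:
    """Starts a reminder when attention is lost and stops it once the wearer
    blinks a preset number of times while the reminder is being output."""

    def __init__(self, preset_blinks: int = 3):
        self.preset_blinks = preset_blinks
        self.reminding = False
        self.blinks_seen = 0

    def on_attention(self, focused: bool) -> None:
        # Begin outputting the reminder when inattention is detected.
        if not focused and not self.reminding:
            self.reminding = True
            self.blinks_seen = 0  # count blinks from reminder onset

    def on_blink(self) -> None:
        # Blinks only matter while the reminder is being output.
        if self.reminding:
            self.blinks_seen += 1
            if self.blinks_seen >= self.preset_blinks:
                self.reminding = False  # stop outputting the reminder
```

Counting blinks only while `reminding` is true mirrors the condition that the reminder information must be in output when the blink count reaches the preset number.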
The present application further provides a head-mounted device comprising the attention monitoring device 300 of any of the above-described attention monitoring device embodiments.
Alternatively, the head-mounted device comprises a memory for storing computer instructions and a processor for invoking the computer instructions from the memory to perform the attention monitoring method of any of the above-described attention monitoring method embodiments.
In this embodiment, the headset may be embodied as the headset 1000 in fig. 1.
The present application also provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the attention monitoring method of any of the above-described attention monitoring method embodiments.
The present application may be a system, method, and/or computer program product. The computer program product may include a computer readable storage medium having computer readable program instructions embodied thereon for causing a processor to implement aspects of the present application.
The computer readable storage medium may be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include: a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disk read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or a raised structure in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (for example, a light pulse through a fiber optic cable), or an electrical signal transmitted through a wire.
The computer readable program instructions described herein may be downloaded from a computer readable storage medium to a respective computing/processing device, or to an external computer or external storage device over a network such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer program instructions for carrying out operations of the present application may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++, or the like, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer readable program instructions may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the internet using an internet service provider). In some embodiments, electronic circuitry, such as programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA), may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to implement aspects of the present application.
Various aspects of the present application are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
The foregoing description of embodiments of the application has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the technical improvement of the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the application is defined by the appended claims.

Claims (10)

1. A method of attention monitoring, applied to a headset, comprising:
Acquiring a head pose of a wearer of the headset;
acquiring an eye movement characteristic value of the wearer;
determining an attention state of the wearer according to the head pose and the eye movement characteristic value.
2. The method of claim 1, wherein said determining the attention state of the wearer from the head pose and the eye movement characteristic value comprises:
determining whether the wearer is in a head rest state according to the head posture and the eye movement characteristic value;
when the wearer is in a head rest state and the duration of the head rest state is longer than a preset duration, determining that the attention state of the wearer is inattentive;
and determining that the attention state of the wearer is focused when the wearer is in a non-head rest state or when the wearer is in a head rest state and the duration of the head rest state is less than or equal to a preset duration.
3. The method of claim 2, wherein said determining whether the wearer is in a head rest state based on the head pose and the eye movement characteristic value comprises:
determining that the wearer is in a head rest state when the variation of the head pose is less than or equal to a first preset threshold and the variation of the eye movement characteristic value is less than or equal to a second preset threshold;
and determining that the wearer is in a non-head rest state when the variation of the head pose is greater than the first preset threshold and/or the variation of the eye movement characteristic value is greater than the second preset threshold.
4. The method according to claim 1, further comprising, prior to said determining the attention state of the wearer from the head pose and the eye movement characteristic value:
Acquiring a use scene of the head-mounted equipment;
According to the use scene, determining a head standard posture corresponding to the attention state of focusing attention;
the determining the attention state of the wearer according to the head posture and the eye movement characteristic value comprises the following steps:
determining that the attention state of the wearer is inattentive when it is determined, according to the eye movement characteristic value, that the wearer is in a head rest state and the head pose does not match the head standard pose;
and determining that the attention state of the wearer is focused when it is determined, according to the eye movement characteristic value, that the wearer is in a non-head rest state, and/or the head pose matches the head standard pose.
5. The method according to any one of claims 1-4, further comprising:
outputting reminder information when the attention state of the wearer is inattentive.
6. The method of claim 5, wherein the eye movement characteristic value further comprises a number of blinks, the method further comprising:
stopping outputting the reminder information when the number of blinks reaches a preset number while the reminder information is being output.
7. An attention monitor device, characterized by being applied to a head-mounted apparatus, comprising:
A first acquisition module for acquiring a head pose of a wearer of the headset;
a second acquisition module for acquiring an eye movement characteristic value of the wearer;
A determining module for determining an attention state of the wearer according to the head pose and the eye movement characteristic value.
8. The apparatus of claim 7, wherein the determining module is specifically configured to:
determining whether the wearer is in a head rest state according to the head posture and the eye movement characteristic value;
when the wearer is in a head rest state and the duration of the head rest state is longer than a preset duration, determining that the attention state of the wearer is inattentive;
and determining that the attention state of the wearer is focused when the wearer is in a non-head rest state or when the wearer is in a head rest state and the duration of the head rest state is less than or equal to a preset duration.
9. A head-mounted device, characterized in that it comprises the attention monitor device according to claim 7 or 8;
Or the head-mounted device comprises a memory for storing computer instructions and a processor for invoking the computer instructions from the memory to perform the attention monitoring method of any of claims 1-6.
10. A computer readable storage medium, characterized in that a computer program is stored thereon, which, when being executed by a processor, implements the attention monitoring method according to any of claims 1-6.
CN202410216781.1A 2024-02-27 2024-02-27 Attention monitoring method, device, equipment and medium Pending CN117958819A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410216781.1A CN117958819A (en) 2024-02-27 2024-02-27 Attention monitoring method, device, equipment and medium


Publications (1)

Publication Number Publication Date
CN117958819A true CN117958819A (en) 2024-05-03

Family

ID=90864371

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410216781.1A Pending CN117958819A (en) 2024-02-27 2024-02-27 Attention monitoring method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN117958819A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118766455A (en) * 2024-06-17 2024-10-15 南京蔚来思创科技有限公司 Eye movement visual field testing method, system and storage medium

Similar Documents

Publication Publication Date Title
US12320977B2 (en) Display system
US9317113B1 (en) Gaze assisted object recognition
KR20220034243A (en) Resolving natural language ambiguity for simulated reality settings
JP7523596B2 (en) Assistant device arbitration using wearable device data
US10599980B2 (en) Technologies for cognitive cuing based on knowledge and context
US20180018144A1 (en) Leveraging environmental context for enhanced communication throughput
JP6402718B2 (en) Information processing apparatus, control method, and program
US20210048930A1 (en) Human-machine conversation method, client, electronic device, and storage medium
US9088668B1 (en) Configuring notification intensity level using device sensors
US12405703B2 (en) Digital assistant interactions in extended reality
US20230196836A1 (en) Human Presence Sensor for Client Devices
CN114930272B (en) Head and eye pose recognition
CN115857781A (en) Adaptive User Registration for Electronic Devices
CN117958819A (en) Attention monitoring method, device, equipment and medium
US12504812B2 (en) Method and device for processing user input for multiple devices
KR102909471B1 (en) Method for determining movement of electronic device and electronic device using same
US11030979B2 (en) Information processing apparatus and information processing method
WO2016206642A1 (en) Method and apparatus for generating control data of robot
US11991263B2 (en) Notification delivery in a virtual space based on a user being in a flow state
WO2023239715A1 (en) Digital employee experience improvement based on de-identified productivity data signals
US12422925B1 (en) Methods and systems for interacting with digital objects using a gaze tracking-enabled headset
CN113495976A (en) Content display method, device, equipment and storage medium
CN111880854A (en) Method and apparatus for processing speech
US20250110568A1 (en) Pinch Compensation for Markup
US20260011093A1 (en) Systems and methods for identifying a targeted object for ai-assisted interactions using a head-wearable device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination