WO2024069473A1 - Method, system, and computer program product for drawing and fine-tuned motor controls - Google Patents
Method, system, and computer program product for drawing and fine-tuned motor controls
- Publication number
- WO2024069473A1 (PCT/IB2023/059613)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cursor
- location
- gaze point
- activation zone
- user device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
Definitions
- This disclosure relates generally to drawing and fine-tuned motor controls and, in non-limiting embodiments, to systems, methods, and computer program products for drawing and fine-tuned motor controls.
- Using eye tracking technology (e.g., software, hardware, etc.), users are able to interact with computing devices (e.g., desktops, laptops, tablets, smartphones, smart watches, etc.) using eye movements. For example, users may use eye movements to control the direction of a cursor and/or select an icon and/or selectable button by changing the direction of their eye gaze, pausing eye movements, and/or staring (e.g., allowing their eye gaze to dwell). Eye tracking devices may capture gaze data (e.g., eye image data) based on these eye movements.
- the method may include displaying, with at least one processor, data comprising an activation zone via a display of a user device.
- the method may include receiving, with at least one processor, gaze data from a user comprising a gaze point.
- the method may include determining, with at least one processor, a location of the gaze point is within the activation zone.
- the method may include moving, with at least one processor, a cursor in a direction of the gaze point based on determining the location of the gaze point is within the activation zone.
- the method may include, in response to moving the cursor in a direction of the gaze point, causing, with at least one processor, an action to be performed. In some non-limiting embodiments, the method may include displaying, with at least one processor, data associated with the action via the display of the user device.
- determining the location of the gaze point is within the activation zone may include: determining a time the location of the gaze point is within the activation zone satisfies a threshold value; and determining the location of the gaze point is within the activation zone based on determining the time the location of the gaze point is within the activation zone satisfies the threshold value.
- moving the cursor in the direction of the gaze point may include: determining a location of the cursor; determining a distance between the location of the gaze point and the location of the cursor; and moving the cursor in the direction of the gaze point at a speed, the speed of a cursor movement is based on the distance between the location of the gaze point and the location of the cursor.
- the method may further include stopping movement of the cursor when the location of the gaze point is not within the activation zone, wherein stopping movement of the cursor causes the action to end.
- the activation zone may include an annulus between two concentric circles.
- an outer edge of the activation zone may include an icon.
- the method may further include: receiving a selection of the icon by the user; and stopping movement of the cursor, wherein stopping movement of the cursor causes the action to end.
- a size of the activation zone may be based on a percentage of a width of a display of the user device.
- the system may include at least one processor programmed or configured to display data comprising an activation zone via a display of a user device.
- the at least one processor may be programmed or configured to receive gaze data from a user comprising a gaze point.
- the at least one processor may be programmed or configured to determine a location of the gaze point is within the activation zone.
- the at least one processor may be programmed or configured to move a cursor in a direction of the gaze point based on determining the location of the gaze point is within the activation zone.
- the at least one processor may be programmed or configured to, in response to moving the cursor in a direction of the gaze point, cause an action to be performed. In some non-limiting embodiments, the at least one processor may be programmed or configured to display data associated with the action via the display of the user device.
- the at least one processor when determining the location of the gaze point is within the activation zone, may be programmed or configured to: determine a time the location of the gaze point is within the activation zone satisfies a threshold value; and determine the location of the gaze point is within the activation zone based on determining the time the location of the gaze point is within the activation zone satisfies the threshold value.
- the at least one processor when moving the cursor in the direction of the gaze point, may be programmed or configured to: determine a location of the cursor; determine a distance between the location of the gaze point and the location of the cursor; and move the cursor in the direction of the gaze point at a speed, the speed of a cursor movement is based on the distance between the location of the gaze point and the location of the cursor.
- the at least one processor may be further programmed or configured to: stop movement of the cursor when the location of the gaze point is not within the activation zone, wherein stopping movement of the cursor may cause the action to end.
- the activation zone may include an annulus between two concentric circles.
- an outer edge of the activation zone may include an icon.
- the at least one processor may be further programmed or configured to: receive a selection of the icon by the user; and stop movement of the cursor, wherein stopping movement of the cursor may cause the action to end.
- a size of the activation zone may be based on a percentage of a width of a display of the user device.
- the computer program product may include at least one non-transitory computer-readable medium including one or more instructions that, when executed by at least one processor, cause the at least one processor to: display data comprising an activation zone via a display of a user device; receive gaze data from a user comprising a gaze point; determine a location of the gaze point is within the activation zone.
- the instructions may cause the at least one processor to move a cursor in a direction of the gaze point based on determining the location of the gaze point is within the activation zone.
- the instructions may cause the at least one processor to, in response to moving the cursor in a direction of the gaze point, cause an action to be performed. In some non-limiting embodiments, the instructions may cause the at least one processor to display data associated with the action via the display of the user device.
- the instructions when determining the location of the gaze point is within the activation zone, may cause the at least one processor to: determine a time the location of the gaze point is within the activation zone satisfies a threshold value; and determine the location of the gaze point is within the activation zone based on determining the time the location of the gaze point is within the activation zone satisfies the threshold value.
- the instructions when moving the cursor in the direction of the gaze point, may cause the at least one processor to: determine a location of the cursor; determine a distance between the location of the gaze point and the location of the cursor; and move the cursor in the direction of the gaze point at a speed, the speed of a cursor movement may be based on the distance between the location of the gaze point and the location of the cursor.
- the instructions may further cause the at least one processor to: stop movement of the cursor when the location of the gaze point is not within the activation zone, wherein stopping movement of the cursor may cause the action to end.
- the activation zone may include an annulus between two concentric circles.
- an outer edge of the activation zone may include an icon.
- the instructions may further cause the at least one processor to: receive a selection of the icon by the user; and stop movement of the cursor, wherein stopping movement of the cursor may cause the action to end.
- a size of the activation zone may be based on a percentage of a width of a display of the user device.
- a method for drawing and fine-tuned motor controls comprising: displaying, with at least one processor, data comprising an activation zone via a display of a user device; receiving, with at least one processor, gaze data from a user comprising a gaze point; determining, with at least one processor, a location of the gaze point is within the activation zone; moving, with at least one processor, a cursor in a direction of the gaze point based on determining the location of the gaze point is within the activation zone; in response to moving the cursor in a direction of the gaze point, causing, with at least one processor, an action to be performed; and displaying, with at least one processor, data associated with the action via the display of the user device.
- Clause 2 The method of clause 1, wherein determining the location of the gaze point is within the activation zone comprises: determining a time the location of the gaze point is within the activation zone satisfies a threshold value; and determining the location of the gaze point is within the activation zone based on determining the time the location of the gaze point is within the activation zone satisfies the threshold value.
- Clause 3 The method of clause 1 or clause 2, wherein moving the cursor in the direction of the gaze point comprises: determining a location of the cursor; determining a distance between the location of the gaze point and the location of the cursor; and moving the cursor in the direction of the gaze point at a speed, wherein the speed of a cursor movement is based on the distance between the location of the gaze point and the location of the cursor.
- Clause 4 The method of any of clauses 1-3, further comprising: stopping movement of the cursor when the location of the gaze point is not within the activation zone, wherein stopping movement of the cursor causes the action to end.
- Clause 5 The method of any of clauses 1-4, wherein the activation zone comprises an annulus between two concentric circles.
- Clause 6 The method of any of clauses 1-5, wherein an outer edge of the activation zone comprises an icon, the method further comprising: receiving a selection of the icon by the user; and stopping movement of the cursor, wherein stopping movement of the cursor causes the action to end.
- Clause 7 The method of any of clauses 1-6, wherein a size of the activation zone is based on a percentage of a width of a display of the user device.
- a system for drawing and fine-tuned motor controls comprising: at least one processor, the at least one processor programmed or configured to: display data comprising an activation zone via a display of a user device; receive gaze data from a user comprising a gaze point; determine a location of the gaze point is within the activation zone; move a cursor in a direction of the gaze point based on determining the location of the gaze point is within the activation zone; in response to moving the cursor in a direction of the gaze point, cause an action to be performed; and display data associated with the action via the display of the user device.
- Clause 9 The system of clause 8, wherein, when determining the location of the gaze point is within the activation zone, the at least one processor is programmed or configured to: determine a time the location of the gaze point is within the activation zone satisfies a threshold value; and determine the location of the gaze point is within the activation zone based on determining the time the location of the gaze point is within the activation zone satisfies the threshold value.
- Clause 10 The system of clause 8 or clause 9, wherein, when moving the cursor in the direction of the gaze point, the at least one processor is programmed or configured to: determine a location of the cursor; determine a distance between the location of the gaze point and the location of the cursor; and move the cursor in the direction of the gaze point at a speed, wherein the speed of a cursor movement is based on the distance between the location of the gaze point and the location of the cursor.
- Clause 11 The system of any of clauses 8-10, wherein the at least one processor is further programmed or configured to: stop movement of the cursor when the location of the gaze point is not within the activation zone, wherein stopping movement of the cursor causes the action to end.
- Clause 12 The system of any of clauses 8-11, wherein the activation zone comprises an annulus between two concentric circles.
- Clause 13 The system of any of clauses 8-12, wherein an outer edge of the activation zone comprises an icon, and wherein the at least one processor is further programmed or configured to: receive a selection of the icon by the user; and stop movement of the cursor, wherein stopping movement of the cursor causes the action to end.
- Clause 14 The system of any of clauses 8-13, wherein a size of the activation zone is based on a percentage of a width of a display of the user device.
- a computer program product for drawing and fine-tuned motor controls comprising at least one non-transitory computer-readable medium including one or more instructions that, when executed by at least one processor, cause the at least one processor to: display data comprising an activation zone via a display of a user device; receive gaze data from a user comprising a gaze point; determine a location of the gaze point is within the activation zone; move a cursor in a direction of the gaze point based on determining the location of the gaze point is within the activation zone; in response to moving the cursor in a direction of the gaze point, cause an action to be performed; and display data associated with the action via the display of the user device.
- Clause 16 The computer program product of clause 15, wherein, when determining the location of the gaze point is within the activation zone, the instructions cause the at least one processor to: determine a time the location of the gaze point is within the activation zone satisfies a threshold value; and determine the location of the gaze point is within the activation zone based on determining the time the location of the gaze point is within the activation zone satisfies the threshold value.
- Clause 17 The computer program product of clause 15 or clause 16, wherein, when moving the cursor in the direction of the gaze point, the instructions cause the at least one processor to: determine a location of the cursor; determine a distance between the location of the gaze point and the location of the cursor; and move the cursor in the direction of the gaze point at a speed, wherein the speed of a cursor movement is based on the distance between the location of the gaze point and the location of the cursor.
- Clause 18 The computer program product of any of clauses 15-17, wherein the instructions further cause the at least one processor to: stop movement of the cursor when the location of the gaze point is not within the activation zone, wherein stopping movement of the cursor causes the action to end.
- Clause 19 The computer program product of any of clauses 15-18, wherein the activation zone comprises an annulus between two concentric circles.
- Clause 20 The computer program product of any of clauses 15-19, wherein an outer edge of the activation zone comprises an icon, and wherein the instructions further cause the at least one processor to: receive a selection of the icon by the user; and stop movement of the cursor, wherein stopping movement of the cursor causes the action to end.
- Clause 21 The computer program product of any of clauses 15-20, wherein a size of the activation zone is based on a percentage of a width of a display of the user device.
- FIG. 1 is a schematic diagram of a system for drawing and fine-tuned motor controls according to a non-limiting embodiment
- FIG. 2 is a diagram of example components of a device used in connection with non-limiting embodiments or aspects of the disclosed subject matter
- FIG. 3A is a flow diagram of a process for drawing and fine-tuned motor controls according to non-limiting embodiments
- FIG. 3B is a flow diagram of a process for drawing and fine-tuned motor controls according to non-limiting embodiments.
- FIGS. 4A-4H show an exemplary implementation of the processes shown in FIGS. 3A and 3B according to non-limiting embodiments.
- the term “communication” may refer to the reception, receipt, transmission, transfer, provision, and/or the like of data (e.g., information, signals, messages, instructions, commands, and/or the like).
- With respect to communication between one unit (e.g., a device, a system, a component of a device or system, combinations thereof, and/or the like) and another unit, this may refer to a direct or indirect connection (e.g., a direct communication connection, an indirect communication connection, and/or the like) that is wired and/or wireless in nature.
- two units may be in communication with each other even though the information transmitted may be modified, processed, relayed, and/or routed between the first and second unit.
- a first unit may be in communication with a second unit even though the first unit passively receives information and does not actively transmit information to the second unit.
- a first unit may be in communication with a second unit if at least one intermediary unit processes information received from the first unit and communicates the processed information to the second unit.
- the term “user device” may refer to one or more electronic devices configured to process data.
- a user device may, in some examples, include the necessary components to receive, process, and output data, such as a processor, a display, a memory, an input device, a network interface, and/or the like.
- a user device may be a mobile device.
- a mobile device may include a cellular phone (e.g., a smartphone or standard cellular phone), a portable computer, a wearable device (e.g., watches, glasses, lenses, clothing, and/or the like), a personal digital assistant (PDA), and/or other like devices.
- a user device may also be a desktop computer or other form of non-mobile computer.
- the term “eye tracking device” may refer to one or more electronic devices configured to receive, capture, and/or process data (e.g., gaze data).
- An eye tracking device may, in some examples, include a camera, image sensor(s) (e.g., complementary metal oxide semiconductor (CMOS) sensors, charge-coupled device (CCD) sensors, and/or the like), and/or lights.
- An eye tracking device, in some examples, may be part of (e.g., integrated into) a user device. Alternatively, an eye tracking device may be an accessory for a user device.
- Non-limiting embodiments of the present disclosure may include a method comprising displaying data comprising an eye tracking zone via a display of a user device.
- the eye tracking zone may comprise an activation zone.
- a user device may include a display (e.g., a screen) which displays an eye tracking zone, including an activation zone.
- the method may include receiving, gaze data from a user.
- the gaze data may include a gaze point.
- the method may include determining whether a location of the gaze point is within the activation zone.
- the method may include moving a cursor in a direction of the gaze point.
- the method may include moving the cursor in a direction of the gaze point based on determining the location of the gaze point is within the activation zone.
- the method may include, in response to moving the cursor in a direction of the gaze point, causing an action to be performed. For example, moving a cursor may cause a line to be drawn.
- the method may include displaying data associated with the action via the display of the user device. For example, moving the cursor may cause a line to be drawn and displayed on the display of the user device.
- determining whether the location of the gaze point is within the activation zone may comprise determining a time (e.g., in seconds, milliseconds, etc.) the location of the gaze point is within the activation zone exceeds a threshold value (e.g., 0 seconds to 1 second).
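The dwell-time check described above can be sketched as follows. This is a minimal sketch, not the disclosed implementation: the class name, the 0.5-second threshold (chosen from the 0-to-1-second range mentioned above), and the timer bookkeeping are all illustrative assumptions.

```python
import time

class DwellDetector:
    """Tracks how long the gaze point has stayed inside the activation zone."""

    def __init__(self, threshold_s=0.5):
        self.threshold_s = threshold_s
        self.entered_at = None  # time the gaze point entered the zone, or None

    def update(self, gaze_in_zone, now=None):
        """Return True once the gaze has dwelled in the zone past the threshold."""
        now = time.monotonic() if now is None else now
        if not gaze_in_zone:
            self.entered_at = None  # gaze left the zone; reset the timer
            return False
        if self.entered_at is None:
            self.entered_at = now   # gaze just entered the zone
        return (now - self.entered_at) >= self.threshold_s
```

Resetting the timer whenever the gaze leaves the zone means a brief glance through the zone does not trigger activation, which is one common way such a threshold is applied.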
- moving the cursor in the direction of the gaze point may comprise: determining a location of the cursor; determining a distance between the location of the gaze point and the location of the cursor; and moving the cursor in the direction of the gaze point at a speed. The speed of the cursor movement may be based on the distance between the location of the gaze point and the location of the cursor.
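One possible reading of the distance-dependent speed is a proportional controller, in which the per-update step grows linearly with the gaze-cursor distance. The linear gain below is an assumption for illustration; the disclosure says only that the speed is based on the distance.

```python
import math

def step_cursor(cursor, gaze, gain=0.1):
    """Move the cursor one step toward the gaze point.

    The step size grows with the gaze-cursor distance, so the cursor
    moves quickly when far from the gaze point and slows as it
    approaches, allowing fine-tuned control near the target.
    """
    cx, cy = cursor
    gx, gy = gaze
    dx, dy = gx - cx, gy - cy
    distance = math.hypot(dx, dy)
    if distance == 0:
        return cursor                 # cursor already at the gaze point
    step = min(gain * distance, distance)  # never overshoot the gaze point
    return (cx + dx / distance * step, cy + dy / distance * step)
```

With a gain of 0.1, a cursor 100 px from the gaze point moves 10 px in one update, while a cursor 10 px away moves only 1 px, giving the slow, precise motion near the target that fine motor tasks such as drawing require.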
- the method may further comprise stopping movement of the cursor when the location of the gaze point is not within the activation zone.
- the movement of the cursor may stop, causing the action being performed to end.
- the activation zone may comprise an annulus between two concentric circles.
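A point-in-annulus test for such an activation zone is straightforward; the function below is an illustrative sketch, not taken from the disclosure.

```python
import math

def in_annulus(point, center, inner_radius, outer_radius):
    """Return True if the point lies in the ring between two concentric circles."""
    distance = math.hypot(point[0] - center[0], point[1] - center[1])
    return inner_radius <= distance <= outer_radius
```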
- an outer edge of the activation zone may comprise an icon (e.g., a selection button).
- the method may further comprise receiving a selection of the icon by the user; and stopping movement of the cursor, wherein stopping movement of the cursor causes the action to end.
- an outer edge of the activation zone may include an “END” icon, which may be selected by the user to end an action (e.g., drawing).
- a size of the activation zone may be based on a percentage of a width of the display (e.g., a screen) of the user device.
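Deriving the annulus radii from the display width might look like the following sketch; the 20%/10% figures are invented for illustration, since the disclosure states only that the size is based on a percentage of the display width.

```python
def activation_zone_radii(display_width_px, outer_pct=0.20, inner_pct=0.10):
    """Compute annulus radii as percentages of the display width.

    The outer and inner diameters are taken as assumed fractions of the
    display width, so the zone scales with the screen of the user device.
    """
    inner_radius = display_width_px * inner_pct / 2
    outer_radius = display_width_px * outer_pct / 2
    return inner_radius, outer_radius

# e.g., on a 1920 px wide display: inner radius 96 px, outer radius 192 px
inner_r, outer_r = activation_zone_radii(1920)
```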
- Non-limiting embodiments of the present disclosure enable users to interact with user devices based on eye movements. Additionally, non-limiting embodiments enable users to perform actions (e.g., drawing, scrubbing videos, editing images, etc.) that require control of the speed and/or precision of cursor movements. For example, some non-limiting embodiments allow a user to draw only within the activation zone displayed on the user device, so that if the user moves their eyes away from the gaze point and outside of the activation zone, the drawing will stop. This may be advantageous to users who are attempting to draw an image on a user device that receives pop-up notifications. If a notification appears on the display of the user device, the user's gaze point may quickly and/or unintentionally move in the direction of the notification. If the pop-up is outside of the activation zone, the drawing action will stop when the location of the gaze point exits the activation zone, instead of creating unintended markings on the image as a result of the unintentional eye movements.
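Combining the behaviors above, one per-frame update for the draw-only-inside-the-zone behavior could be sketched as follows. The zone geometry, the gain, and the stroke representation are all assumptions, not the disclosed implementation.

```python
import math

def update_frame(gaze, cursor, zone_center, inner_r, outer_r, stroke, gain=0.1):
    """One frame of a gaze-driven drawing loop.

    While the gaze point stays inside the annular activation zone, the
    cursor moves toward it and the stroke is extended; the moment the
    gaze leaves the zone (e.g., darting to a pop-up notification), the
    cursor stops and the drawing action ends.
    """
    distance = math.hypot(gaze[0] - zone_center[0], gaze[1] - zone_center[1])
    if not (inner_r <= distance <= outer_r):
        return cursor, False          # gaze left the zone: stop drawing
    dx, dy = gaze[0] - cursor[0], gaze[1] - cursor[1]
    d = math.hypot(dx, dy)
    if d > 0:
        step = min(gain * d, d)       # speed proportional to distance
        cursor = (cursor[0] + dx / d * step, cursor[1] + dy / d * step)
    stroke.append(cursor)             # drawing action: extend the line
    return cursor, True
```

In this sketch an unintentional glance outside the annulus simply returns `False` without touching the stroke, which mirrors the pop-up-notification scenario described above.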
- FIG. 1 is a diagram of a system for drawing and fine-tuned motor controls according to some non-limiting embodiments.
- system 100 may include user device 102, eye tracking device 104, and/or user 106.
- user device 102 may include one or more user devices capable of communicating information to and/or receiving information from eye tracking device 104 and/or user 106.
- user device 102 may be a personal computer (e.g., desktop, laptop, tablet, smart phone, etc.) that communicates information to and/or receives input from eye tracking device 104 and/or user 106.
- user device 102 may include a display.
- user device 102 may include one or more displays (screens, monitors, etc.) to display data to user 106 via a graphical user interface (GUI) on the display.
- the GUI may be an interactive GUI.
- the GUI may be configured to receive input data from eye tracking device 104 and/or user 106.
- user 106 may interact with data displayed via the interactive GUI (e.g., buttons, icons, drop down menus, and/or the like).
- user device 102 may be configured to be housed in a case and/or mounted to a surface.
- user device 102 may include eye tracking device 104.
- eye tracking device 104 may be part of user device 102.
- eye tracking device 104 may be configured to be affixed on and/or near user device 102.
- eye tracking device 104 may be a portable device.
- eye tracking device 104 may be configured for indoor and/or outdoor use.
- eye tracking device 104 may include one or more devices capable of communicating information to and/or receiving information from user device 102 and/or user 106.
- eye tracking device 104 may receive gaze data from user 106 and/or communicate the gaze data to user device 102.
- user device 102 and/or eye tracking device 104 may be calibrated for use by a specific user.
- user 106 may communicate information to and/or receive information from user device 102 and/or eye tracking device 104.
- user 106 may communicate (e.g., input) gaze data into eye tracking device 104 and/or user device 102.
- user 106 may input data into user device 102 via one or more peripherals of the user device (e.g., keyboard, mouse, audio device, camera, touchpad, etc.).
- FIG. 1 The number and arrangement of systems and/or devices shown in FIG. 1 are provided as an example. There may be additional systems and/or devices; fewer systems and/or devices; different systems and/or devices; and/or differently arranged systems and/or devices than those shown in FIG. 1. Furthermore, two or more systems or devices shown in FIG. 1 may be implemented within a single system or device, or a single system or device shown in FIG. 1 may be implemented as multiple, distributed systems or devices. Additionally or alternatively, a set of systems (e.g., one or more systems) or a set of devices (e.g., one or more devices) of system 100 may perform one or more functions described as being performed by another set of systems or another set of devices of system 100.
- Device 200 may correspond to user device 102 (e.g., one or more devices of user device 102) and/or eye tracking device 104 (e.g., one or more devices of eye tracking device 104).
- user device 102 and/or eye tracking device 104 may include at least one device 200 and/or at least one component of device 200.
- device 200 may include bus 202, processor 204, memory 206, storage component 208, input component 210, output component 212, and communication interface 214.
- Bus 202 may include a component that permits communication among the components of device 200.
- processor 204 may be implemented in hardware, software, or a combination of hardware and software.
- processor 204 may include a processor (e.g., a central processing unit (CPU), a graphics processing unit (GPU), an accelerated processing unit (APU), etc.), a microprocessor, a digital signal processor (DSP), and/or any processing component (e.g., a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), etc.) that can be programmed to perform a function.
- Memory 206 may include random access memory (RAM), read-only memory (ROM), and/or another type of dynamic or static storage memory (e.g., flash memory, magnetic memory, optical memory, etc.) that stores information and/or instructions for use by processor 204.
- Storage component 208 may store information and/or software related to the operation and use of device 200.
- storage component 208 may include a hard disk (e.g., a magnetic disk, an optical disk, a magneto-optic disk, a solid state disk, etc.), a compact disc (CD), a digital versatile disc (DVD), a floppy disk, a cartridge, a magnetic tape, and/or another type of computer-readable medium, along with a corresponding drive.
- Input component 210 may include a component that permits device 200 to receive information, such as via user input (e.g., a touch screen display, a keyboard, a keypad, a mouse, a button, a switch, a microphone, etc.). Additionally or alternatively, input component 210 may include a sensor for sensing information (e.g., a global positioning system (GPS) component, an accelerometer, a gyroscope, an actuator, etc.). Output component 212 may include a component that provides output information from device 200 (e.g., a display, a speaker, one or more light-emitting diodes (LEDs), etc.).
- Communication interface 214 may include a transceiver-like component (e.g., a transceiver, a separate receiver and transmitter, etc.) that enables device 200 to communicate with other devices, such as via a wired connection, a wireless connection, or a combination of wired and wireless connections.
- Communication interface 214 may permit device 200 to receive information from another device and/or provide information to another device.
- communication interface 214 may include an Ethernet interface, an optical interface, a coaxial interface, an infrared interface, a radio frequency (RF) interface, a universal serial bus (USB) interface, a Wi-Fi® interface, a cellular network interface, and/or the like.
- Device 200 may perform one or more processes described herein. Device 200 may perform these processes based on processor 204 executing software instructions stored by a computer-readable medium, such as memory 206 and/or storage component 208.
- a non-transitory memory device includes memory space located inside of a single physical storage device or memory space spread across multiple physical storage devices.
- Software instructions may be read into memory 206 and/or storage component 208 from another computer-readable medium or from another device via communication interface 214. When executed, software instructions stored in memory 206 and/or storage component 208 may cause processor 204 to perform one or more processes described herein. Additionally or alternatively, hardwired circuitry may be used in place of or in combination with software instructions to perform one or more processes described herein. Thus, embodiments or aspects described herein are not limited to any specific combination of hardware circuitry and software.
- The number and arrangement of components shown in FIG. 2 are provided as an example. In some non-limiting embodiments or aspects, device 200 may include additional components, fewer components, different components, or differently arranged components than those shown in FIG. 2. Additionally or alternatively, a set of components (e.g., one or more components) of device 200 may perform one or more functions described as being performed by another set of components of device 200.
- one or more steps of process 300 may be executed by user device 102, eye tracking device 104, and/or user 106. Additionally or alternatively, one or more steps of process 300 may be executed (e.g., completely, partially, and/or the like) by another system, another device, another group of systems, or another group of devices, separate from or including user device 102, eye tracking device 104, and/or user 106.
- process 300 may include displaying data including an eye tracking zone on a user device.
- user device 102 may display data including an eye tracking zone via a display of user device 102.
- user device 102 may display data including the eye tracking zone via a GUI on a display of user device 102.
- the eye tracking zone may be displayed over a software application (e.g., digital illustration application, image editing application, video editing application, and/or the like) being used by a user of user device 102.
- the eye tracking zone may include at least one selectable button which may be used to open a drawing mode over the application.
- a size of the eye tracking zone may be equivalent to a size of the display of user device 102 and/or a percentage of a size (e.g., height, width, and/or area of the display) of the display of user device 102.
- the eye tracking zone may include an activation zone.
- the activation zone may be an area of the eye tracking zone displayed on a display of user device 102.
- a size of the activation zone may be a percentage of the size of the eye tracking zone and/or a percentage of the size of the display of user device 102.
- the size of the activation zone may be between 1% and 40% of the size of the display of user device 102.
- the activation zone may include the area of a single shape (e.g., the space enclosed within a perimeter of the shape) with one or more sides.
- the activation zone may include the area of a circle, oval, triangle, square, rectangle, pentagon, hexagon, etc.
- the activation zone may include an area between two shapes with any number of sides.
- the activation zone may be an area between two concentric circles (e.g., an annulus of the two concentric circles).
- the activation zone may include two concentric circles, where an area of a first circle is smaller than an area of a second circle.
- the area of the first circle and/or the area of the second circle may be a first percentage and/or a second percentage of the size of the display of user device 102, respectively.
- the area of the first circle may be 3% of the size of the display of user device 102 and/or the area of the second circle may be 40% of the size of the display of user device 102.
- the size of the activation zone may be calculated by subtracting the size of the area of the first circle from the size of the area of the second circle.
- the activation zone may be 37% of the size of the display of user device 102.
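The sizing arithmetic described above (subtracting the first circle's share of the display from the second circle's) can be sketched as a small helper; the function name is an assumption, and the 3%/40% figures are the illustrative values from the description:

```python
def activation_zone_pct(inner_pct: float, outer_pct: float) -> float:
    """Size of the annular activation zone, as a percentage of the display,
    computed by subtracting the inner circle's share from the outer circle's."""
    if not 0 <= inner_pct <= outer_pct <= 100:
        raise ValueError("expected 0 <= inner_pct <= outer_pct <= 100")
    return outer_pct - inner_pct

# With the example values: a 3% inner circle and a 40% outer circle
# leave a 37% activation zone.
zone = activation_zone_pct(3, 40)
```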
- the area of the first circle may include a dragging zone.
- the center of the first circle may be used to execute a drag action.
- the dragging zone may replace the cursor.
- a central point within the dragging zone may replace the cursor.
- an outer edge of the eye tracking zone and/or an outer edge of the activation zone may include an icon and/or selectable button.
- the outer edge of the eye tracking zone and/or the outer edge of the activation zone may include an icon and/or selectable button that, when selected by the user, will close the activation zone and/or end a drawing mode.
- an outer edge of the second circle may include the icon and/or selectable button.
- user device 102 and/or eye tracking device 104 may receive gaze data indicating a user’s selection of the icon and/or selectable button.
- user device 102 and/or eye tracking device 104 may stop movement of the cursor, close the activation zone (causing the activation zone to disappear from the display of user device 102), and/or end the drawing mode.
- process 300 may include receiving gaze data.
- user device 102 and/or eye tracking device 104 may receive gaze data from user 106 via one or more sensors (e.g., image sensors).
- the gaze data may include eye image data.
- the gaze data may include images of the user’s eye(s).
- upon receiving the gaze data, user device 102 and/or eye tracking device 104 may process the gaze data.
- eye tracking device 104 may filter, edit, or alter the gaze data using image processing techniques.
- eye tracking device 104 may transmit the gaze data to user device 102.
- user device 102 may receive the gaze data from eye tracking device 104 and/or process the gaze data. In some non-limiting embodiments, user device 102 may display data on a display of user device 102 based on receiving and/or processing the gaze data.
- process 300 may include determining a location of a gaze point.
- user device 102 and/or eye tracking device 104 may determine a location of a gaze point based on the gaze data.
- user device 102 and/or eye tracking device 104 may determine whether the location of the gaze point is within the activation zone.
- user device 102 and/or eye tracking device 104 may determine the location of the gaze point is within the annulus of the activation zone.
- determining the location of the gaze point is within the activation zone may include determining a time the location of the gaze point is within the activation zone exceeds a threshold value.
- user device 102 and/or eye tracking device 104 may determine the time the location of the gaze point is within the activation zone exceeds the threshold value.
- the threshold value may be an adjustable value and/or a preset value.
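The dwell check described above (the gaze point remaining inside the activation zone for longer than an adjustable or preset threshold) might be sketched as follows; the `DwellTimer` name and clock-injection design are assumptions, not part of the disclosure:

```python
import time


class DwellTimer:
    """Tracks how long the gaze point has stayed inside the activation zone
    and reports when that dwell time exceeds a threshold value."""

    def __init__(self, threshold_s: float, clock=time.monotonic):
        self.threshold_s = threshold_s  # adjustable and/or preset value
        self.clock = clock
        self._entered_at = None

    def update(self, inside_zone: bool) -> bool:
        """Call once per gaze sample; returns True once the time within the
        activation zone exceeds the threshold. Leaving the zone resets it."""
        if not inside_zone:
            self._entered_at = None
            return False
        now = self.clock()
        if self._entered_at is None:
            self._entered_at = now
        return (now - self._entered_at) > self.threshold_s
```

Injecting the clock keeps the timer testable without real waiting, which is also how an implementation could swap in an eye tracker's sample timestamps.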
- process 300 may include moving a cursor in a direction of the gaze point.
- user device 102 may control movement of a cursor and move the cursor in a direction of the gaze point (e.g., toward the gaze point) based on the location of the gaze point within the activation zone.
- the cursor may appear on a display of user device 102.
- moving the cursor in the direction of the gaze point may include determining a location of the cursor; determining a distance between the location of the gaze point and the location of the cursor; and/or moving the cursor in the direction of the gaze point.
- user device 102 and/or eye tracking device 104 may determine a location of the cursor; determine a distance between the location of the gaze point and the location of the cursor; and/or move the cursor in the direction of the gaze point.
- user device 102 and/or eye tracking device 104 may move the cursor from a first location in the direction of the gaze point to a second location at a speed.
- the speed of the cursor movement may be based on a distance between the location of the gaze point and the first location of the cursor.
- user device 102 and/or eye tracking device 104 may stop movement of the cursor.
- user device 102 and/or eye tracking device 104 may stop movement of the cursor based on the gaze data. In some non-limiting embodiments, user device 102 and/or eye tracking device 104 may stop the movement of the cursor when the location of the gaze point is not within the activation zone.
- the activation zone may move across the display of user device 102 (e.g., in any direction).
- the activation zone may move across the display of user device 102 based on the gaze data and/or the location of the gaze point.
- process 300 may include causing an action.
- user device 102 may cause an action to be performed based on the gaze data.
- user device 102, in response to moving the cursor in a direction of the gaze point, may cause an action, such as drawing a line, to be performed.
- user device 102, in response to moving the cursor in a direction of the gaze point, may draw an image, edit a photo, edit a video, scrub a video, and/or the like.
- the action may be ending a prior action.
- the action may be ending the action of drawing an image, editing a photo, editing a video, scrubbing a video, and/or the like.
- process 300 may include displaying data.
- user device 102 may display data via a display of user device 102.
- the data may be associated with the action.
- the data may be displayed via an interactive GUI on a display of user device 102.
- the gaze data may be used to draw an image which may be displayed on the display of user device 102.
- FIG. 3B is a flow diagram of a process 314 for drawing and fine-tuned motor controls according to non-limiting embodiments.
- one or more steps of process 314 may be executed by user device 102, eye tracking device 104, and/or user 106. Additionally or alternatively, one or more steps of process 314 may be executed (e.g., completely, partially, and/or the like) by another system, another device, another group of systems, or another group of devices, separate from or including user device 102, eye tracking device 104, and/or user 106.
- process 314 may include receiving gaze data.
- user device 102 and/or eye tracking device 104 may receive gaze data from user 106, where the gaze data includes image data of the user’s eye(s) including a location of the gaze point, where the location of the gaze point indicates a location where the user is looking on a display of user device 102.
- process 314 may include determining whether the location of the gaze point is inside the activation zone. For example, user device 102 and/or eye tracking device 104 may determine whether the location of the gaze point is inside the activation zone based on the gaze data, where the activation zone includes two concentric circles, and where the location of the gaze point is within the annulus of the two concentric circles. In some non-limiting embodiments, if the location of the gaze point is within the activation zone, the process will proceed to step 320. Alternatively, if the location of the gaze point is not within the activation zone, the process will proceed to step 324.
- process 314 may include determining whether the time the location of the gaze point is within the activation zone exceeds the threshold value. For example, after determining the location of the gaze point is within the activation zone, user device 102 and/or eye tracking device 104 may determine the location of the gaze point is within the activation zone for a time exceeding a preset threshold value. In some non-limiting embodiments, if user device 102 and/or eye tracking device 104 determines the time the location of the gaze point is within the activation zone exceeds the threshold value, process 314 will proceed to step 322.
- process 314 will proceed to step 324.
- process 300 may include additional steps, fewer steps, different steps, or differently arranged steps than those shown in FIG. 3A.
- process 314 may include moving the cursor towards the location of the gaze. For example, after determining the time the location of the gaze point is within the activation zone exceeds the threshold value, user device 102 and/or eye tracking device 104 may move the cursor towards the location of the gaze point.
- when moving the cursor towards the location of the gaze point, the activation zone may move with the cursor. For example, the activation zone may move with the cursor such that the cursor remains at the central point of the dragging zone.
- process 314 may include determining the location of the gaze point is over the icon and/or selectable button. For example, after determining the location of the gaze point is not within the activation zone and/or after determining the time the gaze point is located within the activation zone does not exceed the threshold value, user device 102 and/or eye tracking device 104 may determine whether the location of the gaze point is over the icon and/or selectable button located on the outer edge of the activation zone. In some non-limiting embodiments, if user device 102 and/or eye tracking device 104 determines the location of the gaze point is over the icon and/or selectable button, process 314 may proceed to step 326.
- user device 102 and/or eye tracking device 104 may determine whether a time the location of the gaze point is over the icon and/or selectable button exceeds a second threshold value.
- process 314 may include stopping an action. For example, if user device 102 and/or eye tracking device 104 determine the location of the gaze point is over the icon and/or selectable button located at the edge of the activation zone, user device 102 and/or eye tracking device 104 may stop an action (e.g., drawing, editing an image, editing a video, scrubbing a video, etc.).
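The branching of process 314 described above can be condensed into one dispatch function; the return labels are illustrative names for the outcomes of steps 322, 326, and 324, not terms from the disclosure:

```python
def step_314(gaze_in_zone: bool, dwell_exceeded: bool, gaze_over_icon: bool) -> str:
    """One pass through the FIG. 3B decision flow: move the cursor only when
    the gaze has dwelled inside the activation zone past the threshold;
    otherwise check whether the gaze selects the icon that stops the action."""
    if gaze_in_zone and dwell_exceeded:
        return "move_cursor"   # step 322: move cursor towards the gaze point
    if gaze_over_icon:
        return "stop_action"   # step 326: stop drawing/editing
    return "idle"              # step 324 fell through: keep sampling gaze data
```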
- process 314 may include additional steps, fewer steps, different steps, or differently arranged steps than those shown in FIG. 3B.
- FIGS. 4A-4H are an exemplary implementation of the processes shown in FIGS. 3 A and 3B according to non-limiting embodiments.
- implementation 400 may include user device 402, eye tracking device 404, user 406, activation zone 408, gaze point 410, first circle 412, second circle 414, cursor locations 416, 418, 420, 422, icon 424, and/or eye tracking zone 426.
- user device 402 may be the same as, similar to, and/or part of user device 102.
- eye tracking device 404 may be the same as, similar to, and/or part of eye tracking device 104.
- user 406 may be the same as and/or similar to user 106.
- user 406 may calibrate user device 402 and/or eye tracking device 404 to capture eye image data and allow the user to communicate with user device 402 via eye gaze control. After calibrating user device 402 and/or eye tracking device 404, the user may use eye gaze control to communicate with user device 402.
- user device 402 may display eye tracking zone 426 via a display of user device 402.
- the size of eye tracking zone 426 may be smaller than or equivalent to the size of the display of user device 402.
- the size of eye tracking zone 426 may be a percentage (e.g., 40% - 100%) of the display of user device 402.
- the size of eye tracking zone 426 may be based on a user’s calibration of user device 402 and/or eye tracking device 404.
- user 406 may open an application (e.g., a drawing or editing software application, such as Microsoft Paint) on user device 402. Upon opening the application, user 406 may select to open a drawing mode over the application using at least one selectable icon.
- the drawing mode may enable user 406 to obtain fine-tuned control of the cursor to create a drawing and/or edit an image within the open application.
- user device 402 may display a plurality of selectable icons via an interactive GUI on user device 402, where the plurality of icons may include at least a drawing mode icon. Using eye gaze control, the user may select the drawing mode icon and open a drawing mode within eye tracking zone 426.
- the interactive GUI may update the plurality of selectable icons based on a first selection from user 406. For example, after a selection of the drawing mode icon by user 406, the interactive GUI may update to include a hide the guide icon, an adjust target icon, a draw line icon, an undo icon, a redo icon, and/or a close icon.
- eye tracking zone 426 may include activation zone 408 and/or icon 424.
- upon selecting the drawing mode icon and opening the drawing mode over the application, activation zone 408 may appear via an interactive GUI on the display of user device 402.
- drawing mode may include one or more drawing tools which may be chosen by user 406 based on selecting a respective selectable icon.
- the interactive GUI may update to include a selectable icon for a trace tool, an adjust target tool, and/or a draw line tool.
- the trace tool may follow the user’s gaze and provide feedback about the location of the gaze point on the screen.
- the adjust target tool may allow the user to redefine the gaze point in a zoomed and centered window appearing on the display of user device 402, giving the user more control of a starting point of a drawing and/or line.
- the draw line tool may display activation zone 408.
- activation zone 408 may include first circle 412 and/or second circle 414.
- first circle 412 may appear inside of second circle 414.
- first circle 412 and second circle 414 may be concentric circles, where a diameter of first circle 412 is smaller than a diameter of second circle 414.
- the activation zone may include the annulus between the two concentric circles 412, 414.
- any part of the display outside of the perimeter of second circle 414 may be outside of the activation zone. Additionally or alternatively, any part of the display inside the perimeter of first circle 412 may be outside of the activation zone.
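The annulus membership test implied above (outside the perimeter of first circle 412 but inside the perimeter of second circle 414) can be sketched as follows; the coordinates and radii are hypothetical:

```python
import math


def in_activation_zone(gaze: tuple, center: tuple,
                       r_inner: float, r_outer: float) -> bool:
    """True when the gaze point lies in the annulus between the two concentric
    circles: outside the inner circle but inside the outer circle."""
    dist = math.hypot(gaze[0] - center[0], gaze[1] - center[1])
    return r_inner < dist < r_outer
```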
- activation zone 408 may take the place of a cursor on the display of user device 402.
- activation zone 408 may move about the display of user device 402 in place of a cursor pointer.
- a cursor pointer may appear outside of the activation zone and/or inside of the activation zone.
- a cursor pointer may appear outside of the activation zone, at the center of first circle 412.
- activation zone 408 may be used on top of software applications (e.g., digital illustration applications, image editing applications, video editing applications, and/or the like). For example, activation zone 408 may appear over a digital illustration software (e.g., Microsoft Paint) being run on user device 402, where activation zone 408 takes the place of the cursor on the display of user device 402.
- user device 402 and/or eye tracking device 404 may receive gaze data from user 406.
- user 406 may look at or near user device 402 and/or eye tracking device 404 and eye tracking device 404 may receive, collect, process, and/or store the user’s gaze data.
- eye tracking device 404 may transmit (via a wireless and/or a wired connection) the gaze data to user device 402.
- eye tracking device 404 may receive gaze data from a user and provide the gaze data to user device 402 via a connection between the two devices.
- user device 402 may receive gaze data directly from user 406.
- the gaze data may include image data (e.g., eye images), gaze point data (e.g., location of a gaze point, time a gaze point is in a location, etc.), and/or the like.
- user device 402 and/or eye tracking device 404 may determine a location of gaze point 410 based on the gaze data.
- user device 402 and/or eye tracking device 404 may determine the location of gaze point 410 is within activation zone 408.
- user device 402 and/or eye tracking device 404 may determine the location of gaze point 410 is within activation zone 408 if gaze point 410 is in an area between first circle 412 and second circle 414.
- determining the location of gaze point 410 is within the activation zone may include determining the time the location of gaze point 410 is within the activation zone exceeds the threshold value.
- user device 402 and/or eye tracking device 404 may move cursor 416 in a direction of gaze point 410.
- user device 402 and/or eye tracking device 404 may move the cursor in a direction A, B, and/or C of gaze point 410 based on the gaze data (e.g., a location of the gaze point).
- the cursor may be moved from a first cursor location 416 in a direction, A, to a second cursor location 418.
- the cursor may be moved from the second cursor location 418 in a direction, B, to a third cursor location 420.
- the cursor may be moved from the third cursor location 420 in a direction, C, to a fourth cursor location 422.
- moving the cursor 416 in the direction, A, of gaze point 410 may include determining a location of the cursor (e.g., a cursor location).
- user 406 may interact with user device 402 and/or eye tracking device 404 and control the movement of the cursor appearing on the display of user device 402 by changing a parameter of a plurality of parameters.
- the plurality of parameters may include at least a direction of cursor movement and a speed of cursor movement.
- the direction of the cursor may be determined by moving the cursor toward gaze point 410.
- a cursor location may be determined based on the following equation, where M_new is the second cursor location (e.g., new cursor location), M_old is the first cursor location (e.g., old cursor location), and where m_mouse is a vector describing the movement of the cursor: M_new = M_old + m_mouse
- the movement of the cursor, m_mouse, may be defined by: m_mouse = (S_min + s(S_max − S_min)) · d̂
- the speed of the cursor movement may be determined based on a distance between the cursor and gaze point 410.
- S_max and S_min may describe a maximum speed and a minimum speed of the cursor movement, respectively.
- a value of S_max and/or S_min may be a constant value.
- a speed of S_max and/or S_min may be measured using a relative speed unit. For example, a speed for S_max and/or S_min may be measured by a speed per move (spm).
- S_max may have a value of 150 spm.
- S_min may have a value of 20 spm.
- moving the cursor 416 in the direction, A, of gaze point 410 may include determining a distance between the location of the gaze point and the location of the cursor.
- d̂ may be a normalized vector of d, where d is a vector from a cursor location, m_p, to a gaze location, g_p.
- d may be defined by: d = g_p − m_p
- moving the cursor 416 in the direction, A, of gaze point 410 may include moving the cursor in the direction of the gaze point at a speed, wherein the speed of the cursor movement is based on the distance between the location of the gaze point and the first location of the cursor.
- s may be a speed factor of a cursor movement at a given time based on the following: s = (|d| − D_min) / (D_max − D_min)
- the s may be a factor between 0 and 1.
- D min may be a minimum distance between m p and g p required for cursor movement and/or D max may be a maximum distance between m p and g p allowed before the cursor movement will stop.
- cursor movement may be based on the following: the cursor moves if D_min < |d| < D_max; otherwise (if |d| ≤ D_min or |d| ≥ D_max), cursor movement stops.
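The movement model described above can be combined into one sketch. The S_MAX/S_MIN values are the example spm figures from the description; the D_MIN/D_MAX pixel values and the exact speed interpolation are assumptions consistent with s being a factor between 0 and 1:

```python
import math

S_MAX, S_MIN = 150.0, 20.0   # example max/min speeds, in relative "spm" units
D_MIN, D_MAX = 10.0, 400.0   # hypothetical distance bounds, in pixels


def move_cursor(m_p: tuple, g_p: tuple) -> tuple:
    """One movement step M_new = M_old + m_mouse, where m_mouse points along
    the normalized gaze-to-cursor vector d̂ and its magnitude is interpolated
    between S_MIN and S_MAX by the factor s."""
    dx, dy = g_p[0] - m_p[0], g_p[1] - m_p[1]   # d = g_p - m_p
    dist = math.hypot(dx, dy)                    # |d|
    if dist <= D_MIN or dist >= D_MAX:
        return m_p                               # no movement outside (D_MIN, D_MAX)
    s = (dist - D_MIN) / (D_MAX - D_MIN)         # factor between 0 and 1
    speed = S_MIN + s * (S_MAX - S_MIN)
    return (m_p[0] + speed * dx / dist,          # step along d̂ at the chosen speed
            m_p[1] + speed * dy / dist)
```

Stopping when the gaze is very close (below D_MIN) avoids jitter around the cursor; stopping beyond D_MAX means a large saccade, such as a glance away from the drawing, does not drag the cursor with it.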
- in response to moving the cursor in a direction (e.g., A, B, C) of the gaze point, an action may be performed.
- moving the cursor between successive cursor locations may cause a line to be drawn between first cursor location 416 and second cursor location 418, between second cursor location 418 and third cursor location 420, and/or between third cursor location 420 and fourth cursor location 422.
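Pairing successive cursor locations (e.g., 416 → 418 → 420 → 422) into the segments a drawing action would render might look like this; the helper name is hypothetical:

```python
def polyline_segments(cursor_locations: list) -> list:
    """Pair each cursor location with the next one, yielding the line
    segments to draw between successive cursor positions."""
    return list(zip(cursor_locations, cursor_locations[1:]))
```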
- data associated with the action may be displayed via the display of user device 402.
- eye tracking zone 426 may include an icon at an edge of eye tracking zone 426.
- eye tracking zone 426 may include an icon 424 at an edge of the display of user device 402. In some non-limiting embodiments or aspects, icon 424 may appear within eye tracking zone 426 on a side opposite of activation zone 408.
- icon 424 may be a selectable icon.
- icon 424 may be selected by user 406 to end the drawing mode.
- user device 402 and/or eye tracking device 404 may receive a selection of the icon from user 406.
- User 406 may select the icon 424 by focusing the location of the gaze point 410 over the icon.
- user device 402 and/or eye tracking device 404 may stop the movement of the cursor based on receiving the selection of the icon 424 from user 406.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP23785872.5A EP4594845A1 (en) | 2022-09-27 | 2023-09-27 | Method, system, and computer program product for drawing and fine-tuned motor controls |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202263410382P | 2022-09-27 | 2022-09-27 | |
| US63/410,382 | 2022-09-27 | ||
| US18/372,789 | 2023-09-26 | ||
| US18/372,789 US12204689B2 (en) | 2022-09-27 | 2023-09-26 | Method, system, and computer program product for drawing and fine-tuned motor controls |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024069473A1 true WO2024069473A1 (en) | 2024-04-04 |
Family
ID=88291138
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2023/059613 Ceased WO2024069473A1 (en) | 2022-09-27 | 2023-09-27 | Method, system, and computer program product for drawing and fine-tuned motor controls |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2024069473A1 (en) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7561143B1 (en) * | 2004-03-19 | 2009-07-14 | The University of the Arts | Using gaze actions to interact with a display |
| CN111949131A (en) * | 2020-08-17 | 2020-11-17 | 陈涛 | Eye movement interaction method, system and equipment based on eye movement tracking technology |
Non-Patent Citations (4)
| Title |
|---|
| HUANG LIDA ET AL: "Eyes can draw: A high-fidelity free-eye drawing method with unimodal gaze control", INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES, vol. 170, 102966, 1 February 2023 (2023-02-01), AMSTERDAM, NL, pages 1 - 21, XP093104880, ISSN: 1071-5819, DOI: 10.1016/j.ijhcs.2022.102966 * |
| HUANG LIDA ET AL: "Improving and Analyzing Sketchy High-Fidelity Free-Eye Drawing", ADVANCES IN ARTIFICIAL INTELLIGENCE, ACM, 2 PENN PLAZA, SUITE 701NEW YORKNY10121-0701USA, 10 July 2023 (2023-07-10), pages 856 - 870, XP059151267, ISBN: 978-1-4503-6583-3, DOI: 10.1145/3563657.3596121 * |
| PORTA MARCO ET AL: "ceCursor, a contextual eye cursor for general pointing in windows environments", PROCEEDINGS OF THE 27TH ANNUAL INTERNATIONAL CONFERENCE ON MOBILE COMPUTING AND NETWORKING, ACM, NEW YORK, NY, USA, 22 March 2010 (2010-03-22), pages 331 - 337, XP058751900, ISBN: 978-1-4503-8342-4, DOI: 10.1145/1743666.1743741 * |
| YEO ALVIN W ET AL: "Gaze estimation model for eye drawing", 2005 43RD ACM/IEEE DESIGN AUTOMATION CONFERENCE, IEEE, PISCATAWAY, NJ, USA, 21 April 2006 (2006-04-21), pages 1559 - 1564, XP058497553, ISBN: 978-1-59593-381-2, DOI: 10.1145/1125451.1125736 * |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20250130637A1 (en) | Method, System, and Computer Program Product for Drawing and Fine-Tuned Motor Controls | |
| US11816330B2 (en) | Display device, display controlling method, and computer program | |
| US10515610B2 (en) | Floating window processing method and apparatus | |
| EP4530701A2 (en) | Beacons for localization and content delivery to wearable devices | |
| US12369207B2 (en) | Prompt information display method and apparatus and electronic device | |
| EP3042265B1 (en) | Information processing apparatus, information processing method, and program | |
| US20210405761A1 (en) | Augmented reality experiences with object manipulation | |
| US9891707B2 (en) | Information processing device, information processing method, and program for controlling a state of an application by gaze position | |
| CN107533374B (en) | Dynamic switching and merging of head, gesture, and touch input in virtual reality | |
| CN103782255B (en) | The eye of vehicle audio entertainment system moves Tracing Control | |
| KR102180961B1 (en) | Method for processing input and an electronic device thereof | |
| US20170347153A1 (en) | Method of zooming video images and mobile terminal | |
| US20090249257A1 (en) | Cursor navigation assistance | |
| JP2017534993A (en) | System and method for controlling a cursor based on finger pressure and direction | |
| CN118331428A (en) | Eye gaze control to zoom in on a user interface | |
| US12268955B2 (en) | Context-sensitive remote eyewear controller | |
| US20240420382A1 (en) | Curated contextual overlays for augmented reality experiences | |
| EP4254143B1 (en) | Eye tracking based selection of a user interface element based on targeting criteria | |
| WO2024069473A1 (en) | Method, system, and computer program product for drawing and fine-tuned motor controls | |
| US20160291703A1 (en) | Operating system, wearable device, and operation method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23785872 Country of ref document: EP Kind code of ref document: A1 |
|
| DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | ||
| WWE | Wipo information: entry into national phase |
Ref document number: 2023785872 Country of ref document: EP |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2023785872 Country of ref document: EP Effective date: 20250428 |
|
| WWP | Wipo information: published in national office |
Ref document number: 2023785872 Country of ref document: EP |