WO2023111365A1 - Gaze direction tracking system - Google Patents
Gaze direction tracking system
- Publication number
- WO2023111365A1 (PCT/ES2021/070891)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- eye
- light sources
- light
- camera
- controller
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/02—Subjective types, i.e. testing apparatus requiring the active assistance of the patient
- A61B3/028—Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing visual acuity; for determination of refraction, e.g. phoropters
- A61B3/032—Devices for presenting test symbols or characters, e.g. test chart projectors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/145—Illumination specially adapted for pattern recognition, e.g. using gratings
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10152—Varying illumination
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- The present invention relates to an eye analysis system; more specifically, to a gaze direction tracking system.
- Eye tracking is the process of evaluating either the point at which the gaze is fixed (where we are looking) or the rotation of the eye relative to the head.
- Gaze tracking systems ("eye-trackers") are known and widely used in commercial applications such as video games, hands-free computer interaction interfaces, analysis of user visual interest in complex environments, etc.
- The direction of gaze is the line that joins the active PRL (preferred retinal locus) with the center of rotation of the eye.
- The PRL can be found outside the fovea.
- A user can use more than one PRL, each at its own position fixed relative to the eye, but only one PRL can be active at any given time. This implies that the direction of gaze, that is, the line that joins the active PRL with the center of eye rotation, changes when the user changes PRL. However, the measurement precision required to detect these changes exceeds what is available in the state of the art.
- Gaze tracking systems using video imaging typically comprise two or more infrared LEDs, to induce reflections on the visible surface of the user's eye (i.e., the cornea and/or sclera), and an infrared camera to capture an image of the user's eye, including the flashes caused by such reflections.
- the LEDs are located symmetrically on a flat ring around the camera.
- the gaze direction calculation is performed by a controller, with at least one processor, which collects the image captured by the infrared camera.
- This calculation is carried out image by image, the images being obtained under constant infrared lighting, and consists of determining the center of rotation of the eye (through which the gaze direction line passes), as well as the contour of the pupil and the relative position of the flashes with respect to that contour.
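As a rough illustration of the flash-detection step in such systems, the following Python sketch thresholds a grayscale infrared frame and groups bright pixels into connected regions, returning one centroid per flash. The threshold value, the synthetic image, and the function name are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def detect_glints(img, thresh=200):
    """Detect bright flashes (glints) in a grayscale IR image.

    Returns a list of (row, col) centroids, one per connected bright
    region, found with a simple stack-based flood fill.
    """
    mask = img >= thresh
    visited = np.zeros_like(mask, dtype=bool)
    centroids = []
    h, w = mask.shape
    for r in range(h):
        for c in range(w):
            if mask[r, c] and not visited[r, c]:
                # Flood-fill one connected bright region.
                stack, pixels = [(r, c)], []
                visited[r, c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                ys, xs = zip(*pixels)
                centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids

# Synthetic test frame: two 3x3 bright spots on a dark background.
img = np.zeros((60, 80), dtype=np.uint8)
img[10:13, 20:23] = 255
img[40:43, 60:63] = 255
print(sorted(detect_glints(img)))  # [(11.0, 21.0), (41.0, 61.0)]
```

A real pipeline would of course operate on camera frames and reject spurious specular highlights, but the thresholding-plus-labeling structure is the same.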
- The image being analyzed is a flat (2D) image that lacks depth information, which makes these systems not very robust to variations in the user's position, specifically to variations in the distance of the eye and/or the pupil from the camera.
- These systems perform their calculations relying on assumptions that are far from reality, such as: i) that the user's pupil is always an ellipse; ii) that the pupil moves jointly with the eye; and iii) that the user's direction of gaze passes, at all times, through the geometric center of the pupil ellipse.
- The flash does not allow the contour of the pupil to be detected with quality, nor the geometric coordinates of the pupil to be determined with respect to that reference point, which deteriorates the quality of the information used to calculate the gaze direction.
- The position reference between the camera and the eye is temporarily lost (in the worst case, the pupil may fall outside the camera's field of view). In these cases, image quality is lost, further deteriorating the accuracy of the gaze tracking system.
- The user's eye produces between 20 and 40 fixations in that time, along with other eye movements (at least drift and microsaccades, and occasionally saccades) around the area in which the stimulus is displayed, introducing a significant error into the calibration.
- A gaze direction tracking system is therefore needed that takes into account the true 3D geometry of the eye and reliably detects the contour of the pupil at all times, thereby obtaining more accurate results.
- The object of the present invention is a system for tracking the direction of a user's gaze, which comprises: a plurality of light sources that emit infrared light on at least one eye of the user; a camera configured to capture infrared images of the eye; and a controller comprising at least one processor, wherein the controller is configured to: detect a plurality of flashes in an image of the eye captured by the camera, where each flash is a reflection of one of the light sources on the visible surface of the eye; and establish a correspondence between each detected flash and the light source that generated it. The system is characterized in that the plurality of light sources is arranged in the form of a matrix covering at least partially the visible surface of the eye, and in that the controller is further configured to: analyze the image of the eye and determine to which region of the eye each pixel of the image belongs; and generate a 3D geometry of the eye from the plurality of flashes relative to an absolute geometric reference determined by the position of the camera and the plurality of light sources.
- the plurality of light sources emit light toward an eye of the user, such that each light source generates flashes on the visible surface of the eye (either the cornea or the sclera of the eye).
- the camera captures an image of the eye, said image being analyzed by the controller, determining to which region of the eye (for example, pupil, iris, sclera, eyelid, eyelashes or corners) each pixel of the image belongs.
- the plurality of light sources are arrayed at different distances from each other vertically and horizontally, covering at least partially (or completely) the visible surface of the eye.
- This matrix distribution of the plurality of light sources produces a distribution of flashes whose relative distances change depending on the depth of the region of the eye in which they are produced, which allows the 3D geometry to be generated.
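A minimal sketch of why flash spacing carries depth information, under a simplified pinhole-projection model. Real flashes form on the convex corneal mirror, so this is only a first-order approximation, and all numeric parameters (focal length, source baseline, working distance) are assumptions:

```python
# Under pinhole projection, the image-plane distance between two
# neighbouring flashes shrinks as 1/depth, so a calibrated constant
# k = f * baseline lets measured spacing be inverted into depth.
f_px = 800.0        # assumed focal length in pixels
baseline_mm = 6.0   # assumed effective separation of two neighbouring flashes

def spacing_px(depth_mm_val):
    """Projected flash spacing (pixels) for an eye surface at a given depth."""
    return f_px * baseline_mm / depth_mm_val

def depth_mm(spacing):
    """Invert a measured spacing back into depth."""
    return f_px * baseline_mm / spacing

s = spacing_px(45.0)          # flashes observed with the eye surface at 45 mm
print(round(depth_mm(s), 1))  # 45.0
```

With a whole matrix of sources, each local spacing yields a local depth sample, which is how a depth map of the visible surface can be assembled.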
- Each flash can be easily correlated with its corresponding light source. Producing a wide distribution of flashes on the surface of the eye reduces the problems caused by blinking or eyelid droop, as flashes remain visible in the eye image even under these conditions.
- Since the ocular center of rotation remains in a constant position relative to the eye, when the controller compares two images of the eye corresponding to two gaze directions (for example, those obtained after a rotation of the eye), it can establish the ocular center of rotation more precisely, which is critical for the gaze direction calculation.
- The controller is further configured to: detect characteristic eye movements in a series of images of the eye captured by the camera during a gaze fixation; estimate a point outside the eye on the gaze direction line based on the characteristic movements of the eye in said fixation; and define an active PRL of the eye in the 3D geometry of the eye as the point where the gaze direction line, defined by the center of eye rotation and the point outside the eye, intersects the back of the 3D geometry of the eye.
- Fixation is the name given to the visual process by which the user positions the image of a stimulus on his active PRL and by which his visual system extracts information about its content in spatial frequencies (and other information, such as color) from what he sees. To obtain this information, the eye must move with a characteristic movement called drift, interrupted by others called microsaccades that reposition the image of the stimulus on the fovea.
- the present system uses the characteristic movements of the eye in a fixation. These movements are involuntary and occur exclusively when there is a stimulus on which the user can fix his gaze. If, for example, this stimulus is unique in the user's virtual visual space and the user's eye shows fixation movements, it can be concluded that the user is fixating on that stimulus.
- Since the PRL determined in this way is integral with the eye, it serves for the automatic determination of the gaze direction as the direction of the axis that passes through the center of ocular rotation and the active PRL.
- the determination of the direction of gaze in the present system is defined as the line or axis that joins the active PRL (whether or not it is in the fovea) with the center of rotation of the eye.
- This line is completely independent of the pupil and has no relation to the contour of the pupil nor does it have to pass through its center.
- The plurality of light sources comprises at least two groups of light sources configured to emit light alternately, i.e., when the light sources of one group are on, the rest of the light sources are off.
- This alternation between groups of light sources is performed periodically, preferably every two images captured by the camera, at a sampling frequency much higher than that of eye movements. This allows the alternation to occur frequently enough that the rotation of the eye between the two images is negligible, yielding satisfactory results.
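The alternation schedule can be sketched in a few lines; the function name and the two-group layout are illustrative assumptions rather than the patent's control logic:

```python
def active_group(frame_index, n_groups=2):
    """Return which LED group is lit for a given camera frame.

    Groups alternate every frame, so any two consecutive images are
    lit by complementary sets of sources.
    """
    return frame_index % n_groups

# Consecutive frames alternate between group 0 and group 1.
print([active_group(i) for i in range(6)])  # [0, 1, 0, 1, 0, 1]
```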
- At least one of the following techniques can be used, in isolation or in combination, to produce a flash with reduced position uncertainty.
- The use of different techniques for determining this absolute geometric reference on the visible surface of the eye means that, in the event that one of these techniques stops working, the system can fall back on another and continue determining the absolute geometric reference on the visible surface of the eye.
- At least one light source is configured to emit a light beam of a different geometric size than the rest of the plurality of light sources, the controller being configured to determine the absolute geometric reference on the visible surface of the eye by identifying, in the image of the eye, the flash corresponding to that light source.
- One way to configure a light source so that it emits a light beam of a much smaller geometric size than the rest of the plurality of light sources is to collimate the emitted beam by means of a collimation mask, which converts the diverging beam coming from the source into a parallel beam, thereby reducing its geometric size.
- At least one light source is configured to emit light of a different power than the rest of the plurality of light sources, the controller being configured to determine the absolute geometric reference on the visible surface of the eye by identifying, in the image of the eye, the flash corresponding to that light source.
- At least two light sources are closer to each other than any other pair of the plurality of light sources, the controller being configured to determine the absolute geometric reference on the visible surface of the eye by identifying, in the image of the eye, the flashes corresponding to those two sources.
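A hedged sketch of how a controller might exploit this arrangement: among the detected flash centroids, the closest pair is assumed to correspond to the two sources mounted closer together. The sample data and function are illustrative, not from the patent.

```python
import numpy as np

def reference_pair(glints):
    """Return the indices of the two flash centroids that are closest
    together; with two sources mounted closer than any other pair,
    their flashes anchor the absolute geometric reference.
    """
    g = np.asarray(glints, dtype=float)
    best, pair = float("inf"), None
    for i in range(len(g)):
        for j in range(i + 1, len(g)):
            d = np.hypot(*(g[i] - g[j]))
            if d < best:
                best, pair = d, (i, j)
    return pair

# Four centroids (row, col); the first and third are only ~2.8 px apart.
glints = [(10, 10), (10, 40), (12, 12), (40, 40)]
print(reference_pair(glints))  # (0, 2)
```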
- the system comprises a narrow beam light emitter, the controller being configured to determine the absolute geometric reference on the visible surface of the eye by identifying a reflection on the visible surface of the eye relative to the narrow beam light emitter.
- Preferably, said narrow beam light emitter is centered on the optical axis of the camera, thus improving the determination of the absolute geometric reference. This fixes the geometric relationship between the camera, the plurality of light sources and/or the narrow beam light emitter and the flashes analyzed in each image of the eye captured by the camera, thereby improving the generation of the 3D geometry of the eye.
- the narrow beam light emitter emits an infrared laser.
- This reflection can be achieved equally with either technology, as both emit a narrow light beam that produces a much smaller point reflection than the reflections from the plurality of light sources.
- This more punctual reflection is used by the present system both to determine the absolute geometric reference on the visible surface of the eye and to improve the resolution of its position, using it to refine the position estimates of the other reflections from the plurality of light sources.
- the same narrow-beam light beam that produces the punctual reflection on the visible surface of the eye penetrates the eye to its retina and produces a punctual reflection on the retina.
- the system will incorporate means to capture this spherical wavefront and determine, from it, the instantaneous refraction of the eye (sphere, cylinder and axis).
- the narrow-beam light emitter is also used as an autorefractometer, so that the reflection of the point produced by the laser on the retina makes it possible to determine the current accommodative state of the user.
- This information together with the 3D geometry of the eye and the results of the visual perceptual tests, also makes it possible to interpret the effects of possible corneal alterations of the user, based on the results of the user's perception, his accommodative behavior and his oculomotricity.
- When the narrow beam light emitter emits a laser, the system has to integrate the signal over time to improve the signal-to-noise ratio (due to the speckle phenomenon), whereas when it emits a beam of diffuse collimated light the system can calculate the measurement instantly. Consequently, the latter can take instantaneous refraction measurements at a higher temporal frequency.
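The signal-to-noise benefit of temporal integration can be illustrated numerically: averaging N frames of independent noise improves SNR by roughly the square root of N. The signal level and noise amplitude below are arbitrary assumptions for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
true_signal = 1.0
noise_sigma = 0.5

def snr(n_frames):
    """Empirical SNR after averaging n_frames noisy measurements of a
    constant signal, estimated over 10,000 independent pixels."""
    frames = true_signal + rng.normal(0, noise_sigma, size=(n_frames, 10_000))
    avg = frames.mean(axis=0)
    return true_signal / avg.std()

print(round(snr(1)))   # 2  (single frame: SNR = signal / sigma)
print(round(snr(16)))  # 8  (16-frame average: ~sqrt(16) = 4x improvement)
```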
- Since both narrow beam light emitter options are infrared, they are not visible to the user's eye, so the user is not inconvenienced by the use of this narrow beam light emitter.
- The light sources are preferably infrared light-emitting diodes (LEDs), since with this configuration the user cannot see said light sources and therefore suffers no discomfort, being able to continue performing visual tasks completely normally.
- The camera comprises optics associated with a variable focus system with focus control, which allows the camera to focus correctly on the visible surface of the eye, so that the images obtained by the camera are always optimally focused.
- Alternatively, the camera comprises optics associated with a fixed focus system with a depth of field of at least 5 mm, accommodating depth variations in the relative position of the eye with respect to the absolute reference system produced by user movements, so that the image of the eye captured by the camera is always in focus.
- The present system uses a matrix of light sources that allows a three-dimensional image of the eye (that is, with depth) to be built, which is used in the calculation of gaze direction tracking.
- In order for the camera to capture this dimensionality, the depth of field must have a minimum dimension of 5 mm.
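For intuition, the approximate thin-lens depth-of-field formula, DoF = 2 * N * c * u^2 / f^2 (valid when the working distance u is much larger than the focal length f), can be used to check the 5 mm requirement. The lens parameters below are illustrative assumptions, not values from the patent:

```python
def depth_of_field_mm(f_mm, N, coc_mm, u_mm):
    """Approximate total depth of field (for u >> f):
    DoF = 2 * N * c * u^2 / f^2, all lengths in mm."""
    return 2 * N * coc_mm * u_mm ** 2 / f_mm ** 2

# Assumed example optics: 8 mm lens at f/8, 0.01 mm circle of confusion,
# eye surface at a 60 mm working distance.
dof = depth_of_field_mm(f_mm=8.0, N=8.0, coc_mm=0.01, u_mm=60.0)
print(round(dof, 1))  # 9.0 mm, above the 5 mm minimum
```

Narrower apertures or longer working distances increase the depth of field at the cost of light and resolution, which is the usual trade-off in fixing such optics.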
- Figure 1 shows a sectional view of an example of a human eye, detailing its main parts.
- Figure 2 shows a sectional view of an example of a human eye, detailing the difference between the pupillary axis and gaze direction.
- Figure 3 shows a view of the distribution of light sources in the preferred embodiment of the invention.
- Figure 1 shows an example of a human eye (1), whose function is to transform light energy into electrical signals that are sent to the brain through the optic nerve.
- The visible outer surface of the eye (1) comprises the cornea (1.1), which is the transparent front part of the eye (1) that covers the iris (1.2) and the pupil (1.3), and the sclera (1.4), colloquially called the "white" of the eye (1). The sclera (1.4) is a white membrane that constitutes the outermost layer of the eye (1) and whose function is to shape the eye (1) and protect its internal elements.
- On the inner surface of the eye (1) is the retina (1.5); when light falls on the retina (1.5), a series of chemical and electrical phenomena is triggered, translating into nerve impulses that are sent to the brain via the optic nerve. Likewise, the area of the retina (1.5) especially capable of precise vision (greatest visual acuity) and color perception is called the fovea (1.6).
- the system comprises a plurality of light sources (2.1), more specifically infrared light-emitting diodes (LEDs), which emit infrared light towards at least one eye (1) of the user.
- The plurality of light sources (2.1), preferably between 8 and 20, are arranged in the form of a matrix such that their incidence covers at least partially, or completely, the visible surface of the eye (1). They are preferably arranged around the camera (2.2) and fixed on the same frame, as can be seen in figure 3; this allows a wide distribution of flashes on the visible surface of the eye (1) and avoids the problems caused in the state of the art by blinking or eyelid droop.
- This matrix distribution is not limiting, and the plurality of light sources (2.1) may follow any other distribution as long as it covers at least partially, or completely, the visible surface of the eye (1).
- This infrared light emitted towards the user's eye (1) generates flashes on the visible surface of the eye (1), that is, on the cornea (1.1) and/or sclera (1.4); these infrared flashes are subsequently detected by a sensor capable of detecting infrared light.
- the infrared sensor is a camera (2.2), capable of capturing images of the eye (1) and the corresponding flashes generated by the plurality of light sources (2.1).
- The images of the eye (1) captured by the camera (2.2) are analyzed by a controller, which comprises at least one processor, so that this controller detects the different regions of the eye image (for example, pupil, iris, sclera, eyelid, eyelashes or corners) and the plurality of flashes in the image of the eye (1).
- The present system establishes an absolute geometric reference in the system composed of the camera (2.2) and the plurality of light sources (2.1) (both in fixed relative positions), together with an absolute geometric reference on the visible surface of the eye (1).
- At least one light source (2.1) is configured to emit a light beam of a different geometric size and/or power than the rest of the plurality of light sources (2.1), so that the flash generated on the visible surface of the eye (1) by said source is easily determinable by the controller, with reduced geometric uncertainty, in the analysis of each image captured by the camera (2.2); and/or the present system additionally comprises a narrow beam light emitter (2.3).
- the controller determines the absolute geometric reference on the visible surface of the eye (1) based on at least one flash easily determined by the controller in the analysis of each image captured by the camera (2.2).
- By determining the absolute geometric reference and knowing the correspondence between the plurality of light sources (2.1), the narrow beam light emitter (2.3) and the camera (2.2), the controller determines, in the analysis of each image captured by the camera (2.2), the geometric relationship or distance between each flash generated on the surface of the eye (1) and the light source (2.1) or narrow beam light emitter (2.3) that generated it. Based on the correspondence between the absolute geometric reference, the flashes generated on the visible surface of the eye (1), the camera (2.2) and the absolute geometric reference on the visible surface of the eye (1), the controller generates a 3D geometry of the user's eye (1), and based on said 3D geometry, the controller estimates the center of eye rotation (3.1) by comparing two or more images of the eye (1) after a rotation.
- The center of eye rotation (3.1) indicates the point about which the eye (1) rotates during the visual process, so its correct calculation is of the greatest importance for calculating the gaze direction (3.5).
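A sketch of one way the rotation center could be recovered from two views of corresponding 3D surface points, assuming a rigid rotation about a fixed center: estimate the rotation with the Kabsch algorithm, then solve (I - R)c = t in the least-squares sense. This is an illustrative reconstruction under those assumptions, not the patent's algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def rotation_center(p, q):
    """Recover the fixed center c of a rigid rotation q_i = R (p_i - c) + c.

    R is estimated with the Kabsch algorithm; c then solves
    (I - R) c = t by pseudoinverse (the component of c along the
    rotation axis is unobservable, so the minimum-norm solution
    is returned).
    """
    pc, qc = p.mean(axis=0), q.mean(axis=0)
    H = (p - pc).T @ (q - qc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = qc - R @ pc          # since q = R p + (I - R) c, t = (I - R) c
    return np.linalg.pinv(np.eye(3) - R) @ t

# Synthetic surface points rotated 10 degrees about the z-axis
# through an assumed center c = (1, 2, 0).
theta = np.deg2rad(10)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
c = np.array([1.0, 2.0, 0.0])
p = rng.normal(size=(50, 3))
q = (p - c) @ R.T + c
print(np.round(rotation_center(p, q), 3))  # recovers approximately [1. 2. 0.]
```

In practice several rotations about different axes would be combined so that the center is fully constrained.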
- The controller can easily determine and take into account in its calculations the changes in the relative position of the eye (1) with respect to the camera (2.2), and also determine whether a change in relative position is the cause of any adverse effect on the image quality of the camera (2.2). In that case, the controller can increase or decrease the power of the plurality of light sources (2.1) (for example, more power, if required, when the eye (1) moves away from the camera (2.2), and vice versa) and/or actuate the associated optics of the camera (2.2), if it has variable focus.
- The optics of the camera (2.2) are defined with a depth of field of at least 5 mm and may include, in some configurations, a variable focus system with focus control.
- This determination of the ocular center of rotation (3.1) is critical for the calculation of the gaze direction (3.5), which is defined as the geometric line that joins the active PRL (3.3) (normally present in the fovea (1.6) of the eye (1), although in certain ophthalmic pathologies it is located outside the fovea (1.6)) with the center of ocular rotation (3.1).
- this gaze direction (3.5) is not related to the position of the center (3.2) of the pupil (1.3).
- The active PRL (3.3) is a given group of retinal (1.5) neurons (whether or not they are in the fovea (1.6)); therefore, its geometric position moves jointly with the eye (1).
- a user can use more than one PRL (3.3), each one in its joint position with the eye (1), but there can only be one PRL (3.3) active at any time.
- Figure 2 shows an example of the difference between the pupillary axis (3.6), which is the geometric line that joins the center (3.2) of the pupil with the center of ocular rotation (3.1), and the gaze direction (3.5).
- The improvement obtained with the proposed system reduces the gaze direction (3.5) measurement error by more than 0.5 degrees compared to the state of the art. It should be noted that, in clinical applications, it is of interest to measure displacements of the gaze direction (3.5) of 3 arc seconds, which is the dimension of the drift paths during a fixation.
- Fixation is the visual process that allows a subject to maintain the projection of an external visual stimulus on his fovea (1.6) and, through a continuum of small involuntary eye movements (rotations) combining drift movements and microsaccades, extract visual information related to its content in spatial frequencies.
- an error of 0.5° implies an error of 1,800 arc seconds, that is, an error 600 times greater than the quantity to be measured.
- The controller is configured to detect the characteristic movements of the eye (1) during a fixation in a series of images of the eye (1) captured by the camera (2.2) and to calculate the angular position of a point (3.4) outside the eye (1) that determines the gaze direction (3.5).
- Fixation movements are involuntary and occur exclusively when there is a stimulus on which the user can fix his gaze. If, for example, this stimulus is unique in the user's virtual visual space and the user's eye (1) shows fixation movements, it can be concluded that the user's eye (1) is fixating on that stimulus.
- the eye (1) must rotate with a characteristic movement called drift, interrupted by other so-called microsaccades that reposition the stimulus image on the active PRL (3.3).
- Drift consists of small position jumps of a few arc minutes (from 3 to 30) that develop consecutively with random-walk statistics, with short periods persistent in direction followed by short periods anti-persistent in direction.
- Microsaccades are position jumps an order of magnitude larger and ballistic in nature (i.e., with a constant relationship between maximum velocity and amplitude).
- a typical fixation lasts between 150 ms and 250 ms.
- The system can measure fixation movements (drift and microsaccades). If the stimulus is small (at the user's spatial visibility limit), the system can determine the direction of the point (3.4) outside the eye (1) (whose distance to the eye (1) is not relevant for these calculations) during a fixation, defining the gaze direction line (3.5) as the line joining the center of ocular rotation (3.1) with the position of a central measure of the statistical distribution of ocular rotations during that fixation (for example, taking the point (3.4) outside the eye (1) to be at the centroid of the rotations).
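The centroid-of-rotations idea can be sketched with simulated fixation data: a drift random walk plus one microsaccadic jump, with the centroid of the angular samples taken as the fixated direction. All magnitudes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated gaze-rotation samples during one fixation (arc minutes):
# a drift random walk of small steps...
samples = np.cumsum(rng.normal(0, 0.5, size=(200, 2)), axis=0)
# ...plus one microsaccadic jump, an order of magnitude larger,
# that shifts all subsequent samples.
samples[80:] += np.array([8.0, -6.0])

# The external fixation point direction is taken as a central measure
# of the distribution of rotations; here, the centroid.
centroid = samples.mean(axis=0)
print(np.round(centroid, 2))
```

A robust central measure (for example, the median) could replace the mean if microsaccades should weigh less in the estimate.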
- The controller is capable of estimating an active PRL (3.3) of the eye (1) in the generated 3D geometry based on the projection, onto the back of the eye (1) where the fovea (1.6) and/or retina (1.5) are located, of the gaze direction axis (3.5) that joins the center of eye rotation (3.1) and the point (3.4) exterior to the eye (1).
- The controller's calculation process associates the 3D coordinates of this active PRL (3.3) with the generated 3D geometry, so that, from that moment on, the gaze direction (3.5), at each instant, is determined by the geometric line that joins that active PRL (3.3) with the center of ocular rotation (3.1).
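Approximating the 3D eye geometry as a sphere about the rotation center, the active PRL is the posterior intersection of the gaze line with the globe. The following is a hedged sketch under that spherical simplification; the center, fixation point and 12 mm radius are assumed values.

```python
import numpy as np

def prl_on_globe(center, outside_pt, radius):
    """Intersect the gaze line (through the rotation center and an
    external fixation point) with a spherical eye globe, keeping the
    posterior intersection (the side away from the external point).
    """
    d = outside_pt - center
    d = d / np.linalg.norm(d)
    # Line: x = center + s * d; sphere about `center`: |x - center| = radius,
    # so s = +/- radius; the posterior point uses s = -radius.
    return center - radius * d

center = np.array([0.0, 0.0, 0.0])      # assumed center of ocular rotation
fix_pt = np.array([0.0, 50.0, 500.0])   # assumed external fixation point
prl = prl_on_globe(center, fix_pt, radius=12.0)
print(np.round(prl, 2))  # a point on the posterior pole, opposite the fixation direction
```

With the full 3D geometry generated by the system, the same ray would instead be intersected with the reconstructed posterior surface rather than an ideal sphere.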
- the present system does not require prior calibration, although, to accommodate anomalous behaviors in cases of certain visual dysfunctions, a statistical treatment of post-processing data is provided.
- the plurality of light sources (2.1) comprises at least two groups of light sources (2.1) configured to emit light alternately.
- the plurality of light sources (2.1) are of variable light intensity (with at least 2 levels) by means of individual electronic control of each light source (2.1) or by groups of light sources (2.1).
- This alternation is controlled so that, of every two consecutive images from the camera (2.2), one image is obtained with the first group of light sources (2.1) activated and the second group deactivated, and the other image with the second group activated and the first group deactivated.
- Figure 3 shows a view of the distribution of the light sources (2.1) and the camera (2.2).
- The narrow beam light emitter (2.3) emits a narrow beam of light, in correspondence with the axis of the camera (2.2), whose reflection on the eye (1) has less position uncertainty than the flashes of the plurality of light sources (2.1), allowing a better calculation of the absolute geometric reference on the surface of the eye (1). Additionally, in a preferred configuration, the narrow beam light emitter (2.3) is used as the light source of an autorefractometer.
- In figure 3 it can be seen how the plurality of light sources (2.1) are arranged in the form of a matrix around the camera (2.2), at different distances from each other vertically and horizontally. This allows a wide and relatively constant distribution of flashes on the visible surface of the eye (1) and reduces the problems caused by blinking or eyelid droop, since flashes remain visible even under these conditions.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Ophthalmology & Optometry (AREA)
- General Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Theoretical Computer Science (AREA)
- Medical Informatics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Multimedia (AREA)
- Artificial Intelligence (AREA)
- Eye Examination Apparatus (AREA)
Abstract
Description
Claims
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP21967966.9A EP4449975A4 (en) | 2021-12-15 | 2021-12-15 | EYESIGHT DIRECTION TRACKING SYSTEM |
| US18/720,141 US20250157073A1 (en) | 2021-12-15 | 2021-12-15 | Gaze direction tracking system |
| PCT/ES2021/070891 WO2023111365A1 (es) | 2021-12-15 | 2021-12-15 | Gaze direction tracking system |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/ES2021/070891 WO2023111365A1 (es) | 2021-12-15 | 2021-12-15 | Gaze direction tracking system |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023111365A1 (es) Gaze direction tracking system | 2023-06-22 |
Family
ID=86773674
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/ES2021/070891 WO2023111365A1 (es), Ceased | Gaze direction tracking system | 2021-12-15 | 2021-12-15 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US20250157073A1 (es) |
| EP (1) | EP4449975A4 (es) |
| WO (1) | WO2023111365A1 (es) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE4039145A1 (de) * | 1989-12-07 | 1991-06-13 | Asahi Optical Co Ltd | Method for detecting the gaze direction of the human eye |
| US20060110008A1 (en) * | 2003-11-14 | 2006-05-25 | Roel Vertegaal | Method and apparatus for calibration-free eye tracking |
| JP2011115460A (ja) * | 2009-12-04 | 2011-06-16 | Saga Univ | Gaze control device, gaze control method, and program therefor |
| CA2750287A1 (en) * | 2011-08-29 | 2011-11-02 | Microsoft Corporation | Gaze detection in a see-through, near-eye, mixed reality display |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8929589B2 (en) * | 2011-11-07 | 2015-01-06 | Eyefluence, Inc. | Systems and methods for high-resolution gaze tracking |
| US8878749B1 (en) * | 2012-01-06 | 2014-11-04 | Google Inc. | Systems and methods for position estimation |
| US9292765B2 (en) * | 2014-01-07 | 2016-03-22 | Microsoft Technology Licensing, Llc | Mapping glints to light sources |
- 2021
- 2021-12-15 US US18/720,141 patent/US20250157073A1/en active Pending
- 2021-12-15 WO PCT/ES2021/070891 patent/WO2023111365A1/es not_active Ceased
- 2021-12-15 EP EP21967966.9A patent/EP4449975A4/en active Pending
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP4449975A4 * |
Also Published As
| Publication number | Publication date |
|---|---|
| EP4449975A4 (en) | 2025-10-22 |
| EP4449975A1 (en) | 2024-10-23 |
| US20250157073A1 (en) | 2025-05-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| ES3039315T3 (en) | | Reliability of gaze tracking data for left and right eye |
| US6659611B2 (en) | | System and method for eye gaze tracking using corneal image mapping |
| CA2685976C (en) | | Methods and apparatus for estimating point-of-gaze in three dimensions |
| US9439592B2 (en) | | Eye tracking headset and system for neuropsychological testing including the detection of brain damage |
| US6655805B2 (en) | | Ophthalmic apparatus |
| US7600873B2 (en) | | Method of determining the spatial relationship of an eye of a person with respect to a camera device |
| JP2023500376A (ja) | | Augmented reality headset for medical imaging |
| WO2013117727A1 (en) | | System for examining eye movements, particularly the vestibulo-ocular reflex and dynamic visual acuity |
| ES2993143T3 (en) | | Eye tracking fixation monitoring systems and methods |
| JPWO2019143844A5 (es) | | |
| ES2965618T3 (es) | | Technique for determining a myopia risk indicator |
| JP2016190029A (ja) | | Vestibular testing apparatus |
| ES2914240T3 (es) | | Ophthalmic device |
| US20190216311A1 (en) | | Systems, Methods and Devices for Monitoring Eye Movement to Test A Visual Field |
| CN113080836A (zh) | | Visual detection and visual training device for non-central fixation |
| JP4102888B2 (ja) | | Wide-viewing-angle fundus blood flow imaging device |
| JP2024138471A (ja) | | System and method for improving the vision of an eye of a viewer having a retinal disorder |
| Ott et al. | | Video-oculographic measurement of 3-dimensional eye rotations |
| ES2943064T3 (es) | | Virtual reality headset for visual neurorehabilitation |
| WO2023111365A1 (es) | | Gaze direction tracking system |
| WO2014016454A1 (es) | | Apparatus for measuring the topography and thickness of the cornea and the measurement method employed |
| CN113080844B (zh) | | Visual detection and visual training device for the preferred retinal locus |
| US7628488B2 (en) | | Method for reproducing a fixation mark for ophthalmological therapeutic equipment |
| RU2648202C2 (ru) | | Method for ophthalmological examination of the visual field |
| US20200281528A1 (en) | | Method and system for diagnosing a disease using eye optical data |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21967966; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 18720141; Country of ref document: US |
| | WWE | Wipo information: entry into national phase | Ref document number: 2021967966; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2021967966; Country of ref document: EP; Effective date: 20240715 |
| | WWP | Wipo information: published in national office | Ref document number: 18720141; Country of ref document: US |