
WO1999005988A2 - An eye tracker using an off-axis, ring illumination source - Google Patents


Info

Publication number
WO1999005988A2
WO1999005988A2 (PCT/US1998/015920)
Authority
WO
WIPO (PCT)
Prior art keywords
eye
camera
axis
subject
image
Prior art date
Application number
PCT/US1998/015920
Other languages
French (fr)
Other versions
WO1999005988A3 (en)
Inventor
Joshua D. Borah
Charles Valois
Original Assignee
Applied Science Laboratories
Priority date
Filing date
Publication date
Application filed by Applied Science Laboratories filed Critical Applied Science Laboratories
Publication of WO1999005988A2 publication Critical patent/WO1999005988A2/en
Publication of WO1999005988A3 publication Critical patent/WO1999005988A3/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18: Eye characteristics, e.g. of the iris
    • G06V40/19: Sensors therefor
    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B2213/00: Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B2213/02: Viewfinders
    • G03B2213/025: Sightline detection

Definitions

  • The array of closely spaced LED's forms a ring of light surrounding the lens aperture. It should be understood, however, that any effective way of producing a ring of light would be acceptable. For example, one could use a single light source and a group of optical fibers to produce the array of individual light sources. Or alternatively, one could use an optical lens system to produce the ring of light. It should also be noted that the retro-reflection can be generated using less than a complete ring of light. Indeed, what was surprising is that it was possible to effectively produce the retro-reflection with an off-axis light source.
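For concreteness, the placement of an evenly spaced ring of LEDs around the viewing axis can be sketched as follows. The 0.73-inch ring diameter is taken from the described embodiment; the function name and units are illustrative, not part of the patent:

```python
import math

def ring_led_positions(n_leds=8, ring_diameter=0.73):
    """Return (x, y) positions, in inches, of n evenly spaced LEDs on a
    ring centered on the camera's viewing axis (the origin)."""
    r = ring_diameter / 2.0
    positions = []
    for k in range(n_leds):
        theta = 2.0 * math.pi * k / n_leds  # even angular spacing
        positions.append((r * math.cos(theta), r * math.sin(theta)))
    return positions
```

Each LED lies at the same radial distance from the axis, so every point on the ring is equally close to coaxial with the camera.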
  • The eye tracking system is not limited to using only a single feature of the eye to compute line-of-gaze. The system may also use multiple features including, for example, the corneal reflection of the light source.
  • In the case of corneal reflection, the feature is typically at a different position from the center of the pupil.
  • Tracking only the position of a single landmark or feature of the eye does not permit the system to distinguish between eye rotation and eye translation with respect to the camera.
  • To make that distinction, further information must be provided, such as could be obtained from a head tracker that indicates the position of the head with respect to the camera.
  • Another source of information can be a second feature on the eye, e.g. corneal reflection. Since the second feature is at a different location from the first feature, the system can eliminate the ambiguity between translation and rotation.
  • The techniques for using two features to perform eye tracking are well known in the art and will not be described here. But it should be understood that the invention is meant to cover systems and methods which use these other techniques in addition to the bright pupil technique.
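The two-feature idea can be illustrated with a short sketch. The helper below is hypothetical (the patent does not specify an implementation); it shows why the pupil-to-corneal-reflection vector, measured in image coordinates, is approximately invariant under pure eye translation but changes under rotation:

```python
def pupil_to_cr_vector(pupil_center, corneal_reflection):
    """Vector from the corneal reflection to the pupil center, in image
    coordinates (pixels).  Under pure eye translation both features move
    together, so this difference vector is approximately unchanged; under
    eye rotation the two features move by different amounts, so the vector
    changes.  Gaze direction can therefore be estimated from this vector."""
    px, py = pupil_center
    cx, cy = corneal_reflection
    return (px - cx, py - cy)
```

Translating both features by the same offset leaves the vector unchanged, while moving the pupil center relative to the corneal reflection (as rotation does) alters it.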
  • Fig. 3 shows a typical eye tracking system which uses the improved light source for bright pupil monitoring.
  • a person whose point of gaze is being measured wears a helmet 110 on which are mounted a visor 112, an eye tracker sensor and optics unit 114, having the features described above, and a head mounted scene camera 116.
  • Visor 112 is coated to be very reflective in the near infra-red but transparent in the visual spectrum and thus allows the person to look through it while at the same time eye tracker sensor and optics unit 114 is able to "look" at a reflection of the person's eye and head-mounted scene camera 116 is able to see a reflection of the field of view of the subject.
  • A stationary scene camera 118 is mounted on the floor in proximity to the person. It may be mounted on a tripod, as shown, or fixed to the environment in some other way. Stationary scene camera 118 is aimed so that one or more of the surfaces of interest (e.g. scene planes 120 and 122) are visible in the camera video image. Scene planes 120 and 122 may be instruments on a control panel, visual presentations, or any other regions of visual interest. When the user looks forward, scene planes 120 and 122 are within the field of view of both the person and head-mounted scene camera 116.
  • Eye tracker sensor and optics unit 114 which incorporates the off-axis light source (e.g. the ring light source) produces a video image that is preprocessed and digitized by an eye tracker electronics unit 124 and sent to a computer 126.
  • Computer 126 which is programmed appropriately, uses the resulting digital information from unit 124 to determine the relative locations of the pupil center and the reflection of the near infrared light source on the front surface of the cornea. From the pupil-to-corneal reflection vector, computer 126 determines the pointing direction of the eye with respect to the head mounted optics.
  • The pointing direction of the eye is represented by two coordinates in computer memory that are proportional to eye azimuth and eye elevation angle, respectively (or the equivalent).
  • Eye tracker electronics unit 124 and computer 126 are both part of an eye-head tracker processor 128, which may be commercially obtained from Applied Science Group, Inc. of Waltham, MA, and is identified as the ASL model 4100H-EHS eye-head tracker system.
  • The pupil to corneal reflection technique for measuring eye pointing direction is described in the literature, and is well-known to those skilled in the art. (See, for example, Young and Sheena, Methods & Designs, Survey of eye movement recording methods, Behavior Research Methods and Instrumentation, 1975, Vol. 7(5), 397-492; Merchant & Morrisette, Remote measurement of eye direction allowing subject motion over one cubic foot of space, IEEE Transactions on Biomedical Engineering, 1974, BME-21, 309-317; and Borah, "Helmet Mounted Eye Tracking for Virtual Panoramic Display Systems", AAMRL-TR-89-019, Harry B. Armstrong Aerospace Medical Research Laboratory, Human Systems Division, Air Force Systems Command, Wright-Patterson AFB, August 1989.)
  • Computer 126 maps the resulting coordinate values to a different set of coordinates which represent a horizontal and vertical location on the video image from head mounted scene camera 116.
  • Because head mounted scene camera 116 moves along with the head and is optically located at the same (or nearly the same) position as the eye, it remains part of the same reference frame as the eye position detection system. In other words, there is a unique relation between eye pointing direction with respect to the head, and point of gaze with respect to the scene camera image.
  • There are many known techniques for mapping eye azimuth and elevation values to such a scene camera image field, including interpolation techniques and curve fit techniques. These techniques are described in the literature and are well known to those practiced in the art.
  • The technique used in the preferred embodiment is a curve fit technique (see, e.g., Sheena & Borah, "Compensation for Some Second Order Effects to Improve Eye Position Measurements", in D.F. Fisher, R.A. Monty, and J.W. Sanders (Eds.): Eye Movements: Cognition and Visual Perception, L. Erlbaum Assoc., 1981).
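As a minimal stand-in for the curve-fit techniques cited, the sketch below solves an exact affine map from three calibration fixations. Real systems typically use more calibration points and higher-order fits; all names here are illustrative, not taken from the patent:

```python
def affine_from_3_points(eye_pts, scene_pts):
    """Fit scene = A * eye + b exactly from three non-collinear calibration
    fixations, where eye_pts are (azimuth, elevation)-like coordinates and
    scene_pts are scene-image coordinates.  Returns coefficient triples
    (a, b, c) for u = a*x + b*y + c, one triple per scene axis."""
    (x1, y1), (x2, y2), (x3, y3) = eye_pts
    det = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)

    def solve(u1, u2, u3):
        # Solve the 2x2 linear system obtained by differencing the points.
        a = ((u2 - u1) * (y3 - y1) - (u3 - u1) * (y2 - y1)) / det
        b = ((u3 - u1) * (x2 - x1) - (u2 - u1) * (x3 - x1)) / det
        c = u1 - a * x1 - b * y1
        return a, b, c

    (u1, v1), (u2, v2), (u3, v3) = scene_pts
    return solve(u1, u2, u3), solve(v1, v2, v3)

def map_gaze(coeffs, eye_pt):
    """Map a raw eye coordinate to scene-image coordinates."""
    (ax, bx, cx), (ay, by, cy) = coeffs
    x, y = eye_pt
    return (ax * x + bx * y + cx, ay * x + by * y + cy)
```

A higher-order (e.g. quadratic) fit of the same form compensates for the second-order effects the cited reference addresses.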
  • The position coordinates of point of gaze with respect to the head-mounted scene camera image can be displayed on a scene monitor 130 as a cursor, cross hairs, or other indicator, superimposed on the video image from head-mounted scene camera 116.
  • the ASL model 4100H-EHS includes the capability for such cursor superimposition.
  • Many other commercially available devices exist which superimpose cursors, cross hairs, or other symbols on a video signal at specified locations, whose coordinates are available in computer memory, and such devices can be used. Note that use of head-mounted scene camera 116 is not a necessary part of the system but is described primarily because it is readily available and is a common part of some eye tracking systems.
  • the eye tracking system in Fig. 3 also includes a head tracker which determines the position and orientation of the person's head.
  • the head tracker is a device based on magnetic principles, such as the 3Space Tracker available from Polhemus, a Kaiser Aerospace & Electronics Company, or The BirdTM available from Ascension Technology, Inc.
  • Other possible embodiments could utilize mechanical goniometers; ultrasonic devices, such as one offered commercially by Logitech, Inc.; optical devices; or any other device that can be used to measure head position and orientation.
  • the magnetic head tracker (MHT) shown in Fig. 3 includes an MHT sensor 132, an MHT transmitter 134, and an MHT control unit 136.
  • MHT sensor 132 is fastened to the subject's helmet, and MHT transmitter 134 is fixed to the environment near the subject's head.
  • MHT control unit 136 determines the position of MHT sensor 132 with respect to MHT transmitter 134 in 6 degrees of freedom and communicates this information to computer 126 via an RS-232 interface.
  • A program in computer 126 uses information from the head tracker, information about eye line of gaze with respect to the head (computed as described above), and stored information about the location of surfaces in the environment (such as scene plane 120 and scene plane 122), to determine the location and direction of the eye line of gaze vector with respect to the environment, the surface intersected by the line of gaze vector, and the location of the intersection point (point P in Fig. 3) with respect to the surface intersected.
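The core geometric step in that computation, intersecting the line-of-gaze ray with a known scene plane, can be sketched as a generic ray-plane intersection (the names and conventions below are illustrative, not from the patent):

```python
def gaze_plane_intersection(origin, direction, plane_point, plane_normal,
                            eps=1e-9):
    """Intersect the line-of-gaze ray (eye position `origin`, gaze
    `direction`) with the plane through `plane_point` with normal
    `plane_normal`.  Returns the 3-D intersection point, or None if the
    gaze is parallel to the plane or the plane lies behind the eye."""
    dot = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(dot) < eps:
        return None  # gaze parallel to the plane
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / dot
    if t < 0:
        return None  # intersection is behind the eye
    return tuple(o + t * d for o, d in zip(origin, direction))
```

Running this test against each stored scene plane and keeping the nearest valid hit identifies both the surface being viewed and the point of gaze on it.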
  • This data field, which is identified as "RS-232 data output", includes the number of the scene plane being viewed, and the horizontal and vertical coordinates of the point of gaze on that surface (with respect to a coordinate frame predefined on that surface).
  • the data can be read by an external device on a standard RS-232 serial data port.
  • A new data field is available at the same update rate as that being used by the camera imaging the subject's eye. This is generally 60 times per second in the USA, when the eye tracker employs standard NTSC video format cameras, or 50 times per second in Europe or other countries, when the eye tracker employs cameras with standard European PAL video format.
  • Computer 126 uses the point of gaze information to determine the location of gaze within the viewed scene as shown by the video scene monitor and it superimposes a cursor or cross hairs on the image displayed on video scene monitor 130.
  • the system can either use an appropriate set of transformations to map the point of gaze onto the scene image or it can be done by first calibrating to establish a reference point on the scene image. This latter approach involves having the subject look at a fixed reference point in the image scene to determine a reference line-of-gaze direction associated with that point. Then any changes in the line-of-gaze can be readily translated into an appropriate change in the point of gaze in the image scene.
  • An alternate embodiment of the stationary scene camera implementation is shown in Fig. 4.
  • In this embodiment, the standard RS-232 data output available from eye-head tracker processor 128 (e.g. the ASL model 4100-EHS) is sent to an external computer 140.
  • External computer 140 is equipped with an NTSC/VGA conversion board 142 such as the Redlake model NTSC 100 Video Digitizer and VGA Overlay Controller.
  • This commercially available board allows computer 140 to display an image from a standard NTSC format video camera 118 on a computer VGA screen 144, and it also allows computer 140 to superimpose VGA graphics on this image.
  • Computer 140 also includes a mouse 146 (or other pointing device) that enables the user to move the cursor about on the video image and it includes programming capable of capturing and recording in memory the VGA coordinates of the cursor at the location at which the mouse is clicked.
  • the stationary scene camera video image is input to NTSC/VGA conversion board 142 in external computer 140.
  • a program in the external computer superimposes a cursor, cross hairs, or other indicator showing the subject's point of gaze, on the VGA image from stationary scene camera 118.


Abstract

A camera assembly for use in an eye tracking apparatus, the camera assembly including a camera with a lens having an axis (50); and a ring shaped light source (12) disposed around the image axis and near the periphery of the lens aperture, the light source oriented to direct light along the camera axis toward the target.

Description

AN EYE TRACKER USING AN OFF-AXIS, RING ILLUMINATION SOURCE
Background of the Invention
Generally, the invention relates to an eye tracker for determining the line-of-gaze of a subject and, more particularly, it relates to an eye tracker that uses an improved light source for illuminating the subject's eye. All practical eye movement measurement or eye tracking techniques involve tracking one or more features or reflections that can be optically detected on the eye. Many of the systems that are available fall into one of two categories, namely, methods that detect a single feature on the subject's eye and methods that detect two features on the subject's eye. One way of establishing a feature is by reflecting a light source off of the eye. One feature that has often been used for this purpose is the pupil. In such systems, the equipment determines the position of the pupil center. Through a simple mathematical transformation, changes in the position of the center of the pupil can be easily converted to an indication of the line-of-gaze of the subject.
There are many known algorithms for finding the pupil center. The most appropriate algorithms depend on the type of sensor being used, the desired measurement update rate, and the amount of computer processing power that is available in the equipment.
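As one simple illustration of the kind of algorithm alluded to (not taken from the patent), a bright pupil can be located as the centroid of pixels above an intensity threshold:

```python
def pupil_center(image, threshold):
    """Estimate the pupil center in a bright-pupil image as the centroid
    of all pixels brighter than `threshold`.  `image` is a list of rows of
    grayscale values; returns (col, row) or None if nothing is bright."""
    total = sx = sy = 0
    for row_idx, row in enumerate(image):
        for col_idx, value in enumerate(row):
            if value > threshold:
                total += 1
                sx += col_idx
                sy += row_idx
    if total == 0:
        return None
    return (sx / total, sy / total)
```

Production systems refine this with edge fitting or ellipse fitting, trading processing power for accuracy, which is exactly the trade-off the paragraph above describes.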
The retina is highly reflective, but any light reflected back through the pupil will be directed towards its original source. In fact, if the eye is focused at the plane of the source, such retro-reflected light from the retina will be imaged back at the source. Under normal viewing conditions, the retina looks completely black because none of the rays reflected off of the retina return to the observer. If, however, the observer is able to look along the axis of an illumination beam, then the observer will see the retinal reflection and the pupil will appear bright.
Many currently available commercial eye tracker systems use the backlit "bright" pupil effect to perform the eye tracking. This is because the bright pupil tends to be easier to recognize than a dark pupil, especially under low light conditions or if the surrounding features are dark. Also, the bright pupil contrast tends to increase as the detector-to-eye distance increases. Thus, systems which use the bright pupil effect tend to operate better with the detector farther from the eye than do systems which use the dark pupil.
It is commonly believed by persons skilled in the art that bright pupil contrast can be adversely affected by any illumination that is not coaxial with the camera or detector. Thus, systems which use the bright pupil technique are designed to keep the illuminating light on the viewing axis. One common approach to achieving this is to use a beam splitter, which typically is a prism with a 45° reflecting surface. The viewing camera looks through the beam splitter (and through the reflecting surface) at the subject's eye. The illumination source, which is off to the side of the viewing axis, directs light at the 45° reflecting surface of the beam splitter, which reflects that light along, and coaxial with, the viewing axis toward the subject. In such implementations, the beam splitter adds to the bulk of the device and it tends to attenuate the light reflected back from the subject's eye to the camera.
Another approach is to place the light source in the middle of the viewing lens. This guarantees that the light is coaxial with the viewing axis, but it also blocks part of the lens and thus reduces its efficiency.
Summary of the Invention
In general, in one aspect, the invention is a camera unit for use in an eye tracking apparatus. The camera unit includes a camera with a lens having an image axis; and a ring shaped light source disposed around the image axis and near the periphery of the lens aperture. The light source is oriented to direct light along the camera axis toward the target.
In general, in another aspect, the invention is an eye line-of-gaze measurement apparatus including an electronic camera with a lens having an image axis; a ring shaped light source disposed around the image axis and near the periphery of the lens aperture and oriented to direct light along the camera axis toward the target; and a digital processor programmed to determine an eye line-of-gaze from the image of a retro-reflection obtained from the subject's eye.
Preferred embodiments include the following features. The ring-shaped light source includes an array of lights arranged in a circle to form a ring. The plurality of light sources are evenly spaced about the circle. The light sources are LED's.
In general, in still another aspect, the invention is a method of generating a retro-reflection from a subject's eye for use in a line-of-gaze measurement system that utilizes a bright pupil detection technique. The method includes the steps of producing an image of the subject's eye by using a camera that is characterized by a viewing axis; and illuminating the subject's eye with an off-axis illumination to produce a retro-reflection from the retina of the subject's eye.
In preferred embodiments, the step of illuminating the subject's eye involves illuminating with a light source that is distributed around the viewing axis. One advantage of the invention is that even though the illumination is placed off axis from the viewing camera it nevertheless produces a surprisingly effective retro-reflection from the eyes of target subjects.
Another advantage of the invention is that it provides a larger and brighter illumination source which both improves the accuracy of the eye tracking system and makes the overall system less sensitive to variations in ambient light conditions. The improved light source more effectively produces a retro-reflection from the retina of the subject's eye. Also, it produces a large enough reflection off of the cornea and at the same time it produces a bright enough retinal retro-reflection from which the point-of-gaze determinations can be made.
Yet another advantage of the invention is that it eliminates the need for inserting between the viewing camera and the subject's eye a beam splitter which also acts to attenuate the intensity of the image obtained by the camera.
Other advantages and features will become apparent from the following description of the preferred embodiment and from the claims.
Brief Description of the Drawings
Fig. 1 is a block diagram of representative components of an eye tracker system including a camera unit which has a ring illumination source;
Fig. 2 is a front view of the ring illumination source that is mounted on the front of the camera unit shown in Fig. 1;
Fig. 3 is an embodiment of an eye-head tracker system including a head-mounted scene camera, a stationary scene camera, and an eye tracker camera which includes the ring illumination source; and
Fig. 4 is an alternative embodiment of an eye-head tracker system including a stationary scene camera.
Description of the Preferred Embodiments
Referring to Fig. 1, an eye tracker system with an improved illumination source includes a solid state camera 10, a ring light source 12 for illuminating the subject's eye 14 (also shown from a front perspective in Fig. 2), and a digital processor 16 for executing the programs that process the camera image to locate the pupil, determine its center, and then compute the line-of-gaze of the subject. The novel aspects of the system are in the design and construction of the ring light source that is used to illuminate the subject's eye and its relationship with the camera lens.
The camera can be any commercially available system that is suitable for generating an image of the eye. For example, it can include a solid state linear or two-dimensional array device. The arrays can be made by clustering photocells in tightly packed linear or two-dimensional arrangements. Alternatively, they can include arrays of thousands of charge coupled devices (CCD's) or charge injection devices (CID's) such as are found in commercially available solid state cameras or video cameras. In the described embodiment, the camera is a Sony EVI-D30 pan/tilt color camera, which is a compact solid state device and which uses a CCD array. The Sony camera that is used in the described embodiment includes a lens system 20 which focuses the image onto a CCD array 22. It also includes internal electronic circuitry 24 which converts the signals from the CCD array into a form that is displayable and analyzable by the image processing components of the system. Of course, any camera that generates an image which can be processed and analyzed to extract feature information would be acceptable.
Since the camera was operated at a distance of between 18 and 40 inches from the subject's eye, we further modified the lens system within the Sony camera to magnify the image to a size that was more useful to the image processing system. This involved adding a positive lens 28 before internal lens system 20 and adding a negative lens 30 after the lens system 20. The added front lens was a plano-convex lens obtained from Edmund Scientific (part no. 44020) with a 20.5 mm diameter and a 415 mm focal length. The added back lens was a negative lens, also obtained from Edmund Scientific (part no. 44090), with a 9.2 mm diameter and a negative focal length of 41 mm.
We also modified the Sony camera by building an addition to the front of the camera which included a plate 40 supporting the ring of illumination 13 that encircles and is in close proximity to the lens opening. The lens system is contained in a housing 21 onto which plate 40 is mounted. Plate 40 has a centrally located circular hole 45 through which the camera views the target scene. The hole in plate 40 is approximately the size of the aperture of the lens. More specifically, the hole is made as small as possible without either compromising the light gathering efficiency of the lens or the quality of the image that is produced. Surrounding hole 45, there is an array of eight evenly spaced LED's 13 mounted on the plate and oriented to direct light toward the target scene. The LED's have built-in lenses which produce a narrower beam than would be generated by the device without the lens. The LED's produce light that is in the near infra-red region, though of course, light of other wavelengths could also be used. The advantage of near infra-red light is that since it is not visible to people, it will not distract them.
In the described embodiment, the LED's were devices sold by Siemens Corporation (part no. SFH 484) which produce a 16° beam at a center wavelength of about 880 nm. Also, in the described embodiment, the lens aperture formed in the plate was about 0.5 inch and the diameter of the ring of light was slightly bigger, e.g., 0.73 inch. We note that it is desirable to mount the ring of light so that its diameter is as small as possible, i.e., so that it is as close as possible to the central viewing axis 50 (see Fig. 1) of the lens, to maximize the retro-reflection that is obtained from the subject's eye. However, as indicated above, the size of the array should not be so small as to interfere with the efficient operation of the lens. If the ring diameter is too large, then the light source will become less effective at producing the retro-reflection. Indeed, at a certain diameter, it will lose all ability to produce a retro-reflection that can be observed by the camera.
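The geometry of such a ring — n evenly spaced sources on a circle just outside the lens aperture, centered on the viewing axis — can be expressed in a few lines. The sketch below uses the quoted dimensions (eight LEDs, 0.73 inch ring diameter) purely for illustration.

```python
import math

def ring_positions(n_leds=8, ring_diameter=0.73):
    """(x, y) positions, in inches, of n evenly spaced LEDs on a ring
    centered on the lens viewing axis (the origin)."""
    r = ring_diameter / 2.0
    return [(r * math.cos(2 * math.pi * k / n_leds),
             r * math.sin(2 * math.pi * k / n_leds))
            for k in range(n_leds)]

for x, y in ring_positions():
    # every LED sits the same small distance (0.365 inch) off the axis
    assert abs(math.hypot(x, y) - 0.365) < 1e-9
```

Keeping that off-axis distance small is exactly the design constraint the text describes: close enough to the axis to capture the retro-reflection, large enough not to obstruct the aperture.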
As we noted, the array of closely spaced LED's forms a ring of light surrounding the lens aperture. It should be understood, however, that any effective way of producing a ring of light would be acceptable. For example, one could use a single light source and a group of optical fibers to produce the array of individual light sources. Or alternatively, one could use an optical lens system to produce the ring of light. It should also be noted that the retro-reflection can be generated using less than a complete ring of light. Indeed, what was surprising is that it was possible to effectively produce the retro-reflection with an off-axis light source. Though we have described the eye tracking system as using the pupil center to determine line-of-gaze, we do not mean to imply that the eye tracking system is limited to using only a single feature of the eye to compute line-of-gaze. In fact, the system may use multiple features including, for example, corneal reflection of the light source. In the case of corneal reflection, the feature is typically at a different position from the center of the pupil.
Tracking only the position of a single landmark or feature of the eye does not permit the system to distinguish between eye rotation and eye translation with respect to the camera. Thus, further information must be provided, such as could be obtained from a head tracker that indicates the position of the head with respect to the camera. Another source of information can be a second feature on the eye, e.g. corneal reflection. Since the second feature is at a different location from the first feature, the system can eliminate the ambiguity between translation and rotation. Of course, the techniques for using two features to perform eye tracking are well known in the art and will not be described here. But it should be understood that the invention is meant to cover systems and methods which use these other techniques in addition to the bright pupil technique.
Examples of two embodiments of a complete system which utilize two features to perform eye tracking are shown in Figs. 3 and 4 and will now be described.
Fig. 3 shows a typical eye tracking system which uses the improved light source for bright pupil monitoring. In the system shown in Fig. 3, a person whose point of gaze is being measured wears a helmet 110 on which are mounted a visor 112, an eye tracker sensor and optics unit 114, having the features described above, and a head mounted scene camera 116. Visor 112 is coated to be very reflective in the near infra-red but transparent in the visual spectrum and thus allows the person to look through it while at the same time eye tracker sensor and optics unit 114 is able to "look" at a reflection of the person's eye and head-mounted scene camera 116 is able to see a reflection of the field of view of the subject. A stationary scene camera 118 is mounted on the floor in proximity to the person. It may be mounted on a tripod, as shown, or fixed to the environment in some other way. Stationary scene camera 118 is aimed so that one or more of the surfaces of interest (e.g. scene planes 120 and 122) are visible in the camera video image. Scene planes 120 and 122 may be instruments on a control panel, visual presentations, or any other regions of visual interest. When the user looks forward, scene planes 120 and 122 are within the field of view of both the person and head-mounted scene camera 116.
Eye tracker sensor and optics unit 114, which incorporates the off-axis light source (e.g. the ring light source), produces a video image that is preprocessed and digitized by an eye tracker electronics unit 124 and sent to a computer 126. Computer 126, which is programmed appropriately, uses the resulting digital information from unit 124 to determine the relative locations of the pupil center and the reflection of the near infrared light source on the front surface of the cornea. From the pupil-to-corneal reflection vector, computer 126 determines the pointing direction of the eye with respect to the head mounted optics. The pointing direction of the eye is represented by two coordinates in computer memory that are proportional to eye azimuth and eye elevation angle, respectively (or the equivalent).
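The core of the pupil-to-corneal-reflection computation described above can be sketched as follows. This is a simplified illustration only: real systems calibrate per-subject gains and offsets (and often nonlinear terms), and the image coordinates and gain values below are hypothetical.

```python
def gaze_angles(pupil_xy, cr_xy, gain_az, gain_el):
    """Map the pupil-to-corneal-reflection vector (in image pixels)
    to eye azimuth and elevation via a per-axis gain.
    The gains would come from a per-subject calibration."""
    dx = pupil_xy[0] - cr_xy[0]
    dy = pupil_xy[1] - cr_xy[1]
    return dx * gain_az, dy * gain_el

# Hypothetical feature locations from one video frame:
az, el = gaze_angles(pupil_xy=(120.0, 95.0), cr_xy=(110.0, 90.0),
                     gain_az=0.8, gain_el=0.8)
print(az, el)  # -> 8.0 4.0
```

Because the corneal reflection and pupil translate together when the head moves but separate when the eye rotates, this difference vector is largely insensitive to small head translations — which is the point of using two features.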
In the described embodiment, eye tracker electronics unit 124 and computer 126 are both part of an eye-head tracker processor 128, which may be commercially obtained from Applied Science Group, Inc. of Waltham, MA, and is identified as the ASL model 4100H-EHS eye-head tracker system.
The pupil-to-corneal reflection technique for measuring eye pointing direction is described in the literature, and is well-known to those skilled in the art. (See, for example, Young and Sheena, Methods & Designs, Survey of eye movement recording methods, Behavior Research Methods and Instrumentation 1975, Vol. 7(5), 397-492; Merchant & Morrisette, Remote measurement of eye direction allowing subject motion over one cubic foot of space, IEEE Transactions on Biomedical Engineering, 1974, BME-21, 309-317; and Borah, "Helmet Mounted Eye Tracking for Virtual Panoramic Display Systems", AAMRL-TR-89-019, Harry B. Armstrong Aerospace Medical Research Laboratory, Human Systems Division, Air Force Systems Command, Wright-Patterson AFB, August 1989.)
Of course, there are also other techniques for determining eye pointing direction with respect to the subject's head and which can be used to place azimuth and elevation eye angle coordinates (or the equivalent) in computer memory. Such techniques are also described in the literature. (See, e.g. Young & Sheena, as referenced above; and Borah, as referenced above.)
Computer 126 maps the resulting coordinate values to a different set of coordinates which represent a horizontal and vertical location on the video image from head mounted scene camera 116. Note that since head mounted scene camera 116 moves along with the head and is optically located at the same (or nearly the same) position as the eye, it remains part of the same reference frame as the eye position detection system. In other words, there is a unique relation between eye pointing direction with respect to the head, and point of gaze with respect to the scene camera image. There are several mapping techniques for mapping eye azimuth and elevation values to such a scene camera image field, including interpolation techniques and curve fit techniques. These techniques are described in the literature, and are well-known to those skilled in the art. The technique used in the preferred embodiment is a curve fit technique (see, e.g. Sheena & Borah, "Compensation for Some Second Order Effects to Improve Eye Position Measurements", in D.F. Fisher, R.A. Monty, and J.W. Senders (Eds.): Eye Movements: Cognition and Visual Perception, L. Erlbaum Assoc., 1981).
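A common form of such a curve fit is a second-order polynomial in azimuth and elevation, one polynomial per image axis, with coefficients obtained from a calibration session. The sketch below only evaluates such a mapping; the coefficient values are hypothetical, and the patent does not specify the exact polynomial used.

```python
def map_to_scene(az, el, coeffs):
    """Second-order polynomial map from eye angles (az, el) to
    scene-image coordinates. coeffs holds one six-term polynomial
    per image axis; the values come from a per-subject calibration."""
    terms = (1.0, az, el, az * el, az * az, el * el)
    return tuple(sum(c * t for c, t in zip(axis, terms))
                 for axis in coeffs)

# Hypothetical calibration: horizontal pixel driven mostly by azimuth,
# vertical pixel mostly by elevation, with small cross and square terms.
coeffs = ((320.0, 10.0, 0.0, 0.1, 0.02, 0.0),   # horizontal pixel
          (240.0, 0.0, -10.0, 0.0, 0.0, 0.03))  # vertical pixel
print(map_to_scene(0.0, 0.0, coeffs))  # -> (320.0, 240.0)
```

During calibration the subject fixates a grid of known scene points, and the coefficients are obtained by least-squares fitting the recorded (az, el) pairs to the known image coordinates.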
The position coordinates of point of gaze with respect to the head mounted scene camera image, as described in the previous paragraph, can be displayed on a scene monitor 130 as a cursor, cross hairs, or other indicator, superimposed on the video image from head-mounted scene camera 116. The ASL model 4100H-EHS includes the capability for such cursor superimposition. In addition, many other commercially available devices exist which superimpose cursors, cross hairs, or other symbols on a video signal at specified locations, whose coordinates are available in computer memory, and such devices can be used. Note that use of head-mounted scene camera 116 is not a necessary part of the system but is described primarily because it is readily available and is a common part of some eye tracking systems.
The eye tracking system in Fig. 3 also includes a head tracker which determines the position and orientation of the person's head. In the described embodiment, the head tracker is a device based on magnetic principles, such as the 3Space Tracker available from Polhemus, a Kaiser Aerospace & Electronics Company, or The Bird™ available from Ascension Technology, Inc. Other possible embodiments could utilize mechanical goniometers; ultrasonic devices, such as one offered commercially by Logitech, Inc.; optical devices; or any other device that can be used to measure head position and orientation. The magnetic head tracker (MHT) shown in Fig. 3 includes an MHT sensor 132, an MHT transmitter 134, and an MHT control unit 136. MHT sensor 132 is fastened to the subject's helmet, and MHT transmitter 134 is fixed to the environment near the subject's head. MHT control unit 136 determines the position of MHT sensor 132 with respect to MHT transmitter 134 in 6 degrees of freedom and communicates this information to computer 126 via an RS-232 interface. A program in computer 126 uses information from the head tracker, information about eye line of gaze with respect to the head (computed as described above), and stored information about the location of surfaces in the environment (such as scene plane 120 and scene plane 122), to determine the location and direction of the eye line of gaze vector with respect to the environment, the surface intersected by the line of gaze vector, and the location of the intersection point (point P in Fig. 3) with respect to the surface intersected. In the illustrated embodiment, this data field, which is identified as "RS-232 data output", includes the number of the scene plane being viewed, and the horizontal and vertical coordinates of point of gaze on that surface (with respect to a coordinate frame predefined on that surface). The data can be read by an external device on a standard RS-232 serial data port.
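Finding point P, once the head tracker has placed the eye and the gaze direction in room coordinates, reduces to a standard ray-plane intersection. The sketch below illustrates that step; the eye position, gaze direction, and plane definition used in the example are hypothetical.

```python
def gaze_plane_intersection(eye_pos, gaze_dir, plane_point, plane_normal):
    """Intersect the line-of-gaze ray with one scene plane.
    eye_pos comes from the head tracker and gaze_dir from the eye
    tracker, both expressed in room coordinates. Returns the 3-D
    intersection point, or None if the gaze does not reach the plane."""
    denom = sum(d * n for d, n in zip(gaze_dir, plane_normal))
    if abs(denom) < 1e-9:
        return None  # gaze is parallel to the plane
    diff = [p - e for p, e in zip(plane_point, eye_pos)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / denom
    if t < 0:
        return None  # plane lies behind the subject
    return tuple(e + t * d for e, d in zip(eye_pos, gaze_dir))

# Subject at the origin looking down +z at a wall 2 m away:
print(gaze_plane_intersection((0.0, 0.0, 0.0), (0.0, 0.0, 1.0),
                              (0.0, 0.0, 2.0), (0.0, 0.0, -1.0)))
# -> (0.0, 0.0, 2.0)
```

In the complete system this test is repeated for each stored scene plane, and the nearest forward intersection identifies both the plane number and the point-of-gaze coordinates reported on the RS-232 output.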
A new data field is available at the same update rate as that being used by the camera imaging the subject's eye. This is generally 60 times per second in the USA, when the eye tracker employs standard NTSC video format cameras, or 50 times per second in Europe or other countries, when the eye tracker employs cameras with standard European PAL video format.
Computer 126 uses the point of gaze information to determine the location of gaze within the viewed scene as shown by the video scene monitor, and it superimposes a cursor or cross hairs on the image displayed on video scene monitor 130. The system can either use an appropriate set of transformations to map the point of gaze onto the scene image, or it can first calibrate to establish a reference point on the scene image. This latter approach involves having the subject look at a fixed reference point in the image scene to determine a reference line-of-gaze direction associated with that point. Then any changes in the line-of-gaze can be readily translated into an appropriate change in the point of gaze in the image scene.
An alternate embodiment of the stationary scene camera implementation is shown in Fig. 4. In this embodiment, the standard RS-232 data output available from eye-head tracker processor 128 (e.g. ASL model 4100- EHS) is read by an external PC-AT type computer 140. External computer 140 is equipped with an NTSC/VGA conversion board 142 such as the Redlake model NTSC 100 Video Digitizer and VGA Overlay Controller. This commercially available board allows computer 140 to display an image from a standard NTSC format video camera 118 on a computer VGA screen 144, and it also allows computer 140 to superimpose VGA graphics on this image. Computer 140 also includes a mouse 146 (or other pointing device) that enables the user to move the cursor about on the video image and it includes programming capable of capturing and recording in memory the VGA coordinates of the cursor at the location at which the mouse is clicked. In the embodiment shown by Fig. 4, the stationary scene camera video image is input to NTSC/VGA conversion board 142 in external computer 140. Using information from the eye head tracker, a program in the external computer superimposes a cursor, cross hairs, or other indicator showing the subject's point of gaze, on the VGA image from stationary scene camera 118.
It should be noted that the Sony camera which was described above is actually more appropriately used in a system in which the eye tracker camera is mounted stationary with respect to the floor. In essence, that means moving the head mounted cameras shown in Figs. 3 and 4 to a floor mount system. Such a system would in principle operate the same way, though some minor modifications would be necessary, especially in the software, due to the fact that the frame of reference is now the room rather than the subject's head.
The invention is meant to cover all of the above-mentioned alternative approaches as well as others not specifically mentioned. The above-mentioned embodiments and others are within the following claims.

Claims

What is claimed is:
1. A camera assembly for use in an eye tracking apparatus, said camera comprising: a camera with a lens having an image axis; and a ring shaped light source disposed around the image axis and near the periphery of the lens aperture, said light source oriented to direct light along the camera axis toward the target.
2. An eye line-of-gaze measurement apparatus comprising: an electronic camera with a lens having an image axis; a ring shaped light source disposed around the image axis and near the periphery of the lens aperture, said light source oriented to direct light along the camera axis toward a target; and a digital processor programmed to determine an eye line-of-gaze from the image of a retro-reflection obtained from the subject's eye.
3. The apparatus of claim 2 wherein the ring- shaped light source comprises an array of lights arranged in a circle to form a ring.
4. The apparatus of claim 3 wherein the plurality of light sources are evenly spaced about the circle.
5. The apparatus of claim 4 wherein the light sources are LED's.
6. A method of generating a retro-reflection from a subject's eye for use in a line-of-gaze measurement system that utilizes a bright pupil detection technique, said method comprising: producing an image of the subject's eye by using a camera that is characterized by a viewing axis; and illuminating the subject's eye with an off-axis illumination to produce a retro-reflection from the retina of the subject's eye.
7. The method of claim 6 wherein the step of illuminating the subject's eye comprises illuminating with a light source that is distributed around the viewing axis.
PCT/US1998/015920 1997-07-30 1998-07-29 An eye tracker using an off-axis, ring illumination source WO1999005988A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US90333397A 1997-07-30 1997-07-30
US08/903,333 1997-07-30

Publications (2)

Publication Number Publication Date
WO1999005988A2 true WO1999005988A2 (en) 1999-02-11
WO1999005988A3 WO1999005988A3 (en) 1999-04-08

Family

ID=25417323

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1998/015920 WO1999005988A2 (en) 1997-07-30 1998-07-29 An eye tracker using an off-axis, ring illumination source

Country Status (1)

Country Link
WO (1) WO1999005988A2 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8885877B2 (en) 2011-05-20 2014-11-11 Eyefluence, Inc. Systems and methods for identifying gaze tracking scene reference locations
US8911087B2 (en) 2011-05-20 2014-12-16 Eyefluence, Inc. Systems and methods for measuring reactions of head, eyes, eyelids and pupils

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4699481A (en) * 1984-09-01 1987-10-13 Canon Kabushiki Kaisha Stereoscopic microscope
JP3016499B2 (en) * 1993-06-03 2000-03-06 キヤノン株式会社 Corneal shape measuring device
US5526148A (en) * 1994-08-02 1996-06-11 Moffat; Robert J. Apparatus and method for full-field calibration of color response to temperature of thermochromic liquid crystals

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001074236A1 (en) * 2000-03-31 2001-10-11 University Technologies International Inc. A diagnostic test for attention deficit hyperactivity disorder
FR2846120A1 (en) * 2002-10-21 2004-04-23 Inst Sciences De La Vision Method of revealing information present in impaired sight zone of people with limited vision, uses data on the impaired vision zone of the user to rearrange a screen so all information appears in the normal eye line of the user
WO2004038662A1 (en) * 2002-10-21 2004-05-06 Institut Des Sciences De La Vision Method of unmasking visual information present under a barely- or non-functional area in the visual field of the eye of a subject and device for implementing said method
WO2004060153A1 (en) * 2002-12-19 2004-07-22 Bausch & Lomb Incorporated System for movement tracking of spherical object
JP2006511305A (en) * 2002-12-19 2006-04-06 ボシュ・アンド・ロム・インコーポレイテッド A system for tracking the movement of spherical objects.
US10039445B1 (en) 2004-04-01 2018-08-07 Google Llc Biosensors, communicators, and controllers monitoring eye movement and methods for using them
WO2007037751A1 (en) * 2005-09-27 2007-04-05 Penny Ab A device for controlling an external unit
US8587514B2 (en) 2005-09-27 2013-11-19 Penny Ab Device for controlling an external unit
EP2004039A4 (en) * 2006-03-27 2011-12-28 Fujifilm Corp PICTURE DISTRIBUTION DEVICE, PICTURE OUTPUT METHOD AND PICTURE OUTPUT PROGRAM
US8243132B2 (en) 2006-03-27 2012-08-14 Fujifilm Corporation Image output apparatus, image output method and image output computer readable medium
US9498123B2 (en) 2006-03-27 2016-11-22 Fujifilm Corporation Image recording apparatus, image recording method and image recording program stored on a computer readable medium
JP2007289658A (en) * 2006-03-27 2007-11-08 Fujifilm Corp Image output apparatus, image output method, and image output program
WO2011024134A1 (en) 2009-08-26 2011-03-03 Ecole Polytechnique Federale De Lausanne (Epfl) Wearable systems for audio, visual and gaze monitoring
RU2442526C1 (en) * 2010-08-05 2012-02-20 Общество с ограниченной ответственностью "АВТЭКС" Specialized video camera for the operator control
EP2551636A1 (en) 2011-07-25 2013-01-30 Leica Geosystems AG Contact-free measuring device and method for controlling the same
WO2013014084A1 (en) 2011-07-25 2013-01-31 Leica Geosystems Ag Measuring device that can be operated without contact and control method for such a measuring device
US8929589B2 (en) 2011-11-07 2015-01-06 Eyefluence, Inc. Systems and methods for high-resolution gaze tracking
US11412928B2 (en) 2017-08-11 2022-08-16 Carl Zeiss Meditec, Inc. Systems and methods for improved ophthalmic imaging
US11000187B2 (en) 2017-09-07 2021-05-11 Carl Zeiss Meditec, Inc. Systems and methods for improved montaging of ophthalmic imaging data
US11194161B2 (en) 2018-02-09 2021-12-07 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US11393251B2 (en) 2018-02-09 2022-07-19 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US11556741B2 (en) 2018-02-09 2023-01-17 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters using a neural network
US11340461B2 (en) 2018-02-09 2022-05-24 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US11372476B1 (en) 2018-02-20 2022-06-28 Rockwell Collins, Inc. Low profile helmet mounted display (HMD) eye tracker
US12076087B2 (en) 2018-03-29 2024-09-03 Sony Corporation Information processing apparatus and information processing method
WO2019187808A1 (en) * 2018-03-29 2019-10-03 ソニー株式会社 Information processing device, information processing method, and program
JPWO2019187808A1 (en) * 2018-03-29 2021-04-01 ソニー株式会社 Information processing equipment, information processing methods, and programs
EP3856003A2 (en) * 2018-09-28 2021-08-04 Carl Zeiss Meditec, Inc. Low cost fundus imager with integrated pupil camera for alignment aid
US11537202B2 (en) 2019-01-16 2022-12-27 Pupil Labs Gmbh Methods for generating calibration data for head-wearable devices and eye tracking system
US11676422B2 (en) 2019-06-05 2023-06-13 Pupil Labs Gmbh Devices, systems and methods for predicting gaze-related parameters
US12154383B2 (en) 2019-06-05 2024-11-26 Pupil Labs Gmbh Methods, devices and systems for determining eye parameters
US12353617B2 (en) 2019-06-18 2025-07-08 Pupil Labs Gmbh Systems and methods for determining one or more parameters of a user's eye
US12140771B2 (en) 2020-02-19 2024-11-12 Pupil Labs Gmbh Eye tracking module and head-wearable device

Also Published As

Publication number Publication date
WO1999005988A3 (en) 1999-04-08

Similar Documents

Publication Publication Date Title
WO1999005988A2 (en) An eye tracker using an off-axis, ring illumination source
US5345281A (en) Eye tracking system and method
US6433760B1 (en) Head mounted display with eyetracking capability
US6091378A (en) Video processing methods and apparatus for gaze point tracking
US7533989B2 (en) Sight-line detection method and device, and three-dimensional view-point measurement device
US11238598B1 (en) Estimation of absolute depth from polarization measurements
US6659611B2 (en) System and method for eye gaze tracking using corneal image mapping
US6578962B1 (en) Calibration-free eye gaze tracking
US6637883B1 (en) Gaze tracking system and method
CN100421614C (en) Method and installation for detecting and following an eye and the gaze direction thereof
CN103458770B (en) Optical measuring device and method for capturing at least one parameter of at least one eyes that illumination characteristic can be adjusted
EP3076892B1 (en) A medical optical tracking system
US6394602B1 (en) Eye tracking system
WO2019010214A1 (en) Eye tracking based on light polarization
US6373961B1 (en) Eye controllable screen pointer
US6227667B1 (en) Apparatus for recording the retina reflex image and for superimposing of additional images in the eye
US5555895A (en) Process and device for eye movement analysis
US20140055747A1 (en) Optical Measuring Device and Method for Capturing at Least One Parameter of at Least One Eye Wherein an Illumination Characteristic is Adjustable
US5237351A (en) Visual target apparatus
US11624907B2 (en) Method and device for eye metric acquisition
US10109067B2 (en) Corneal sphere tracking for generating an eye model
EP4158447A1 (en) Systems and methods for providing mixed-reality experiences under low light conditions
CN118394205A (en) Mixed reality interactions using eye tracking techniques
WO2001080566A1 (en) Instrument visualization system
JPH0446570B2 (en)

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): CA JP

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

AK Designated states

Kind code of ref document: A3

Designated state(s): CA JP

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: CA